Google makes Tony Stark's EDITH glasses a reality: AI features, demo video, and more

Google has showcased its Project Astra, which also revealed new smart glasses with functionality similar to EDITH glasses (Image via Google/YouTube)

Google has just concluded its I/O 2024, and one of the highlights of the event was the reveal of Project Astra, which provides a contextual understanding of your surroundings, similar to the EDITH glasses used by Tony Stark and Spider-Man in the Marvel movies. It combines your camera viewfinder with a responsive chatbot to answer all kinds of questions about the environment you are in at the moment.

However, the company’s demo also showed Project Astra running on a pair of smart glasses, raising many eyebrows. This article discusses all the AI features of Project Astra, its use cases, and its other details.


How is Project Astra similar to EDITH glasses?

Google's new Project Astra has now been demonstrated and it has features similar to EDITH glasses used by Tony Stark and Peter Parker (Image via Augmented Reality)

Google has called Astra "the future of AI assistants." It acts as a universal AI tool that can "help in everyday life." The goal is for it to be quick and conversational, with minimal lag. This might seem similar to Meta's Ray-Ban glasses. However, Google wants to eliminate the lag that many Ray-Ban users face while uploading video files.

Google also showcased a video of Gemini AI in Project Astra identifying and explaining parts of code on a monitor. It also told the user which neighborhood they were in based on the view out the window, similar to the functionalities of EDITH glasses.


Project Astra's demo video showcases features like EDITH glasses


In the demo, a woman running Project Astra is heard asking where she misplaced her glasses. The chatbot was able to quickly respond to her queries about what it saw, even assisting her with technical diagrams.

Surprisingly, the glasses also had onboard cameras to act as a visual interface for users, similar to EDITH glasses. The glasses scanned the wearer's surroundings like a companion would. The person in the video then asked Astra various questions, and a waveform was displayed to indicate that it was listening. Astra answered all the questions correctly in real time.

Onscreen captions showed the user's questions and Astra's answers, which were also read aloud to the wearer. By the looks of it, Google seems to have included a microphone and speaker in the glasses. However, the rest of the specifications are still unknown.


We could only see a brief visual of the wearable in the video. The glasses had a simple black frame and looked nothing like Google Glass, which launched over a decade ago. Google has also assured that these AI capabilities are coming later this year with Gemini AI. However, there is no word on a release date for the smart glasses.


