Google's AI smart glasses are the most exciting thing I saw at I/O — here's what you need to know
During Google I/O 2024, the company sneakily showed off a pair of glasses alongside Project Astra, a fresh approach to on-demand AI assistance for navigating whatever tasks you encounter in a given day. Although Google didn't go so far as to announce the revival of Google Glass, the teaser tells us that such a device is probably in the works.
Gadgets packed with AI agents are having a moment, and not in a good way. Back-to-back letdowns from the wearable Humane AI Pin and pocketable Rabbit R1 box revealed that design plays a big role in the practicality of AI hardware in everyday life. Projections on your hand are fickle, and no one wants to carry around another device the same size as their smartphone.
The only form factor that makes sense right now is one that's discreetly and comfortably worn on your face, as we've seen from the growing success of the Ray-Ban Meta Smart Glasses. Even before the massive Meta AI update that launched recently, the glasses had become one of the few products I take with me everywhere since they're multipurpose. Ray-Ban Meta Smart Glasses are headphones, cameras, and an AI butler all in one — and that's precisely why they're awesome, and why Google should follow Meta's lead.
Though there are some alternative ChatGPT-loaded smart glasses on the market, a Google version that adapts Gemini's impressive AI abilities for things like object identification and other camera-based prompts would be major.
What can Project Astra do with smart glasses?
Google's demo provided a handful of examples of what Project Astra can achieve just by analyzing what a camera sees in real time, replicating the core of how Gemini is transforming Google Search results with a priority on relevance.
The glasses could help you with homework questions in real-time while looking at equations, identify landmarks, answer questions about points of interest and more.
Project Astra is coming to phones first in the form of Gemini Live, but I'd bet that a pair of glasses with Gemini Live isn't far off. The Ray-Ban Meta Smart Glasses cleared an essential first step: proving that eye-level cameras and ear-level microphones are the most natural way to interact with any kind of "look at this and tell me" AI tool.
If given the option, my gut tells me smart glasses adopters would be more inclined to choose Google over Meta as the maker, based purely on each company's history of providing information. Wouldn't you opt for the product from the company whose name is literally used as a verb for search?
There's not much else to say about Google smart glasses at this point in time, and certainly not anything beyond speculation. That said, I would be nervous if I were Meta.