Forget mixed reality! Meta’s Ray-Ban Smart Glasses now offer a shared reality (with AI)

Ray-Ban Meta smart glasses close up.

The Meta Ray-Ban Smart Glasses, equipped with a 12-megapixel camera for unique first-person captures, might seem like nothing more than an influencer-focused wearable. However, the Facebook parent company is now rolling out an early-access preview of one of the glasses' most interesting features.

While some smart glasses venture into the realms of mixed and augmented reality, Meta’s Ray-Ban Smart Glasses don’t feature a built-in display. Instead, Meta’s smart sunnies make use of the onboard camera to offer something else entirely: a shared reality with Meta AI.

UV blocking, AI unlocking

The update introduces multimodal AI features to Meta's smart glasses. A multimodal AI can process information in more than one form. Most of the AI tools we're familiar with rely primarily on text prompts to work out how to respond; a multimodal AI can draw on several types of input at once, such as text and images, to deliver more accurate, better-contextualized results.
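To make the idea a little more concrete, here's a minimal, purely illustrative Python sketch. It doesn't reflect Meta's actual software; the MultimodalPrompt and PromptPart classes and the image path are hypothetical stand-ins. The point is simply that a multimodal prompt bundles an image alongside the text of a question, where a traditional prompt carries text alone:

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class PromptPart:
    """One piece of a prompt: either text or a reference to an image."""
    kind: str                         # "text" or "image"
    text: Optional[str] = None        # used when kind == "text"
    image_path: Optional[str] = None  # used when kind == "image"


@dataclass
class MultimodalPrompt:
    """A prompt made up of several parts of different types."""
    parts: List[PromptPart] = field(default_factory=list)

    def add_text(self, text: str) -> "MultimodalPrompt":
        self.parts.append(PromptPart(kind="text", text=text))
        return self

    def add_image(self, image_path: str) -> "MultimodalPrompt":
        self.parts.append(PromptPart(kind="image", image_path=image_path))
        return self


# A text-only prompt carries a single kind of input...
text_only = MultimodalPrompt().add_text("What goes well with a denim jacket?")

# ...while a multimodal prompt pairs the question with a photo for context.
multimodal = (
    MultimodalPrompt()
    .add_text("What should I wear with this?")
    .add_image("captures/denim_jacket.jpg")
)

print(len(text_only.parts), "part(s) vs", len(multimodal.parts), "part(s)")
```

The extra image part is what lets the model ground its answer in what you're actually looking at, rather than guessing from the words alone.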

In the case of the Meta Ray-Ban smart glasses, users will soon be able to pass an image of whatever they're currently looking at to Meta AI, allowing it to properly understand the context around what they're asking.

For example, you can look at an item of clothing and ask Meta AI for help deciding what to wear with it. Using the onboard camera, the AI can identify the garment and then make accurate suggestions on how to style your 'fit. You can also ask Meta AI to identify an object, tell you more about a location, or even recognize landmarks.
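As a rough illustration of that flow (again, a hypothetical sketch rather than Meta's real code; capture_frame and ask_assistant are made-up placeholders), the interaction boils down to grabbing a frame from the camera and sending it off together with your question:

```python
from pathlib import Path


def capture_frame(save_dir: str = "captures") -> Path:
    """Stand-in for grabbing a photo from the glasses' camera.

    Here we just point at a file on disk; on the real product the
    capture would be triggered by voice.
    """
    return Path(save_dir) / "current_view.jpg"


def ask_assistant(question: str, image: Path) -> str:
    """Placeholder for a multimodal assistant call.

    A real assistant would analyze the image alongside the question;
    this stub just returns a canned reply so the flow is runnable.
    """
    return f"(stub) Looking at {image.name}: try pairing it with white sneakers."


if __name__ == "__main__":
    frame = capture_frame()
    answer = ask_assistant("What should I wear with this?", frame)
    print(answer)
```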

Outlook

If you want to check out the latest features of Meta's smart glasses, then you'll need to be based in the United States and opt in to the early-access program (found within the Meta View app for Android and iOS).

As a wider update, Meta AI will also gain access to real-time information powered by Microsoft's Bing. This means you'll get up-to-date answers on world and web events when prompting Meta AI. Once again, the feature is coming to U.S. users first, but you won't need to opt in to receive this update, which is rolling out now.