Android Central
Technology
Michael L Hicks

The Meta Ray-Ban smart glasses just got a big AI update, but you should act fast to get access

A close-up of the Ray-Ban Meta smart glasses' logos.

What you need to know

  • The Ray-Ban Meta smart glasses can now analyze a photo to tell you what you're looking at.
  • The new multimodal AI features also add Bing search results to your queries.
  • These features are only available via an early-access beta in the U.S., but will roll out globally next year. 

At Meta Connect 2023, CEO Mark Zuckerberg promised that the company's new Ray-Ban Meta smart glasses would be able to analyze your surroundings, similar to Google Lens, starting in 2024. Now it turns out that U.S. smart glasses owners can get a head start if they're quick.

On Instagram, Zuckerberg showed off the Ray-Ban Meta smart glasses' new multimodal AI tech in several use cases. He asked the Meta AI what to wear with a striped shirt, what type of fruit something was, and to translate a Spanish meme into English. 

Specifically, you'll say "Hey Meta," wait for the acknowledging ding that your glasses are listening, and then say "Look and tell me [blank]" to get information about it. 

In each case, the smart glasses have to take a photo of the object to analyze it, most likely because the Ray-Bans don't process the data themselves: the photo is sent to the cloud through your smartphone, where a combination of Meta's AI tech and the Bing search engine answers your questions.

In theory, though, you probably won't mind having a photo of whatever it was that confounded you, so this shouldn't be a deal-breaker for Ray-Ban Meta owners. 

Meta's blog announcement explained that the Meta AI and Bing will answer questions about "sports scores or information on local landmarks, restaurants, stocks and more." It also emphasized that "these multimodal AI features may not always get it right," as Meta is relying on beta testers to spot bugs.

If you own the Meta Ray-Ban smart glasses and want to test the feature for yourself, you'll need to enable Early Access mode in the Meta View app. You'll find the simple instructions at this link.

Meta CTO Andrew Bosworth told Instagram users that this Early Access is "limited to a small number of people who opt in," which means Meta could cut off access if enough people sign up for the test. The feature fully launches sometime next year, but you should sign up now if you don't want to wait.

We praised the Ray-Ban Meta smart glasses in our review, especially their high-res portrait camera and solid audio. One negative we noted was that the "Hey Meta" command could only handle simple requests, struggling with more complicated queries.

That's why this news seems so promising: it closes one of the glasses' main weak points and could help them surpass most of the other smart glasses on the market.
