
If you haven't yet been sold on the idea of AI glasses, the short-sighted Mark Zuckerberg thinks you will be left behind if you don't get yourself a pair.
In Meta's Q2 earnings call (via TechCrunch), CEO and founder Mark Zuckerberg presented the quarter's figures and outlined where the profits will go to further grow Meta. Unsurprisingly, that means a lot of talk about AI. But one of the most interesting moments of the call came when Zuckerberg turned his attention to AI glasses, something the company has been championing for some time.
"I wear contact lenses. I feel like if I didn't have my vision corrected, I'd be sort of at a cognitive disadvantage going through the world. And I think in the future, if you don't have glasses that have AI or some way to interact with AI, I think you're kind of similarly probably be at a pretty significant cognitive disadvantage."
Zuckerberg goes on to talk about Meta's work on this particular form factor and its focus on additional types of glasses, including ones designed with style in mind. He says this has been one of the main focuses of Reality Labs over the last "five to ten years".
Reality Labs is the focus here, and it sits at the heart of Zuckerberg's Metaverse-flavoured future plans for the company. The pointed name drop on this call could be tied to the fact that the division has been a perennial loss maker, posting a loss of $4.53 billion in Q2 this year after reporting a loss of $4.488 billion in the same period last year. Meta still pulled in almost $20 billion in profit, but Reality Labs is a rather significant investment right now, and it needs to be regularly sold to investors.

Zuckerberg says: "Because we've been investing in this, I think we're just several years ahead on building out glasses. And I think that's something we're excited to keep on investing in heavily because I think it's going to be a really important part of the future."
Right now, Meta's AI glasses connect to the Meta AI app, letting wearers talk to an assistant, share photos, and do a little browsing. Sure, Zuckerberg is making an argument about the future of the glasses, and therefore the future of AI, but at the moment there's not a whole lot that would make you cognitively advantaged over someone not wearing a set of Meta's smart Ray-Bans. The suggestion of some future cognitive advantage inevitably feels nebulous and hard to quantify without anything to pin it to today, but no-one's ever going to let that get in the way of an investor-facing marketing pitch.
Unless the glasses become significantly easier to use than just picking up your phone, I'm not quite sold on their potential. Maybe agentic AI could cut down on the time you spend actively interacting with the AI through your glasses, but it seems like any AI advances made here will also arrive on other devices.
Despite my scepticism, our Hope does at least seem to quite like her Ray-Ban Meta glasses. As well as admitting, "I kinda like them, shut up", she told me: "I really like them for taking photos on holiday (not of people) without having to stop the flow of exploring." She also sometimes uses them for PoV videos of her painting.
Though it's not the AI side of these glasses she likes, the form factor does seem pretty solid. And that looks like the true selling point of Meta's glasses so far: their convenience, rather than their ability to improve your cognition. We'll have to wait and see if all this AI investment pays off in the smarts department down the line.