The computing platform of the future is slowly coming together. Smart glasses that support augmented reality (AR) features and artificial intelligence (AI) will eventually replace smartphones. I’ve often said that Apple is likely working on such a product, even if it is many years away. The Vision Pro spatial computer is a big step forward in that direction.
With the advent of generative AI, we’re starting to see more companies launch AI products, and Apple can’t be too far behind. I’ve long said that the Vision Pro needs built-in AI, and that was before the Ai Pin, Google Gemini, or Meta’s new AI-ready Ray-Ban smart glasses came along. All these developments make me dream of Apple’s future iPhone replacement.
A few weeks ago, Humane debuted the Ai Pin, an AI-centric wearable meant to replace the smartphone by putting AI at the center of everything. The AI can hear what you say, and it can see what you see. It can take pictures and videos and answer questions the way ChatGPT does.
The only problem is that the Ai Pin lacks a display. I said at the time that AR glasses with the same AI features would be better suited to replacing the smartphone, but Humane's vision for the Ai Pin doesn't include one.
Google Gemini came out swinging last week with an incredible demo. Google showed us an AI assistant that held a conversation with the user and reacted in real time to what it could see. As we've since learned, it was all fake. I said at the time that such technology is surely coming, even though Gemini can't actually do any of it yet.
It turns out that Meta wants to try its hand at having AI hear and see what you hear and see. And it’s all starting in beta on the Ray-Ban smart glasses.
Meta’s Mark Zuckerberg and Andrew Bosworth have posted video demos of the AI features in action. They’re similar to Google’s Gemini demo, only not as fake.
In one clip the Meta CEO posted, Zuckerberg asks Meta AI to suggest pants that would match a shirt he’s holding, and the AI does the job. In another, he uses the glasses to translate text from a printed meme on a wall.
Separately, Meta’s CTO asked the Ray-Ban smart glasses to describe a wall sculpture, which isn’t what you’d necessarily think of when demoing AI. But Meta AI saw the art piece and described it in detail, showing that the software’s multimodal features actually work.
That’s what Meta is testing right now: it wants its AI to hold conversations about what’s happening around the user, kind of like what Google implied Gemini could do in its demo, and similar to what Humane promises with the Ai Pin.
What’s missing here is the AR experience that should accompany the AI one. That’s where I hope Apple’s ingenuity comes in. I expect Apple to lead the way in making AR smart glasses with AI capabilities, just as it’s doing with the Vision Pro.
Considering these developments, Apple has no choice but to deliver such a device. Otherwise, it risks someone else building great AR smart glasses that people actually want to wear. And I’m certain that combining AR glasses with AI will “kill” the smartphone as we know it. I mean the current form factor and the way we use iPhones and Android handsets; the smartphone itself isn’t going to disappear.
Until then, Meta is hosting a limited beta in the US, where some Ray-Ban smart glasses users will be able to take Meta AI for a spin. In addition to these multimodal queries, Meta AI will also answer questions with the help of Bing. You can read more about Meta’s AI plans for the Ray-Ban glasses at this link.