A report last November said Apple was studying a new product category: AI smart glasses similar to the Ray-Ban Meta glasses, which have been quite popular with users. The initiative was tied to Apple's plan to bring Apple Intelligence to other wearable devices. The company is also considering adding cameras to future Apple Watch and AirPods models.
The AI of the future will need a real-time video and audio feed to assist users in the moment and account for their context. That's where such devices would come in handy, especially head-worn glasses that could handle Apple Intelligence and Visual Intelligence tasks.
Apple smart glasses aren't to be confused with the sophisticated AR glasses that might replace the iPhone at some point in the future. Those glasses will take longer to build, as the technology required to manufacture them isn't mature enough. They'll feature Apple Intelligence capabilities too, but they'll also overlay AR content on top of reality.
A new report says that Apple is apparently going forward with smart glasses like Meta’s, but the project might take several years to deploy.
Bloomberg's Mark Gurman detailed Apple's purported "Project Atlas" a few months ago. Apple reportedly enlisted its own employees to test products like the Ray-Ban Meta smart glasses. In other stories, the reporter also mentioned Apple's plans to install cameras on future Apple Watch and AirPods models. More recently, he said the AR glasses are Tim Cook's top priority at Apple.
“Testing and developing products that all can come to love is very important to what we do at Apple,” said a purported internal Apple email from Apple’s Product Systems Quality team that Gurman saw last fall. “This is why we are looking for participants to join us in an upcoming user study with current market smart glasses.”
Gurman brought up the Apple smart glasses in a new Power On newsletter over the weekend. From the looks of it, Apple has decided to move forward with the product, which suggests the previously reported testing may have been successful.
This device isn’t close to being ready yet, but the idea is to turn glasses into an Apple Intelligence device. The product will analyze the surrounding environment and feed information to the wearer, though it will stop well short of true augmented reality.
Gurman also mentioned the AirPods with built-in cameras, which should make Visual Intelligence even easier to use. That's an AI feature that currently requires an iPhone 16 or iPhone 15 Pro. Press a button, and the AI can answer questions about your surroundings using a feed from the camera.
Using smart glasses and camera-enabled wearables would make it even easier to ask the AI questions about things you see in real life around you.
Previous reports have said that the first-gen Apple smart glasses and the AirPods with cameras will not launch until 2027. Considering the current state of Apple Intelligence and the huge gap between Apple’s public AI products and the competition, it’s probably best for Apple not to hurry such products to market.
Without dependable AI ready to handle all questions a user might have, products like smart glasses would be useless. At least the iPhone is still a reliable computer, even without Apple Intelligence. The smart glasses won’t have that luxury.
Meanwhile, Google might launch similar products of its own. The company detailed its Android XR initiative a few months ago, showing off smart glasses prototypes with Gemini support and some AR abilities. Those products aren't available for sale yet, but Google will soon hold its annual I/O event, which should cover all of its Gemini products, including wearable hardware.