
I’d take AI smart glasses over an Apple Watch with a camera any day

Published Mar 24th, 2025 6:50AM EDT
Apple Watch Ultra 2 in Black Titanium.
Image: José Adorno for BGR

If you buy through a BGR link, we may earn an affiliate commission, helping support our expert product labs.

Giving AI eyes to see the world around us is hugely important in the near term. We want chatbots like ChatGPT and Gemini to behave more like AI assistants that can help us with our immediate needs, including interpreting information from our surroundings. For example, you might want to save details from a poster or leaflet or have the AI translate things you don’t understand while traveling. AI can help with that.

Smartphones like the iPhone and Android devices already have cameras that can be used for AI. Google has built Google Lens capabilities into its products, and that's one way to go about it. The iPhone received Visual Intelligence this year, though the feature is available only on the iPhone 16, with the iPhone 15 Pro models to follow.

Then, we have devices with cameras specifically made for the age of AI. The failed Humane Ai Pin is one example of that. The more successful Ray-Ban Meta smart glasses are another. Google and Samsung are also working on AI smart glasses that will feature AR capabilities.

Finally, we have existing devices that might get cameras in the near future to support AI features that require a video feed. Apple’s AirPods are one example. But it looks like Apple wants to add cameras to the Apple Watch for the same purpose, and the idea sounds horrific. I’d rather see Apple develop simple AI smart glasses with cameras like Meta’s Ray-Bans instead.

AirPods with cameras make some sense, considering how you wear them. Even there, Apple has to figure out where to place the cameras so you don't have to move your head awkwardly for the AI to see whatever you want to ask about.

But the Apple Watch would be even harder to use for letting AI look at your surroundings. I say that as someone who has used Apple Watches since the first model and who wants this particular AI functionality from other products.

Using the Camera Control button to find out information about a restaurant with Visual Intelligence. Image source: Apple Inc.

Still, it looks like Apple is developing these products, which are “at least generations away from hitting the market.” According to Bloomberg’s Mark Gurman, Apple wants to add cameras to both premium Apple Watch models, including the regular Series and the Ultra.

The Series version would get a camera inside the display, while the Ultra would have one on the side, near the Digital Crown and Side button. Gurman also explains how these camera placements would work:

Apple is probably considering this approach because the thicker Ultra has more room to work with. It would mean that an Ultra wearer could easily point their wrist at something to scan an object. A Series watch user, meanwhile, would have to flip over their wrist.

It's probably safe to say I'll never use a camera on the Apple Watch for any purpose, AI included. I don't see myself buying an Ultra, which would at least make the camera comfortable to use. But even then, I wear the watch with the Digital Crown pointing toward my elbow, not my wrist. A side-mounted camera definitely wouldn't help in that orientation.

As for “flipping over” my wrist to have the watch camera point at something, I’d rather not use AI.

I also worry about battery life if a camera is added to the Apple Watch. A camera would take up internal space, and it's not like there's room to spare in a device that keeps getting thinner.

If Apple really wants to add Visual Intelligence features to more products, it should consider making dedicated products for AI. A pair of AI smart glasses that lets you chat with Apple Intelligence would be a much better option than Apple Watches with cameras.

Apple has bigger problems to solve before then. First, it has to fix Apple Intelligence. By that, I mean it has to ship the Apple Intelligence features that turned out to be vaporware.

The unreleased, smarter Siri needs to be functional before Apple Intelligence can do anything useful with cameras. Visual Intelligence also needs upgrades, like running on Apple's own AI models rather than relying on Google and OpenAI. Gurman says that's in the works, too.

Previous reports have said that Apple is studying smart glasses, but there’s no update on the matter in Gurman’s latest newsletter. He does say that Apple plans to release a Vision Pro 2 and a cheaper model in the future, though plans remain in flux considering the reception the first model got.

Also, Apple "is committed to eventually bringing true augmented reality glasses to market." That process will take time, but Apple reportedly feels it has the ingredients to be a winner in this category. For now, though, there's no in-between, AI-only pair of smart glasses coming.

As for the camera-equipped AirPods and Apple Watch models, Gurman says Apple could launch them around 2027.

Chris Smith Senior Writer

Chris Smith has been covering consumer electronics ever since the iPhone revolutionized the industry in 2007. When he’s not writing about the most recent tech news for BGR, he closely follows the events in Marvel’s Cinematic Universe and other blockbuster franchises.

Outside of work, you’ll catch him streaming new movies and TV shows, or training to run his next marathon.