If you’ve been using Apple devices for a while, you already know why the first Vision Pro headset isn’t available in stores yet. That’s what Apple does with first-gen devices: the company unveils the new product at a launch event several months before the release date. That way, developers have time to adapt their existing apps or create new experiences for the new hardware.
Apple gave us plenty of interesting Vision Pro demos at WWDC 2023. But we only saw one third-party app at the event: the Disney Plus experience for the spatial computer. The Vision Pro will need killer apps to truly succeed, and it looks like the first one might come from Apple itself. It’s called Visual Search, and it’ll work a lot like Google Lens does on smartphones.
I’ve said before that I’ll buy the Vision Pro the first chance I get. I plan on using the device as a spatial computer, so I’m more interested in the apps and productivity experiences the Vision Pro can offer than in immersive entertainment content.
Looking at app ideas for the Vision Pro, I said a few days ago that I’d like to be able to point to real-life objects and ask questions, with a focus on smart home gadget management. Here’s what I wrote at the time:
For example, some widgets could keep track of all the electronics at home, especially smart home devices. I could point to something and ask a question about automations or battery life. I could look at the thermostat and tell Vision Pro to give me the current settings. Or point to a camera and ask for the most recent footage.
It turns out I was thinking too narrowly about pointing and asking questions. Visual Search will not do what I envisioned above (yet), but the newly discovered feature will be incredibly important to the spatial computer experience. It could even threaten Google Search in spatial computing.
Found by Steve Moser in the visionOS SDK, Visual Search lets you look at a real-life item and obtain more information. As MacRumors explains, you’ll be able to detect and interact with text from the real world, copy and paste printed text, and translate text between 17 different languages.
Visual Search will let you load web addresses you might see in the real world. It’ll also let you convert units from one system to another.
If this sounds familiar, that’s because Google Lens can already perform some of these actions on mobile devices. And I did say that Google’s new AI-infused apps seem ready to run on the Vision Pro.
But Apple already has Google Lens-like abilities inside iOS. You can use the iPhone’s camera to learn details about objects around you, whether it’s plants, animals, art, or landmarks (Visual Lookup). You can pick up printed text, interact with it, and translate it with the camera. The iPhone can also read text in photos and videos (Live Text), and it can scan QR codes to load websites.
It looks like Visual Search will incorporate all these features into a single app. And Visual Search will sit inside the Vision Pro’s Spotlight search feature.
While Visual Search sounds like a version of Google Lens, or a mix of existing iPhone and iPad features, I’ll remind you of a key detail about the Vision Pro: the spatial computer should be faster than anything else when it comes to interaction.
You look at something and use your fingers to interact with it. That might include pointing at real-life objects and text for more information. Add voice control, and Visual Search could speed up search experiences considerably.
Add a smarter Siri that incorporates generative AI features like ChatGPT, and Visual Search will easily be one of the most important apps on your Vision Pro.