The Apple Vision Pro is official, and Apple’s demos walked us through the company’s own vision (pun intended) for the spatial computer. But Vision Pro will need a lot of help from developers to make it an actual computer people will want to use. And I’ve already told you that I’ll get the Vision Pro headset the first chance I get, no matter what the price tag is.
But I’m not getting this thing so dinosaurs can break through my living room wall, or just to enjoy Disney Plus content on a huge virtual display. I want to use the Vision Pro for work, and I want to experience augmented reality (AR) features that will one day move to true AR glasses.
What I want from the Vision Pro
That’s where developers will help, and people already have brilliant ideas about what you could do while wearing the headset.
Before I look at some of them, I’ll briefly list the kind of apps I want to have on Vision Pro as soon as possible.
Generative AI like ChatGPT
Apple is far from announcing its own ChatGPT-like product, even though it’s working on one internally. Siri is getting smarter this year, so I expect voice control to be decent enough on Vision Pro. But I’m also thankful that visionOS will run iPhone and iPad apps from the App Store, including OpenAI’s official ChatGPT app.
I think a strong virtual AI assistant is a key piece of software on Vision Pro and similar products.
1Password and similar password managers
Since Vision Pro will be a computer on my face, I’ll use it for work. I’ll therefore want to take advantage of Optic ID to log into apps and services. And I’ll need a password app for that. Hopefully, 1Password, Proton, and all the companies running secure password managers are also keeping an eye on the upcoming spatial computing era.
And no, I don’t want to use Apple’s secure password manager app buried inside the settings app.
Contextually aware widgets
I want the Vision Pro experience to be as clutter-free as possible. I don’t want app icons and widgets on the AR “desktop.” But I’d want widgets to appear on screen on command.
For example, some widgets could keep track of all the electronics at home, especially smart home devices. I could point to something and ask a question about automations or battery life. I could look at the thermostat and tell Vision Pro to give me the current settings. Or point to a camera and ask for the most recent footage.
The same could apply to any other electronic device around the home. Again, AI could help here, recognizing the products you have at home and detecting them on the wireless network they’re connected to.
Apple Continuity for everything
Apple showed a Vision Pro use case where the wearer looked at their Mac screen. The Vision Pro picked up that display, allowing the user to continue working on that Mac app in AR.
I’d love the same thing to happen for the iPhone, iPad, and Apple Watch. I don’t want to take off the goggles to look at those screens. Vision Pro hands-on previews showed the video passthrough is good enough that users can read and interact with real displays through the headset. But a better use case would be having the iPhone or Apple Watch display simply show up in the Vision Pro when I need to use that device.
Similarly, Vision Pro could let me ask about battery life on each Apple device by pointing to them. And it could help me locate them in a room in case I lose one.
Other Vision Pro app ideas
The point here is that I’ve only started envisioning what the post-iPhone AR future might look like. And I’m only thinking about AR apps that would work with the current bulky design. The Vision Pro is still going to be somewhat heavy and will prohibit certain types of activities. It’ll be relegated to the home, the office, or long commutes.
That’s why I didn’t mention AR navigation, which Google Maps could enable right now. Because you wouldn’t wear Vision Pro outside. You wouldn’t, right?
The same goes for fitness apps, which would be great in AR. You don’t want to break the display when you fall out of a complicated yoga pose.
But there are other interesting ideas that Redditors came up with while watching the WWDC keynote.
Tracking your vacuum
I’d say you should not vacuum with a Vision Pro headset. It’s a bad idea, and accidents can happen. But a Redditor suggested you might use the headset to see in real time where you’ve already vacuumed.
That’s actually a pretty awesome idea, and it would ensure that you never miss an inch of your floor when you vacuum.
Your personal chef
A different person envisioned other scenarios where Vision Pro would be incredibly useful, like cooking a new recipe. The headset would display step-by-step instructions, keeping track of your progress as you cook.
AR Ikea instructions
Either you’ve assembled Ikea furniture once and can handle any Ikea furniture in the future without too much stress, or the thought of going through the instructions makes you die a little inside each time. Whichever category you fall into, having the Vision Pro display instructions in real time and in 3D would be incredible.
Rinse and repeat with anything that needs home assembly. AR should help significantly in the future.
Try furniture before you buy it
The same Redditor suggested the next obvious thing: an AR experience that’s already somewhat available on phones. You could use the Vision Pro’s depth sensors to place furniture items digitally in your home and see how they fit. This would make buying furniture a lot more exciting.
Apply the same principle to clothes and shoes, of course.
We won’t need to wait long to see actual apps
Remember that Apple didn’t show off any third-party apps for Vision Pro at the event, except for the Disney Plus app. It’ll be interesting to see what developers create once they get their hands on Vision Pro dev kits.