
Vision Pro needs Apple Intelligence the most, but it won’t get Apple AI this fall

Published Jun 18th, 2024 6:50AM EDT
Using the Vision Pro on a train via Travel Mode.
Image: Apple Inc.


Apple gave the Vision Pro a big software update at WWDC 2024. visionOS 2 will make it easier to navigate the spatial computing experience and will improve the available content, whether through brand-new apps or the ability to turn any photograph into a spatial one. Apple also unveiled Apple Intelligence at WWDC. That’s the new suite of AI features for the iPhone, iPad, and Mac, with a smarter Siri at the core.

Notably absent is Apple Intelligence support for the Vision Pro. The headset runs on the M2 chip, the same silicon that will support Apple AI features on the iPad and Mac. But visionOS 2 won’t get Apple Intelligence powers later this year, and the headset won’t have the smarter Siri in 2025 either. I think that’s a big problem for Apple’s forward-pointing device. If anything, the Vision Pro should be at the forefront of any Apple AI initiative.

I’m saying this as a Vision Pro fan who is still torn about getting the spatial computer once it lands in Europe next month. I’m also a fan of ChatGPT and the Apple Intelligence features. I can’t wait to get my hands on the smarter Siri and have the assistant do things for me in apps across operating systems.

It’s not the first time I’ve argued that technologies like the Vision Pro headset and the Apple Intelligence software should go hand in hand. After WWDC 2023, I said Apple’s Vision Pro needed AI like ChatGPT as soon as possible. I explained that a product like the smarter Siri Apple demoed last week would do wonders for a device you have to interact with using your eyes and hands:

Generative AI like ChatGPT is an easy fix for making the most of that terrific hardware and limited battery life. Rather than Siri or any other type of input mechanism, an Apple version of ChatGPT would take the Vision Pro to the next level.

You could just talk to the AI and tell it what you need from your immediate AR/VR experience rather than using your eyes and hands to navigate menus. You could run queries as you use Vision Pro to get work done or for fun. The faster AI produces the answers you need, the more time you’ll spend doing what you want to do on the gadget instead of opening apps and navigating menus.

The new Mac virtual display experience in visionOS 2. Image source: Apple Inc.

I also explained why the Apple Vision Pro and ChatGPT are the future of computing:

When Apple unveiled the Vision Pro, I said the headset would benefit from generative AI support. Siri, as it is now, isn’t good enough. But voice will be one way you interact with the computer of the future.

You’ll use your eyes to move the “cursor” and various hand gestures to control digital objects. But voice control will play a key role in spatial computing, especially once generative AI comes to Vision Pro. Voice will perfectly complement eye- and hand-tracking to get things done quickly. The best example that comes to mind is Minority Report computing. That’s where we’re heading.

Fast-forward to WWDC 2024, and we all saw Apple announce new Vision Pro features in visionOS 2 meant to make navigating and using the headset easier.

Apple introduced new hand gestures to open the Home Screen, check battery life, and bring up Control Center on the Vision Pro. You only need to flick your hand and tap to reach these often-visited locations on the headset. But wouldn’t it be easier to tell Siri AI to change a setting for you or bring up the Home Screen?

Apple also increased the size of the virtual display on Vision Pro when it’s connected to a Mac, so you can see more at a glance. But with Siri AI, you might tell the assistant to do things for you on the Mac without needing visual confirmation. A larger, more crowded display wouldn’t have to take up your field of view at all.

visionOS 2 will also show the Magic Keyboard when you’re using it. But with Siri AI, you could tell the headset to show or hide the keyboard. Even better, with the new Writing Tools in Apple Intelligence combined with Siri AI’s natural language processing, you might use your voice to produce text on the Vision Pro rather than typing it yourself.

What I’m getting at, again, is that the Vision Pro is the device where genAI features should shine. But maybe we’ll have to wait for visionOS 3 for that to happen.

Chris Smith Senior Writer

Chris Smith has been covering consumer electronics ever since the iPhone revolutionized the industry in 2008. When he’s not writing about the most recent tech news for BGR, he brings his entertainment expertise to Marvel’s Cinematic Universe and other blockbuster franchises.

Outside of work, you’ll catch him streaming almost every new movie and TV show release as soon as it's available.