
iOS 18’s killer AI feature might be data privacy unlike anyone else’s

Published May 30th, 2024 6:50AM EDT
iPhone 15 Screen and USB-C port.
Image: Christian de Looper for BGR


When Apple takes the stage at WWDC 2024 in a few days, it’ll show the world how it’s catching up to rivals. Most reports say that iOS 18 will not deliver breakthrough innovations compared to what’s available from the likes of ChatGPT, Copilot, Gemini, and others.

However, the same reports have been saying for months something any longtime iPhone user would have predicted: Apple will deliver its own take on AI with iOS 18, one that includes stronger privacy than competitors can offer. That part seems like a given. Apple has no choice but to provide better privacy for AI features, thanks to the self-imposed privacy standards it adopted years ago.

A new report gives us more insight into the data privacy protections Apple has been developing for the cloud-based portion of AI features coming to iOS 18, including the revamped Siri. I hope everyone else copies Apple as we march towards increasingly personal AI experiences.

Personal AI, artificial intelligence that knows everything about you, is the future. I’d be willing to trust AI with personal information and access to my computers only if I knew all the data exchanges remained private.

That also means the company developing the AI doesn’t get to keep my information or use it for training. It’s why I’ve said more than once that I’m willing to pay for top AI features rather than get them for free.

With the iPhone, I know I’m paying for all software features. That includes iOS as a whole, free software updates, access to iMessage, future AI features, and security and privacy.

A report over the weekend detailed the various AI features coming to iOS 18. I said at the time that Apple has a big chance of outperforming rivals when it comes to the adoption of AI features.

It’s not just because Apple has a massive installed base of iPhones that will run at least a portion of the upcoming AI features in iOS 18. It’s because many of the AI features Apple will deploy run locally on the iPhone and are built into existing apps, making them easy to use.

That’s also great news for privacy. On-device AI is secure, private AI: no data leaves the iPhone for Apple’s servers.

The same report said that Apple’s next operating systems will determine whether an AI feature can run locally or has to be offloaded to Apple’s servers. Apple would also have to explain at WWDC how that data transfer is secured and how it’ll protect user privacy for cloud-based AI.
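To make that on-device-versus-cloud split concrete, here’s a minimal Swift sketch of how such a routing decision could work. Every type, field, and threshold below is a hypothetical stand-in for the behavior the report describes; nothing here is Apple’s actual API.

```swift
// Hypothetical sketch: how an OS-level router might decide whether an AI
// request runs on-device or gets offloaded to the cloud. All names and
// numbers are illustrative, not Apple's.

enum ExecutionTarget {
    case onDevice
    case privateCloud
}

struct AIRequest {
    let estimatedModelMemoryMB: Int      // rough footprint of the model needed
    let needsLargeFoundationModel: Bool  // true if only a server-class model will do
}

struct DeviceCapabilities {
    let availableMemoryMB: Int
    let supportsOnDeviceInference: Bool
}

func route(_ request: AIRequest, on device: DeviceCapabilities) -> ExecutionTarget {
    // Prefer on-device execution: the data never leaves the iPhone.
    if device.supportsOnDeviceInference,
       !request.needsLargeFoundationModel,
       request.estimatedModelMemoryMB <= device.availableMemoryMB {
        return .onDevice
    }
    // Otherwise fall back to the secured cloud path described below.
    return .privateCloud
}

// Example: a small summarization request on a capable device stays local.
let device = DeviceCapabilities(availableMemoryMB: 4096, supportsOnDeviceInference: true)
let request = AIRequest(estimatedModelMemoryMB: 1500, needsLargeFoundationModel: false)
print(route(request, on: device))  // prints: onDevice
```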

In that report, Bloomberg’s Mark Gurman said that Apple would tout the security of the M-series chips that will power the data centers handling the AI requests. That’s not good enough, however. We need more details, and that’s where The Information’s new report comes in (via MacRumors).

Apple will supposedly build “black boxes” of user data when processing AI requests in the cloud. The AI servers will run on M2 Ultra and M4 chips. The Secure Enclaves in these chips will let Apple “help isolate the data being processed on its servers so that it can’t be seen by the wider system or Apple.”

Apple will reportedly go for a “confidential-computing approach” it’ll surely describe in great detail during the event:

With the confidential-computing approach, Apple will be able to handle processing of AI-related data in the cloud while making it extremely difficult for hackers to gain access to the data even in the event of a data breach. It would also reduce Apple’s burden of having to hand over personal data from its servers in the event of a government or law enforcement request.
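To show what “can’t be seen by the wider system or Apple” could mean in practice, here’s a rough Swift sketch of the confidential-computing idea using Apple’s CryptoKit: the request is encrypted so that only a key held inside the server’s secure hardware can decrypt it. The key handling is deliberately simplified (a real deployment would establish keys through attestation and key agreement, not a shared constant); treat it as an illustration of the concept, not Apple’s implementation.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for a key that lives only inside the server's Secure Enclave.
let enclaveKey = SymmetricKey(size: .bits256)

do {
    // Client side: seal the AI request before it ever leaves the device.
    let request = Data("Summarize my unread messages".utf8)
    let sealed = try AES.GCM.seal(request, using: enclaveKey)

    // Everything outside the enclave, including the host OS, sees only ciphertext.
    print("Ciphertext bytes:", sealed.ciphertext.count)

    // Inside the enclave: decrypt, process the request, then discard the plaintext.
    let plaintext = try AES.GCM.open(sealed, using: enclaveKey)
    print(String(decoding: plaintext, as: UTF8.self))
} catch {
    print("Crypto error:", error)
}
```

The point of the design is that even a breach of the surrounding servers yields only ciphertext, which is exactly the property the report attributes to Apple’s approach.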

Even if the report is way off, I’m still convinced Apple will deliver strong privacy standards for AI features. This could further help it differentiate itself from rivals and even surpass some of them.

I’m not saying that OpenAI, Microsoft, Google, and others lack strong security for the cloud-based AI features they offer. But Apple will force them to match its offerings. They’ll have to be even more transparent about how user data is handled and stored.

Again, the endgame here is personal AI; all these companies are building such experiences. Whether it’s ChatGPT, Copilot, or Gemini, we’re starting to get more advanced assistant features from these products.

Finally, there’s one additional benefit from the confidential-computing approach that Apple is pursuing. Apple reportedly plans to offload more processing power to the cloud for future wearable devices while ensuring the data remains secure and private. This could help Apple develop thinner and lighter Vision Pro versions and the AR glasses I’m waiting for.

Chris Smith, Senior Writer

Chris Smith has been covering consumer electronics ever since the iPhone revolutionized the industry in 2008. When he’s not writing about the most recent tech news for BGR, he brings his entertainment expertise to Marvel’s Cinematic Universe and other blockbuster franchises.

Outside of work, you’ll catch him streaming almost every new movie and TV show release as soon as it's available.