
Apple just announced a major change that has privacy advocates totally freaked out

Published Aug 8th, 2021 3:58PM EDT
Apple scanning photos. Image: naka/Adobe


If the thousands of security and privacy experts who’ve raised an outcry on social media over the past few days — and signed at least one letter calling for change — are correct, then Apple is about to make a staggeringly awful miscalculation. More specifically, they’re warning that a new feature set baked into the company’s software in the name of cracking down on one very specific, very horrible act (using iPhones in the service of child exploitation) will actually open the door to the very dystopian privacy nightmare that Apple’s own leaders have warned about for years. The target of this ire: the newly announced features that involve Apple scanning photos for content related to the exploitation of children. The company announced the features a few days ago, and they represent an attempt by Apple to operate within the security paradigm it created for its own hardware and software while also targeting people who use its tools to hurt kids.

By the way, let’s not forget a few facts about Apple as we take a deeper look at this. This controversy has been generated by the same company that promises “What happens on your iPhone, stays on your iPhone,” according to the verbatim text of an ad the company displayed in Las Vegas at CES a couple of years ago. And it’s also the same company that has made a number of compromises to persuade the oppressive Chinese regime to let it keep doing business in the country, a country that amounts to a surveillance state.

Apple scanning photos = privacy nightmare?

As news about what’s coming from the iPhone maker continues to circulate, meanwhile, so do the urgent warnings. Experts from the Electronic Frontier Foundation, as well as former NSA whistleblower Edward Snowden and Facebook’s former chief security officer Alex Stamos, have been sharing insight and important context necessary for the larger discussion here.

A quick recap about what Apple is planning (with the new features arriving in tandem with iOS 15, iPadOS 15, watchOS 8 and macOS Monterey): The Messages app, according to the company, will use “on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.”

Meanwhile, Apple’s iOS and iPadOS software will use new applications of cryptography to “help limit the spread of [child sexual abuse material] online, while designing for user privacy.”

Finally, Siri and Search updates will do a couple of new things along these same lines: they will intervene if a user searches for CSAM-related topics, and they’ll give parents and children information and help if they encounter unsafe situations.

Apple scanning photos from the Messages app

Here’s how Apple says the new Messages-related features will work. “The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.”

“When receiving this type of content,” Apple continues, “the photo will be blurred and the child will be warned.” The child will also be “presented with helpful resources, and reassured it is okay if they do not want to view this photo.” Also, the child can be told that, to make sure they are safe, their parents will get a message if they do view it. A similar protection kicks in if a child is the one trying to send a sexually explicit photo: the child will get a warning before sending the photo, and their parents can also receive a message.

Here’s one of the things giving privacy advocates pause. Again, from Apple: “Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit.”
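For those who want to picture how such a check might be wired up, here’s a minimal Python sketch, assuming a hypothetical classify_image function and an arbitrary threshold in place of the model Apple actually ships (which the company hasn’t published). The property it illustrates is the one Apple is leaning on: the analysis and the resulting blur/warn decisions all happen on the device, and nothing is sent to Apple.

```python
# Illustrative sketch only, not Apple's code. `classify_image` and the
# threshold below are assumptions standing in for the on-device ML model
# Apple describes; the point is that everything happens locally.

EXPLICIT_THRESHOLD = 0.9  # assumed cutoff; Apple has not published one

def classify_image(image_bytes: bytes) -> float:
    """Placeholder for the on-device model: returns a probability in [0, 1]."""
    return 0.0  # stub so the sketch runs; a real model would inspect the pixels

def handle_incoming_attachment(image_bytes: bytes, parental_alerts_on: bool) -> dict:
    score = classify_image(image_bytes)
    if score < EXPLICIT_THRESHOLD:
        return {"blur": False, "warn_child": False, "notify_parents": False}
    return {
        "blur": True,                          # photo stays blurred until the child opts in
        "warn_child": True,                    # child sees a warning plus helpful resources
        "notify_parents": parental_alerts_on,  # parents alerted only if the feature is enabled
    }
```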

In the words of Snowden, who’s taken to calling Apple’s signature mobile device the SpyPhone: “If they can scan for kiddie porn today, they can scan for anything tomorrow.”

CSAM Detection

Here, meanwhile, is more of what Apple says is coming. Again, this is from the company’s description of how all this works. “New technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC).

“… Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
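Put differently, the system fingerprints each photo and checks that fingerprint against a list of hashes of already-known images; it isn’t making a judgment about what a brand-new photo depicts. Here’s a rough Python sketch of that kind of lookup, using an ordinary cryptographic hash as a stand-in. Apple’s actual NeuralHash is a perceptual hash designed to survive resizing and re-encoding, and the real database is stored on the device in a blinded form it can’t read.

```python
import hashlib

# Minimal illustration of matching photos against a database of known-image
# hashes. This is NOT NeuralHash: SHA-256 is used here purely as a stand-in,
# and Apple's system additionally blinds the database and attaches encrypted
# "safety vouchers" to uploads rather than returning a plain yes/no.

def image_fingerprint(image_bytes: bytes) -> bytes:
    # Placeholder for a perceptual hash that tolerates resizing/re-encoding.
    return hashlib.sha256(image_bytes).digest()

def matches_known_database(image_bytes: bytes, known_hashes: set[bytes]) -> bool:
    # On-device lookup: is this photo's fingerprint in the provided database?
    return image_fingerprint(image_bytes) in known_hashes
```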

Among the privacy promises that Apple insists on here is the following. “Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content.”
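Threshold secret sharing is a long-established cryptographic idea: a secret is split into shares so that any t of them can reconstruct it, while fewer than t reveal nothing. The toy Python sketch below shows the classic Shamir construction just to make the concept concrete; it is not Apple’s implementation, and the numbers are arbitrary.

```python
import random

# Toy (t, n) threshold secret sharing -- Shamir's scheme -- to illustrate the
# general idea behind Apple's claim: the material needed to read the safety
# vouchers can only be assembled once at least t shares exist. This is a
# teaching sketch over a prime field, not Apple's implementation.

PRIME = 2**127 - 1  # field modulus (a Mersenne prime)

def make_shares(secret: int, t: int, n: int) -> list[tuple[int, int]]:
    # Random polynomial of degree t-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    # Lagrange interpolation at x = 0 recovers the secret, but only when
    # at least t shares are supplied; fewer shares reveal nothing about it.
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# Example: split a secret into 10 shares, any 3 of which recover it.
shares = make_shares(secret=123456789, t=3, n=10)
assert reconstruct(shares[:3]) == 123456789   # at or above the threshold: recovered
assert reconstruct(shares[:2]) != 123456789   # below the threshold: not recovered
```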

The company goes on to note that the threshold should provide an extremely high level of accuracy. And it supposedly ensures “less than a one in one trillion chance per year of incorrectly flagging a given account.”
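Apple hasn’t published the math behind that figure, but the shape of the argument is easy to illustrate: if each photo carries some tiny, independent chance of a false hash match, requiring many matches before an account can even be reviewed drives the account-level error rate down dramatically. The numbers in the back-of-the-envelope Python sketch below are assumptions chosen purely for illustration; they are not Apple’s.

```python
from math import exp, lgamma, log

# Back-of-the-envelope illustration only. The per-photo false-match rate `p`
# and threshold `t` are assumed values, not Apple's; the point is that a
# threshold turns a small per-photo error rate into a vanishingly small
# account-level one (the binomial tail probability of >= t false matches).

def log_binom_pmf(n: int, k: int, p: float) -> float:
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1.0 - p))

def prob_account_flagged(n_photos: int, p: float, t: int, terms: int = 200) -> float:
    # Sum the leading terms of the tail in log space; later terms are negligible.
    upper = min(t + terms, n_photos)
    return sum(exp(log_binom_pmf(n_photos, k, p)) for k in range(t, upper + 1))

# Assumed example: 20,000 photos uploaded in a year, a one-in-a-million
# false match per photo, and a threshold of 30 matching photos.
print(prob_account_flagged(20_000, 1e-6, 30))  # astronomically small
```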

Meanwhile, what about when the iCloud Photos account actually crosses that threshold mentioned above? In that case, Apple will manually review the report. The company will confirm there’s a match, disable the user’s account, and send a report on to the NCMEC. Users will also get the chance to appeal.

What’s next

Meanwhile, for anyone worried about the privacy implications of Apple scanning photos from users: following the initial outcry, the company has since shared a bit more about its plans.

First, these tools are coming only to the US at launch. Apple told at least one news outlet that it will roll this out globally on a country-by-country basis, and only once it has conducted a legal evaluation specific to each new country. However, Apple didn’t exactly help its cause by disseminating an NCMEC official’s message as part of an internal memo. The message to Apple employees who worked on this effort included a line denouncing all the criticism as “the screeching voices of the minority.”

“NCMEC also needs to dial it down a lot,” tweeted Matthew Green, who teaches cryptography at Johns Hopkins. “This is offensive. Your job is not to toss civil society and Apple’s customers under the bus.”

This thread from Stamos on the same subject, meanwhile, is also worth a read.

Andy Meek, Trending News Editor

Andy Meek is a reporter based in Memphis who has covered media, entertainment, and culture for over 20 years. His work has appeared in outlets including The Guardian, Forbes, and The Financial Times, and he’s written for BGR since 2015. Andy's coverage includes technology and entertainment, and he has a particular interest in all things streaming.

Over the years, he’s interviewed legendary figures in entertainment and tech that range from Stan Lee to John McAfee, Peter Thiel, and Reed Hastings.