
Apple stops developing CSAM detection system for iPhone users

Published Dec 7th, 2022 1:56PM EST
Image: Apple


Last year, Apple announced that iCloud Photos would be able to detect Child Sexual Abuse Material (CSAM) in users’ photos by matching them against a database of known CSAM image hashes. While Apple said it wouldn’t see the photos themselves, since the matching would happen on-device, the plan drew heavy criticism from privacy and security researchers.

Now, after announcing the new Advanced Data Protection for iCloud, Apple software chief Craig Federighi has confirmed that the company will not roll out the CSAM detection system for iPhone users, as it has stopped developing the feature.

The information was confirmed in an interview with The Wall Street Journal‘s Joanna Stern. When Apple announced the CSAM detection system, privacy and security researchers warned that it could be misused by governments or hackers to gain access to sensitive information on the phone. Against that backdrop, Federighi explained Apple’s change of plans:

Mr. Federighi said Apple’s focus related to protecting children has been on areas such as communication and giving parents tools to protect children in iMessage. “Child sexual abuse can be headed off before it occurs,” he said. “That’s where we’re putting our energy going forward.”

For example, through its parental-control software, Apple can notify parents who opt in if nude photos are sent or received on a child’s device, but it will no longer develop a system to scan users’ photos for CSAM.

iCloud Advanced Data Protection. Image source: Apple Inc.

Apart from that, Apple today announced three important features coming to iPhone users in 2023: Advanced Data Protection for iCloud, which makes 23 data categories fully end-to-end encrypted; Security Keys for Apple ID, which let people use third-party hardware security keys as a second factor for two-factor authentication; and iMessage Contact Key Verification, which lets users confirm that the person they’re messaging on iMessage is who they claim to be and not an impostor.

“As customers have put more and more of their personal information of their lives into their devices, these have become more and more the subject of attacks by advanced actors,” said Craig Federighi, Apple’s senior vice president of software engineering, in an interview. Some of these actors are going to great lengths to get their hands on the private information of people they have targeted, he said.

José Adorno Tech News Reporter

José is a Tech News Reporter at BGR. He has previously covered Apple and iPhone news for 9to5Mac, and was a producer and web editor for Latin America broadcaster TV Globo. He is based out of Brazil.