Months after arriving in the US, Apple’s communication safety feature for the Messages app has begun rolling out internationally. The feature is designed to protect children from viewing or sharing images that contain nudity in Messages.
When Messages detects that a child is about to view or send a photo containing nudity, it automatically blurs the image on the device. The app then offers “guidance and age-appropriate resources” to help the child make a safe choice, including the option to contact a trusted adult.
Apple’s communication safety feature goes global
Apple announced this week that the communication safety feature is now rolling out to the Messages app on iOS, iPadOS, and macOS devices for users in the UK, Canada, New Zealand, and Australia. Exact availability dates for each region are unclear, though according to a report from The Guardian, the feature “will soon hit British iPhones.”
The feature debuted in iOS 15.2 last December in the United States.
Once the communication safety feature arrives, you’ll need to turn it on manually. Here are the steps to follow (a short developer-side note comes after the list):
- On your iPhone, iPad, or iPod touch, go to Settings > Screen Time. On a Mac, choose Apple menu > System Preferences, then click Screen Time. (If you haven’t already turned on Screen Time for the child, turn it on with parental controls first.)
- Tap the name of a child in your family group.
- Then tap Communication Safety, and tap Continue.
- Turn on Check for Sensitive Photos. You may need to enter the Screen Time passcode for the device.
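As an aside for developers: there is no public API for toggling Communication Safety itself; the switch lives only in Settings. The nearest developer-facing surface is Apple’s Screen Time API, and the sketch below shows how an app asks for Screen Time parental-control authorization using the FamilyControls framework (iOS 15 and later). It assumes an app that carries the Family Controls entitlement, and it is illustrative context only, not a way to enable the feature above.

```swift
import FamilyControls

// Minimal sketch: ask the user for Screen Time parental-control
// authorization. Requires the com.apple.developer.family-controls
// entitlement. This gates Screen Time features generally; it does NOT
// toggle Communication Safety, which has no public API.
func requestScreenTimeAuthorization() {
    AuthorizationCenter.shared.requestAuthorization { result in
        switch result {
        case .success:
            print("Screen Time authorization granted")
        case .failure(let error):
            print("Screen Time authorization failed: \(error)")
        }
    }
}
```

In short, the toggle described in the steps above lives solely in Settings; apps can participate in Screen Time more broadly only after the user approves a request like this one.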
Apple initially planned to have the feature automatically notify a trusted contact when a child received or attempted to send a photo containing nudity, but the company backtracked after critics pointed out several ways the system could be abused. Now, the feature always leaves it up to the child whether to reach out to their trusted contact.
Other new features for Apple device owners
Beyond the communication safety feature, Apple also introduced a feature for Siri, Spotlight, and Safari searches that provides “additional resources to help children and parents stay safe online and get help with unsafe situations.” For example, if you ask Siri how to report child exploitation, it will now point you directly to resources for filing a report.
This feature is also coming to the UK, Canada, New Zealand, and Australia.
Apple also announced a third feature last summer that would have automatically scanned photos for child sexual abuse material (CSAM) before they were uploaded to iCloud. Privacy advocates were up in arms over that feature, arguing that it would create a backdoor that could put Apple device owners at risk. Apple has since delayed the feature indefinitely, and it’s unclear if or when it will ever launch.
More iPhone coverage: Visit our iPhone 14 guide for the latest iPhone news.