Ahead of tomorrow’s iPhone X launch, all the attention is on the TrueDepth camera sensor and Face ID, Apple’s new facial recognition system. Most of the discussion surrounds how the new unlocking system matches up to Touch ID, the fingerprint scanner it’s replacing. But several privacy groups are saying that TrueDepth’s real impact is going to be far more nefarious than anyone thinks.
According to Reuters, Apple will permit developers to record and store facial scan data from the TrueDepth camera, opening the door to all kinds of uses (and mis-uses) of the system.
The report says that “Apple allows developers to take certain facial data off the phone as long as they agree to seek customer permission and not sell the data to third parties,” which is a departure from the line that Apple has been pushing since it unveiled the iPhone X.
When it comes to Face ID, Apple is taking a hard line on privacy and security. The facial scans the phone uses to authenticate users are stored on the device and strongly encrypted — meaning that the NSA (or anyone else) simply can’t access them. But those rules don’t seem to apply to data collected by third-party apps.
“App makers who want to use the new camera on the iPhone X can capture a rough map of a user’s face and a stream of more than 50 kinds of facial expressions,” Reuters reports. “This data, which can be removed from the phone and stored on a developer’s own servers, can help monitor how often users blink, smile or even raise an eyebrow.”
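That “stream of more than 50 kinds of facial expressions” corresponds to ARKit’s face-tracking data, which hands developers a per-frame set of blend-shape coefficients — values between 0.0 and 1.0 for expressions with names like eyeBlinkLeft or mouthSmileLeft. As a rough illustration of how easily that stream becomes behavioral data, here is a minimal sketch; the frame values, the threshold, and the `count_expressions` helper are invented for this example (a real app would read `ARFaceAnchor.blendShapes` in Swift on the device).

```python
# Illustrative sketch only: simulated per-frame blend-shape coefficients,
# modeled on the 0.0-1.0 values ARKit's face tracking reports each frame.
# The key names mirror ARKit's blend-shape names (eyeBlinkLeft,
# mouthSmileLeft, browInnerUp), but these frames and the helper below
# are invented for the example, not Apple API.
frames = [
    {"eyeBlinkLeft": 0.9, "mouthSmileLeft": 0.1, "browInnerUp": 0.0},
    {"eyeBlinkLeft": 0.1, "mouthSmileLeft": 0.8, "browInnerUp": 0.2},
    {"eyeBlinkLeft": 0.1, "mouthSmileLeft": 0.7, "browInnerUp": 0.9},
]

def count_expressions(key, frames, threshold=0.5):
    """Count frames where an expression coefficient crosses a threshold --
    the kind of blink/smile/eyebrow monitoring Reuters describes."""
    return sum(1 for f in frames if f.get(key, 0.0) > threshold)

print(count_expressions("eyeBlinkLeft", frames))    # -> 1 (blinks)
print(count_expressions("mouthSmileLeft", frames))  # -> 2 (smiles)
```

A few lines of this sort, run over frames captured 60 times a second, is all it would take to turn raw face data into an engagement log on a developer’s server.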
The possibilities are virtually endless. For example, app makers could use the facial data to detect when users are actively looking at the screen, and refuse to show content until they’ve stared at an ad for at least 10 seconds. At the very least, app makers would be able to work out exactly how long users are interacting with an app, down to the millisecond.
“The privacy issues around the use of very sophisticated facial recognition technology for unlocking the phone have been overblown,” an ACLU spokesperson told Reuters. “The real privacy issues have to do with the access by third-party developers.”

Adguard, an ad-blocking company, suggests that “brands learn to track and measure personal emotional responses to ads…but they also learn to use for ad targeting the emotions people already have. Facebook, for example, was caught targeting teens that feel vulnerable, ‘stressed’, ‘worthless’ or ‘insecure’ and selling them goods that could be ‘confidence boosters’.” Facial recognition could take that to the next level.