In the months leading up to the iPhone X unveiling this past September, there were a number of conflicting reports about the overall design of the device. While most everyone agreed that the physical home button was being kicked to the curb in favor of an edgeless display, it wasn’t entirely clear if Apple would replace it with a facial recognition scheme (Face ID) or if it would embed a Touch ID sensor into the display itself.
When the iPhone X with Face ID was officially introduced, it didn’t take long for a new narrative to emerge, namely that Apple opted for Face ID very late in the development cycle simply because it couldn’t figure out a way to incorporate a fingerprint sensor into an OLED display at scale.
In reality, Apple’s decision to go all-in with Face ID was made well over a year before the iPhone X introduction. The reason, quite simply, is that Apple believed it to be a superior alternative to Touch ID on a number of fronts.
As John Gruber noted a few months back:
There were, of course, early attempts to embed a Touch ID sensor under the display as a Plan B. But Apple became convinced that Face ID was the way to go over a year ago. I heard this yesterday from multiple people at Apple, including engineers who’ve been working on the iPhone X project for a very long time. They stopped pursuing Touch ID under the display not because they couldn’t do it, but because they decided they didn’t need it.
Incidentally, Apple’s design decision with the iPhone X hasn’t stopped other handset makers from trying to develop devices with embedded fingerprint sensors. And while Face ID still has a few kinks to work out, I think it’s fair to say that going with Face ID over an embedded version of Touch ID was the right call.
Case in point: Marques Brownlee recently shared a video of a fingerprint sensor embedded into a display. The technology, which is called Clear ID, comes from Synaptics and is being demoed in a Vivo X20 Plus UD device.
So how does the technology measure up? Well, it does seem to work quite well, but there are a few downsides worth noting.
Zaib Ali of iOS Hacker explains:
First, it requires the device to have an OLED display; otherwise it won’t work. Then, the sensor that is placed under the screen is visible at certain angles and lighting conditions.
While these issues may seem minor, a company as detail-oriented as Apple likely couldn’t bear the thought of a sensor being visible beneath the screen.
Further, Ali cites an excerpt from the video where we learn that for the sensor to work properly, it must first shine a visible pulse of light onto a user’s finger.
It has to be shining a light on your fingerprint for it to reflect back into the glass and read it. So when I put my finger on the indicator, which is basically a helper for guiding my finger to the right place, you see a light shine for a second, a sort of pulse that’s long enough for it to read the reflection of my fingerprint on that sensor.
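In other words, the flow is pulse first, read second. Here’s a rough Swift sketch of that sequence; every type and function name below is hypothetical, invented purely to illustrate the mechanism, not taken from Synaptics’ actual Clear ID API:

```swift
import Foundation

// NOTE: Everything below is a hypothetical illustration of the
// pulse-then-read flow described in the video, not real sensor code.

struct FingerprintImage {
    let pixels: [UInt8]
}

enum OpticalSensor {
    /// Step 1: briefly light up the patch of OLED directly above the sensor,
    /// so the ridges of the finger reflect light back through the glass.
    static func emitLightPulse(durationMs: Int) {
        print("Pulsing display region for \(durationMs) ms")
    }

    /// Step 2: read the reflected light with the sensor under the panel.
    static func captureReflection() -> FingerprintImage {
        // Stand-in for a real sensor read.
        FingerprintImage(pixels: Array(repeating: 0, count: 64 * 64))
    }
}

/// The overall flow: pulse, capture, then match against the enrolled print.
func authenticate(against enrolled: FingerprintImage) -> Bool {
    OpticalSensor.emitLightPulse(durationMs: 300) // the visible flash in the demo
    let scan = OpticalSensor.captureReflection()
    // A real implementation would run a minutiae-matching algorithm here;
    // this placeholder just checks that a capture of the expected size came back.
    return scan.pixels.count == enrolled.pixels.count
}
```

The key takeaway is that extra step up front: nothing can be read until the panel emits light, which is exactly the pulse visible in Brownlee’s video.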
Are these deal breakers? Not necessarily. But the entire user experience seems a bit more involved than seamlessly picking up an iPhone X and having it authenticate you immediately via Face ID.
Moreover, Face ID is far more reliable and secure than Touch ID. An Apple support document on Face ID reads in part: “The probability that a random person in the population could look at your iPhone X and unlock it using Face ID is approximately 1 in 1,000,000 (versus 1 in 50,000 for Touch ID).”
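To put those odds in perspective, here’s a quick back-of-the-envelope Swift sketch, purely illustrative, that uses Apple’s published figures to estimate the chance that at least one of n random strangers could produce a false match on a given phone:

```swift
import Foundation

// Apple's published false-match odds: ~1 in 1,000,000 for Face ID,
// ~1 in 50,000 for Touch ID.
let faceIDRate = 1.0 / 1_000_000
let touchIDRate = 1.0 / 50_000

/// Chance that at least one of `n` random strangers produces a false match:
/// 1 - (1 - rate)^n
func falseMatchChance(rate: Double, strangers n: Int) -> Double {
    1 - pow(1 - rate, Double(n))
}

for n in [1, 1_000, 10_000] {
    let face = falseMatchChance(rate: faceIDRate, strangers: n) * 100
    let touch = falseMatchChance(rate: touchIDRate, strangers: n) * 100
    print("\(n) strangers -> Face ID: \(String(format: "%.3f", face))%, "
        + "Touch ID: \(String(format: "%.3f", touch))%")
}
```

Run the numbers out to 10,000 strangers and Touch ID’s cumulative false-match chance climbs to roughly 18 percent, while Face ID’s stays around 1 percent.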
Is it possible that a future iteration of the iPhone will incorporate a Touch ID sensor into the display? Certainly, but it’s not likely to happen anytime soon. To this point, Apple executives have said that Face ID will be a cornerstone of the iPhone user experience for quite some time.