A new feature in the third beta of iOS 13 has begun attracting attention because it promises to bring some augmented reality magic to your FaceTime calls. Specifically, it addresses the way many of us on calls appear to the other party to be looking down at the screen instead of at the other person.
The new feature is called “FaceTime Attention Correction,” and it promises to bring a new level of intimacy (digitally altered though it may be) to your FaceTime sessions. Basically, your gaze will be automatically corrected so that it looks like you’re staring straight ahead, actually at the person, instead of looking down.
Based on early reactions, many users appear to be taken aback, and thoroughly impressed, by how good a job the attention correction does:
Okay, just FaceTime’d with @WSig and this actually works. Looking at him on-screen (not at the camera) produces a picture of me looking dead at his eyes like I was staring at the camera. This is insane. This is some next-century shit.
— Mike Rundle (@flyosity) July 2, 2019
It’s not yet clear which devices the new feature will support. One report has said it will only be available on the iPhone XS and XS Max, though it’s certainly possible that list could expand to other devices as iOS 13 continues its rollout.
Regardless, it’s a safe bet the feature will be well received by the average FaceTime user. A feeling of connectedness with the person you’re calling is pretty much the whole point of FaceTime, and an AR-based tweak that makes the other person seem to hold your gaze will only add to that sense of presence.
It’s also unclear whether the feature will correct the gaze when multiple people are on one end of a FaceTime call.
Guys – "FaceTime Attention Correction" in iOS 13 beta 3 is wild.
Here are some comparison photos featuring @flyosity: https://t.co/HxHhVONsi1 pic.twitter.com/jKK41L5ucI
— Will Sigmon (@WSig) July 2, 2019
The feature is also expected to show up in the forthcoming iOS 13 public beta. Here, meanwhile, is another look at how it works:
How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.
Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN
— Dave Schukin 🤘 (@schukin) July 3, 2019
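Apple hasn’t published how FaceTime Attention Correction is implemented, but the tweet above points at ARKit’s face-tracking data. As a rough, hypothetical sketch of the kind of per-frame information ARKit exposes (this is the publicly documented `ARFaceAnchor` API, not Apple’s actual FaceTime code), a TrueDepth-equipped device can report where each eye is and where the user is looking:

```swift
import ARKit

// Hypothetical sketch: read the per-frame face data that a gaze-correction
// effect could build on. This is NOT Apple's FaceTime implementation; it only
// shows the eye/gaze data ARKit's face tracking makes available.
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (e.g. iPhone XS/XS Max).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // The point the user is looking at, in face-anchor coordinates.
            let gaze = face.lookAtPoint
            // Rigid transforms for each eyeball. A correction pass could
            // re-aim these toward the camera and warp the eye region of the
            // video frame accordingly (the "warping" visible in the demo).
            let leftEye = face.leftEyeTransform.columns.3
            let rightEye = face.rightEyeTransform.columns.3
            print("gaze: \(gaze), left eye: \(leftEye), right eye: \(rightEye)")
        }
    }
}
```

The warped line across the eyes and nose in Schukin’s demo is consistent with this approach: a depth-aware mesh of the face is adjusted so the rendered eyes point at the camera, which distorts anything overlapping that region of the frame.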