A new feature in the third beta of iOS 13 has begun attracting attention because it promises to bring a bit of augmented reality magic to FaceTime calls. Specifically, it addresses the way many of us appear to the other party to be looking down at the screen instead of at the person we're talking to.
The new feature is called “FaceTime Attention Correction,” and it promises to bring a new level of intimacy (digitally altered though it may be) to your FaceTime sessions. In essence, your gaze is automatically adjusted so that you appear to be staring straight ahead, actually at the other person, rather than down at the screen.
Based on early reactions to the new feature, many users appear to be taken aback, and thoroughly impressed, by how good a job the attention correction does.
There does seem to be a lack of clarity at this point about which devices the new feature will work with. One report has said it will only be available on the iPhone XS and XS Max, though it’s certainly possible that could expand to other devices as iOS 13 continues its rollout.
Regardless, it’s a safe bet the feature will be well received by the average FaceTime user. A feeling of connectedness with the person you’re calling is pretty much the whole point of using FaceTime, and this AR-based tweak, which makes the other person appear to hold your gaze, will only add to that sense of being present and connected.
Also unclear at this point is whether the feature will correct gaze when multiple people are on one end of a FaceTime call.
The feature is also expected to show up in the forthcoming iOS 13 public beta.