We just can’t help it. During a video call over FaceTime or a similar service, our eyes tend to drift away from our conversational partner’s and toward the one place they shouldn’t: the screen. After all, why would we stare at the camera when our partner’s face is right there on the display?

Interestingly, the issue stems from the placement of the front camera on current iPhones, which prevents users from making direct eye contact with their conversational partner over FaceTime. To remedy its own design flaw, Apple is introducing a new feature called FaceTime Attention Correction in the upcoming iOS 13 update for the iPhone XS and XS Max.

The feature makes you appear to be looking at the camera (and at your partner) throughout a FaceTime call. It uses real-time image manipulation to keep your eyes fixed on the camera even when they are actually pointed at the screen. It isn’t perfect, though: your eyes can appear warped if an object passes between them and the phone’s camera. Then again, how often does that happen? Check out the demonstration of the feature in the image above by Twitter user Will Signon.
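Apple hasn’t documented how Attention Correction works internally, but the basic ingredient of any gaze-correction effect is locating the eyes in each camera frame before warping them to point at the lens. As a minimal, purely illustrative sketch (not Apple’s implementation), here is how an app could find the eye regions per frame using the Vision framework:

```swift
import Vision
import CoreGraphics

// Illustrative sketch only: Apple's actual technique is undocumented.
// This shows the kind of per-frame eye detection a real-time
// gaze-correction effect would need before warping the eye regions
// so they appear to look into the camera.
func eyeBoundingBoxes(in pixelBuffer: CVPixelBuffer,
                      imageSize: CGSize) throws -> [CGRect] {
    let request = VNDetectFaceLandmarksRequest()
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try handler.perform([request])

    var boxes: [CGRect] = []
    for face in request.results as? [VNFaceObservation] ?? [] {
        for eye in [face.landmarks?.leftEye, face.landmarks?.rightEye] {
            guard let eye = eye else { continue }
            // Convert the eye landmark points into image coordinates,
            // then take their bounding box as the region to re-render.
            let points = eye.pointsInImage(imageSize: imageSize)
            let xs = points.map { $0.x }, ys = points.map { $0.y }
            guard let minX = xs.min(), let maxX = xs.max(),
                  let minY = ys.min(), let maxY = ys.max() else { continue }
            boxes.append(CGRect(x: minX, y: minY,
                                width: maxX - minX, height: maxY - minY))
        }
    }
    return boxes
}
```

A production effect would then warp only those regions toward the camera’s viewpoint, which is also why an object crossing in front of the eyes can briefly distort them.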

FaceTime Attention Correction can be tried out now in the iOS 13 beta. As mentioned, it is officially planned only for the iPhone XS and XS Max, though Reddit users report that the beta version also works on the iPhone XR and iPad Pro. iOS 13 itself is expected to be released this fall.