I just read and enjoyed this:
The latest beta of iOS 13 came out, and there’s a feature called FaceTime Attention Correction which, on video calls, silently manipulates the image of your face so that you’re looking the other person directly in the eye. Which on first blush to me sounded cool (eye contact is good! Maybe?) but on further thought made me do a weird face.
Read “A lengthy ramble through many responses to that FaceTime Attention Correction tweet”
A good ramble, with some interesting links to earlier research on gaze detection that I wasn't familiar with.
My gut reaction to the news of FaceTime manipulating video call imagery to redirect the gaze of callers was a feeling of uneasiness. Which is interesting, because I don't find the idea of a "beauty filter" that removes skin blemishes and the like particularly irritating. Perhaps it's because we've become accustomed to image manipulation that changes the appearance of a person in advertising and media, whereas the novelty of manipulating the behavior of a person is still troubling (cf. deepfakes). Whatever the reason, these are useful reminders that digital communication is always mediated (even when it doesn't feel that way, as in the case of video calling), as Matt Webb points out.