[Image: Curtains mistaken for Ryan Gosling; original image supplied by Jomppe Vaarakallio, courtesy PetaPixel; click through to PetaPixel to see the Gosling’d image.]
While processing an image using AI-assisted software, a photographer named Jomppe Vaarakallio unexpectedly found actor Ryan Gosling’s face in the resulting image file. The software apparently mistook some window curtains, featuring just the right geometry of shade and folding, for the Canadian actor and thus inserted his face.
According to PetaPixel, this “shows you what happens when computer vision gets tripped up by what looks like a blurry face”—but, of course, it is also what happens when we put too much faith in pattern recognition as a viable form of analysis, whether it’s visual, textual, or otherwise.
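For what it's worth, you don't need proprietary portrait software to reproduce the basic failure mode. Here is a minimal sketch, assuming Python, OpenCV, and a hypothetical image file called curtains.jpg (this is not the tool from Vaarakallio's story), showing how an off-the-shelf face detector, with its sensitivity turned up, will happily report "faces" in folds of fabric:

```python
# A minimal sketch of face-detection false positives, assuming OpenCV
# (opencv-python) is installed and a hypothetical image "curtains.jpg".
# This is a generic illustration, not the software from the story.
import cv2

# Load OpenCV's stock frontal-face Haar cascade.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

img = cv2.imread("curtains.jpg")  # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Looser parameters raise sensitivity -- and the false-positive rate.
faces = cascade.detectMultiScale(gray, scaleFactor=1.05, minNeighbors=2)

# Draw a box around everything the detector thinks is a face.
for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 0, 255), 2)

print(f"Detected {len(faces)} 'faces'")  # curtain folds may well count
cv2.imwrite("curtains_detected.jpg", img)
```

The pattern-matcher has no notion of whether a face should be there; it only scores how face-like a patch of pixels looks, which is the whole problem.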
Like playing Led Zeppelin records backward in the 1970s and straining to hear subliminal messages pledging allegiance to Satan in the noise, we could feed all our photos through AI programs and see what secret scenes of celebrity rendezvous they uncover: famous faces hidden in tree leaves, carpets, and window shades, in clothes hanging inside closets and in the fur of distant animals. Use it to generate scenes in films and novels, like Blow-Up or The Conversation for an age of post-human interpretation.
In fact, I’m sure we’ll see the rise and widespread use of authoritarian AI analytics, fed a constant stream of images and audio recordings, finding crimes that never happened in the blur of a street scene or hearing things that were never said in a citywide wiretap (call it the Gosling Effect), resulting in people going to prison for the evidential equivalent of faces that were never really there.
(Spotted via @kottke.)