Deepfake (a portmanteau of "deep learning" and "fake"[1]) is a technique for human image synthesis based on artificial intelligence. It is used to combine and superimpose existing images and videos onto source images or videos using a machine learning technique called a "generative adversarial network" (GAN).[2] The combination of the existing and source videos results in a video that can depict a person or persons saying things or performing actions that never occurred in reality. Such fake videos can be created to, for example, show a person performing sexual acts they never took part in, or can be used to alter the words or gestures a politician uses to make it look like that person said something they never did.

Because of these capabilities, deepfakes have been used to create fake celebrity pornographic videos or revenge porn.[3] Deepfakes can also be used to create fake news and malicious hoaxes.[4][5]
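For the curious, here's a rough idea of what that "generative adversarial network" actually does. What follows is just a minimal sketch in PyTorch (the article doesn't name a framework, so that's an assumption) using toy 2-D points instead of face images: a generator learns to crank out samples that a discriminator can't tell apart from the real thing, which is the same adversarial idea deepfake tools apply to faces.

```python
# Minimal GAN sketch (illustrative only -- not any specific deepfake tool).
# The generator maps random noise to fake samples; the discriminator tries
# to separate real samples from fakes. Real deepfake pipelines use the same
# adversarial setup on face images instead of the toy 2-D points below.
import torch
import torch.nn as nn

torch.manual_seed(0)

# "Real" data: 2-D points from a shifted Gaussian (a stand-in for real images).
def real_batch(n=128):
    return torch.randn(n, 2) * 0.5 + torch.tensor([2.0, 2.0])

# Generator: random noise in, fake sample out.
generator = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))

# Discriminator: outputs the probability that a sample is real.
discriminator = nn.Sequential(nn.Linear(2, 32), nn.ReLU(),
                              nn.Linear(32, 1), nn.Sigmoid())

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

for step in range(2000):
    # Train the discriminator: real samples are labeled 1, generated ones 0.
    real = real_batch()
    fake = generator(torch.randn(128, 8)).detach()
    d_loss = (loss_fn(discriminator(real), torch.ones(128, 1)) +
              loss_fn(discriminator(fake), torch.zeros(128, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Train the generator: try to make the discriminator call its fakes real.
    fake = generator(torch.randn(128, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(128, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

# After training, generated points should cluster near the "real" mean (2, 2).
print("mean of generated points:", generator(torch.randn(1000, 8)).mean(dim=0))
```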

Of course Wikipedia focuses on the sex tape aspect of the whole thing. But this isn't that. Bill Hader, formerly of SNL and star of the great HBO comedy Barry, is great at impersonations. Adding a deepfake Arnold face to him while he's doing the impersonation on Conan only makes it better.
