Technology to manipulate video in such ways was once extremely expensive and required teams of specialists, but today deepfakes can be made by amateurs on their home computers. Snapchat and Facebook Messenger filters can now overlay different cartoon faces and other effects on someone’s face in real time.
In January of 2018 someone took a video of actress Amy Adams singing “I Will Survive” and swapped her face for Nicolas Cage’s.875 Then in January 2019 someone took a segment of Jennifer Lawrence speaking with reporters backstage after the Golden Globe awards and put Steve Buscemi’s face in place of hers. The video was so bizarre yet realistic-looking that it became the most viral deepfake since Jordan Peele’s Obama video, and introduced the term “deepfake” to a much wider audience.876 A few days later Stephen Colbert had Steve Buscemi on as a guest and asked him if he’d seen the video. Buscemi joked that he had “never looked better,” but underneath the laughs appeared to be a concern about what this technology was now capable of.877
In June 2019 a deepfake of Mark Zuckerberg was posted online showing him giving what appears to be an interview with CBS News, in which he says, “Imagine this for a second: One man, with total control of billions of people’s stolen data, all their secrets, their lives, their futures. I owe it all to Spectre. Spectre showed me that whoever controls the data, controls the future.”878
It was a publicity stunt for a futuristic art and technology exhibit in the UK, but it was also meant to serve as a warning about the problems this technology may cause in the near future. CBS tried to get the video removed from Facebook because the deepfake was made from an interview Zuckerberg gave to CBS News and “violated their trademark.”879 Facebook wrestled with whether or not to remove the deepfake but chose not to take any action. Still, the video’s existence sparked a difficult conversation, which is exactly what its makers intended.
The “Spectre” exhibit also commissioned a deepfake of Kim Kardashian which, unlike the obviously fake Zuckerberg one, looked and sounded extremely realistic. In it she appears to brag about the power social media companies have over their users’ data, concluding, “I feel really blessed because I genuinely love the process of manipulating people online for money.”880
Needless to say, she was not happy about it, and tried to have the video removed by filing copyright complaints against social media accounts that posted it.881 But this kind of satire video is the least of celebrities’ concerns.
Deepfake Porn
Just as many early Internet entrepreneurs were quick to use the new technology to share porn (allowing people to access it from their home computers instead of having to go out and buy magazines or VHS tapes from some seedy adult video store), one of the early uses of deepfake technology was to make fake porn videos depicting famous celebrities like Gal Gadot (Wonder Woman), Daisy Ridley (Star Wars), and Scarlett Johansson (The Horse Whisperer).
Celebrity deepfake porn videos were soon banned by PornHub882 and by Reddit, where users had been posting clips they made of their favorite actresses.883 While most of the videos weren’t being passed off as actual sex tapes, their creation obviously caused concern for the actresses whose likenesses now appear in realistic-looking porn videos.884
Another concern is that since the software to create such fakes is widely available online, people could make fake sex tapes of someone in an attempt to extort money, threatening to post the fakes online if the victim doesn’t pay up. Or a scorned ex-lover could create deepfakes and post them online in order to “get back” at the person who rejected him.885
Information Warfare
When the Bush administration was planning for the invasion of Iraq in 2003, the CIA reportedly came up with the idea to create a fake video appearing to be Saddam Hussein having sex with a teenage boy. “It would look like it was taken by a hidden camera. Very grainy, like it was a secret videotaping of a sex session,” a CIA official later admitted to the Washington Post.886
The CIA also reportedly discussed making a fake video appearing as if Osama bin Laden and his lieutenants were sitting around a campfire drinking alcohol and talking about their “conquests with boys” as well, but another former CIA official with knowledge of the plan said, “Saddam playing with boys would have no resonance in the Middle East — nobody cares. Trying to mount such a campaign would show a total misunderstanding of the target. We always mistake our own taboos as universal when, in fact, they are just our taboos.”887
He was referring to the practice of “bacha bazi,” an Afghan term meaning “boy play” that refers to sexual relationships between older men and young boys, often orphans or boys from very poor families, who are used as sex slaves by wealthy and powerful Afghan men.888 U.S. soldiers were reportedly told to ignore such abuse because it is considered part of the culture in the region.889 This abomination is a whole other issue, but the point is that the CIA actually proposed making a fake video portraying Saddam Hussein as a pedophile, thinking it would incite people to rise up and overthrow him, because if such a video were real, people in a civilized culture would do just that.
Fake Photos
Nvidia, the graphics card company, has created AI so powerful that it can automatically change the weather in video footage, making a clip of a car driving down a road on a sunny day appear as if it were shot in the middle of winter, with a few inches of snow on the ground and the leaves missing from the trees.890 The same technology can take photos of cats or dogs and make them look like a different breed, and can change people’s facial expressions from happy to sad, or anything in between.891
Nvidia’s AI can even generate realistic pictures of people who don’t actually exist, learning facial features from photos of real people and combining them into composites that are almost impossible to identify as fakes.892 The website ThisPersonDoesNotExist.com uses this technology to display a different fake photo every time you visit it, most of them looking like HD photos of ordinary people.
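Face generators like this are built on generative adversarial networks (GANs): a “generator” learns to produce samples while a “discriminator” learns to tell them apart from real data, each improving against the other until fakes become hard to distinguish. As a rough illustration of that adversarial idea only (not Nvidia’s actual system), the toy sketch below trains the smallest possible GAN in Python, where the “data” is just numbers drawn from a bell curve rather than photos; all parameters and numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

REAL_MEAN, REAL_STD = 4.0, 1.25   # the "real data" distribution to imitate

# Generator g(z) = a*z + b and discriminator D(x) = sigmoid(w*x + c):
# one-parameter-deep stand-ins for the two neural networks in a GAN.
a, b = 0.1, 0.0
w, c = 0.0, 0.0
lr, batch = 0.03, 64

for step in range(5000):
    real = rng.normal(REAL_MEAN, REAL_STD, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    p_r = sigmoid(w * real + c)
    p_f = sigmoid(w * fake + c)
    w -= lr * (np.mean((p_r - 1) * real) + np.mean(p_f * fake))
    c -= lr * (np.mean(p_r - 1) + np.mean(p_f))

    # Generator step: push D(fake) toward 1 (non-saturating GAN loss).
    p_f = sigmoid(w * fake + c)
    grad_out = (p_f - 1) * w          # gradient w.r.t. each fake sample
    a -= lr * np.mean(grad_out * z)
    b -= lr * np.mean(grad_out)

# The trained generator now produces samples resembling the real data.
samples = a * rng.normal(0.0, 1.0, 10000) + b
```

In a real system like Nvidia’s, the generator outputs millions of pixels instead of one number, but the adversarial training loop has the same shape.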
AI can now create 3D models of people from just a few photographs, and while it may be fun to create a character in your favorite video game that looks just like you, the capacity for nefarious abuse of this technology is vast.
Fake Audio
In November 2016, Adobe (the creator of Photoshop) demonstrated what it called Adobe Voco, or Photoshop-for-voices, which can generate realistic-sounding audio, making it sound like someone said something they never actually said. The software works by inputting samples of someone’s voice; it can then create fake audio files of that same voice saying whatever is typed onto the screen.893
Dr. Eddy Borges Rey, a professor at the University of Stirling, said, “It seems that Adobe’s programmers were swept along with the excitement of creating something as innovative as a voice manipulator, and ignored the ethical dilemmas brought up by its potential misuse.”894
He continued, “Inadvertently, in its quest to create software to manipulate digital media, Adobe has [already] drastically changed the way we engage with evidential material such as photographs. This makes it hard for lawyers, journalists, and other professionals who use digital media as evidence.”895 Google has created similar software called WaveNet, which generates realistic-sounding human speech by modeling samples of people actually talking.896
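Neither Adobe nor Google has published exactly how these tools work internally, but the basic workflow they demonstrated (enroll samples of a voice, then type text to synthesize) can be illustrated with a toy concatenative sketch. In the hypothetical example below, short sine tones stand in for recorded snippets of someone’s voice, and typed words are stitched together with brief crossfades at the joins; real systems like WaveNet instead generate the waveform with a neural network, and every name and number here is illustrative.

```python
import numpy as np

SAMPLE_RATE = 16000  # audio samples per second

def fake_recording(freq, dur=0.2):
    """Stand-in for a recorded voice snippet: a sine tone at the given pitch."""
    t = np.linspace(0, dur, int(SAMPLE_RATE * dur), endpoint=False)
    return 0.5 * np.sin(2 * np.pi * freq * t)

# "Enrollment": snippets of the target voice, indexed by word.
voice_bank = {
    "hello": fake_recording(220),
    "world": fake_recording(330),
    "goodbye": fake_recording(262),
}

def synthesize(text, bank, fade=0.01):
    """Concatenate stored snippets for the typed words, crossfading each join."""
    n_fade = int(SAMPLE_RATE * fade)
    ramp = np.linspace(0.0, 1.0, n_fade)
    out = np.zeros(0)
    for word in text.lower().split():
        clip = bank[word].copy()
        if out.size >= n_fade:
            clip[:n_fade] *= ramp           # fade the new clip in
            out[-n_fade:] *= ramp[::-1]     # fade the old tail out
            out[-n_fade:] += clip[:n_fade]  # overlap-add the join
            out = np.concatenate([out, clip[n_fade:]])
        else:
            out = np.concatenate([out, clip])
    return out

audio = synthesize("hello world", voice_bank)
```

A word-level snippet bank like this would sound robotic; the point of tools like Voco and WaveNet is that statistical models of the enrolled voice can produce words the speaker never recorded at all, which is exactly what worries the experts quoted above.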