The deepest fake: how new tech will test our belief in what we see
Originally posted by John Bailey @ smh.com.au
“President Trump is a total and complete dips---.”
Many may have voiced similar sentiments in the last few years, but you sit up and take notice when it's Barack Obama dropping this bomb direct to camera. A video released this month appears to show the former US president deploying this insult as part of a larger warning about the increasing sophistication of fake news.
The video itself, however, is as real as a one-legged unicorn.
The clip was created using a piece of software named FakeApp, and if you haven't heard of it, strap in. The freely downloadable program is artificial intelligence for dummies. Feed a few hundred images of a person into FakeApp and the program “learns” their features, which can then be superimposed onto another person's head. It's not mere cutting and pasting; it's more like a digital mask.
If you're slapping my face onto that of someone screaming, I'll appear to be screaming too. If they're writhing in pleasure, I'll look to be getting my jollies as well (more on that below). With enough training, the mask can be taught to do or say anything a human face can.
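The trick described above can be sketched in a few lines of toy linear algebra: one encoder shared by everyone learns features common to all faces, while a separate decoder is trained per person; swapping faces means encoding person A's frame and decoding it with person B's decoder. Everything below (the dimensions, the learning rate, the random stand-in "faces") is illustrative only, not FakeApp's actual architecture or code.

```python
import numpy as np

rng = np.random.default_rng(0)
face_dim, latent_dim, lr = 16, 4, 0.01

# One shared encoder, one decoder per person (hypothetical sizes).
E = rng.normal(0, 0.1, (face_dim, latent_dim))                     # shared encoder
D = {p: rng.normal(0, 0.1, (latent_dim, face_dim)) for p in "AB"}  # per-person decoders

# Toy "faces": random vectors standing in for image pixels.
faces = {p: rng.normal(0, 1, (50, face_dim)) for p in "AB"}

def loss(X, Dp):
    """Mean squared reconstruction error for one person's faces."""
    err = X @ E @ Dp - X
    return float((err ** 2).mean())

before = loss(faces["A"], D["A"])
for _ in range(500):
    for p in "AB":                       # alternate training on both people
        X = faces[p]
        err = X @ E @ D[p] - X           # reconstruction error
        E -= lr * X.T @ err @ D[p].T / len(X)     # gradient step, shared encoder
        D[p] -= lr * (X @ E).T @ err / len(X)     # gradient step, this decoder
after = loss(faces["A"], D["A"])

# The swap itself: encode person A's frame, decode with B's decoder,
# so A's expression comes out wearing B's face.
swapped = faces["A"] @ E @ D["B"]
```

The real software does the same thing with deep convolutional networks and thousands of training images per face, which is why it needs a few hundred photos of the target and hours of GPU time rather than a fraction of a second.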
The software has blindsided tech experts, who had assumed that this level of desktop-accessible deep learning was years, if not decades, away. The community tinkering with the scripts that eventually led to FakeApp consists of only a few hundred people around the world, mostly working out of sight of the media and entertainment industries at the vanguard of such technology.
Hollywood special effects houses have, of course, been banging out all kinds of computer-generated illusions over the years, but the crucial surprise of FakeApp is that it can sometimes achieve results more convincing than big-budget studio fare. Last year's superhero caper Justice League was jeered at after Superman actor Henry Cavill's real-life moustache was digitally erased, leaving a weird and obvious pale smear in its place. Post-FakeApp, a user took the original footage and gave the Man of Steel a more convincing shave job on a $500 second-hand computer.
FakeApp might be revolutionary, but like photography, film and home video before it, you could almost count the seconds between the technology's release and its first application to pornography. This is how the software has gained instant notoriety: it's now virtually synonymous with videos in which a porn star's face has been replaced by that of a celebrity. Wonder Woman star Gal Gadot was the first to receive this new form of abuse, which fits under the mantle of “involuntary pornography”. In less than six months since, scores of fellow celebrities have been subjected to the same mistreatment.
The creator of FakeApp is known only by the online handle Deepfakes, which has also become the shorthand term for the face-swapped videos enabled by the software.
A community devoted to churning out deepfakes quickly arose on website Reddit, and while there were a handful of safe-for-work videos posted – mostly inserting actor Nicolas Cage's mug into incongruous situations – the overwhelming majority were non-consensual celebrity porn. Targets ranged from the obvious figures of geek fantasy – Scarlett Johansson and Game of Thrones stars – to public figures such as Michelle Obama.
The backlash was swift. Within weeks of the community's creation, Reddit deleted its deepfakes page and all related content, citing the same rules against involuntary pornography that prohibit revenge porn. Around the same time, sites ranging from social media platforms Twitter and Discord to porn aggregator Pornhub announced similar bans on deepfakes content.
Deepfakes have even been condemned in the Australian parliament. In February a bill passed the Senate legislating penalties of up to $105,000 for individuals who share intimate images of others without consent, with a special provision bringing deepfakes under this category.
The bill also includes penalties of up to $525,000 for corporations guilty of spreading deepfakes – for Pauline Hanson this comes a decade after fake nude photos of her were splashed across newspapers.
Celebrity porn isn't the only moral violence FakeApp threatens to unleash. In the Not-Obama video described above, the face of the former president is being worn by Jordan Peele, writer and director of the Oscar-winning film Get Out. It's his way of warning us that "fake news" has just taken a stratospheric leap, with technologies such as FakeApp able to digitally erase the boundary between real and manufactured images.
At a time in which public trust in the media and politics is already under threat, the possibility that anything we view online could be a convincing bit of trickery concocted by some bad actor with a decent PC could further imperil the very sense of faith upon which democracies tend to rely.
Renee DiResta is one of the experts who recently advised the US Congress before its grilling of social media execs, and pinpointed the greatest danger FakeApp's reality-eroding technology poses while speaking on a recent panel hosted by the tech podcast IRL: “In the real world we go through life with the expectation of trust and we don't assume constantly that the person speaking to us is lying to us. That's because you can't make it through life if you assume that every interaction you have is false ... Society functions in part because of that expectation of truth.”
The anonymous creator of FakeApp is understandably publicity-shy, and hasn't responded to my attempts to secure an interview. One New York Times reporter did receive an email from someone purporting to be the original Deepfakes, who identified himself as a software developer in Maryland. He doesn't support the use of his software to create non-consensual pornography, he claimed, but went on to say: “I’ve given it a lot of thought and ultimately I’ve decided I don’t think it’s right to condemn the technology itself — which can of course be used for many purposes, good and bad.”
As with other AI applications, he continues, “it’s precisely the things that make them so powerful and useful that make them so scary. There’s really no limit to what you can apply it to with a little imagination.”
In the interests of journalistic rigour I decided to give the FakeApp program a spin myself. I was thinking of something harmless: swapping the faces of my cats, say, or finally giving my two-year-old a turn as a Beyonce back-up dancer. Red flags were thrown up from the outset, however.
When the FakeApp installer launched I was monitoring the internet traffic on my computer. Without going into the nerdy details, there were enough shady handshakes and frozen heartbeats to have me thinking twice about forging ahead. It's often said that in the internet age, if you're not paying for a product you are the product, and it's not too paranoid to wonder if Deepfakes' program comes at a hidden cost.
The same day, other users determined that at least one version of FakeApp included a crypto-mining function. The program already puts a heavy strain on a user's computer; who would notice if some of that number-crunching was quietly diverted from pasting an unsuspecting someone's face onto a porn star's body to digging for cryptocurrency, turning a tidy profit for whoever designed the thing?
Another version of FakeApp was apparently packaged with malware, and while Deepfakes himself has been quick to assure users that everything is on the level now, many are still alarmed that the software automatically updates itself with no option to switch the feature off. This isn't unlike handing your delivery person your house keys so they can drop around whenever they like.
It might be too late to stuff the deepfake genie back in the bottle. After Reddit booted its deepfakes community out into the cold, rogue sites began appearing that hosted the videos.
But just as the anonymous Deepfakes doesn't do much to foster trust in his creation, the deepfakes community might end up eating itself from within. It wasn't long before many of the sites that replaced the Reddit forum were revealed to be crypto-scamming operations that hijacked visitors' computers, or just plain old malware-delivery vehicles.
It's almost as if the kind of person who trades in involuntary pornography might not have everyone's best interests at heart. Imagine that.