The video showed the woman in a pink off-the-shoulder top, sitting on a bed, smiling a convincing smile.
It was her face. But it had been seamlessly grafted, without her knowledge or consent, onto someone else's body: a young pornography actress, just beginning to disrobe for the start of a graphic sex scene. A crowd of unknown users had been passing it around online.
She felt nauseated and mortified: What if her co-workers saw it? Her family, her friends? Would it change how they thought of her? Would they believe it was a fake?
"I feel violated - this icky kind of violation," said the woman, who is in her 40s and spoke on the condition of anonymity because she worried that the video could hurt her marriage or career. "It's this weird feeling, like you want to tear everything off the internet. But you know you can't."
Airbrushing and Photoshop long ago opened photos to easy manipulation. Now, videos are becoming just as vulnerable to fakes that look deceptively real. Supercharged by powerful and widely available artificial-intelligence software developed by Google, these lifelike "deepfake" videos have quickly multiplied across the internet, blurring the line between truth and lie.
But the videos have also been weaponised disproportionately against women, representing a new and degrading means of humiliation, harassment and abuse. The fakes are explicitly detailed, posted on popular porn sites and increasingly challenging to detect. And although their legality hasn't been tested in court, experts say they may be protected by the First Amendment - even though they might also qualify as defamation, identity theft or fraud.
Disturbingly realistic fakes have been made with the faces of both celebrities and women who don't live in the spotlight, and the actress Scarlett Johansson says she worries that "it's just a matter of time before any one person is targeted" by a lurid forgery.
Johansson has been superimposed into dozens of graphic sex scenes over the past year that have circulated across the web: One video, falsely described as real "leaked" footage, has been watched on a major porn site more than 1.5 million times. She said she worries it may already be too late for women and children to protect themselves against the "virtually lawless (online) abyss."
"Nothing can stop someone from cutting and pasting my image or anyone else's onto a different body and making it look as eerily realistic as desired," she said. "The fact is that trying to protect yourself from the internet and its depravity is basically a lost cause. ... The internet is a vast wormhole of darkness that eats itself."
In September, Google added "involuntary synthetic pornographic imagery" to its ban list, allowing anyone to request the search engine block results that falsely depict them as "nude or in a sexually explicit situation." But there's no easy fix for the videos' creation and spread.
A growing number of deepfakes target women far from the public eye, with anonymous users on deepfakes discussion boards and private chats calling them co-workers, classmates and friends. Several users who make videos by request said there's even a going rate: about $20 per fake.
The requester of the video showing the woman's face on the body in the pink off-the-shoulder top had included 491 photos of her face, many taken from her Facebook account, and told other members of the deepfake site that he was "willing to pay for good work :-)." A Washington Post reporter later found her by running those portraits through a reverse-image search, an online tool that can locate where a photo was originally shared.
It had taken two days after the request for a team of self-labelled "creators" to deliver. A faceless online audience celebrated the effort. "Nice start!" the requester wrote.
"It's like an assault: the sense of power, the control," said Adam Dodge, the legal director of Laura's House, a domestic-violence shelter in California. Dodge hosted a training session last month for detectives and sheriff's deputies on how deepfakes could be used by an abusive partner or spouse. "With the ability to manufacture pornography, everybody is a potential target," Dodge said.
Videos have for decades served as a benchmark for authenticity, offering a clear distinction from photos that could be easily distorted. Fake video, for everyone except high-level artists and film studios, has always been too technically complicated to get right.
But recent breakthroughs in machine-learning technology, employed by creators racing to refine and perfect their fakes, have made fake-video creation more accessible than ever. All that's needed to make a persuasive mimicry within a matter of hours is a computer and a robust collection of photos, such as those posted by the millions onto social media every day.
The result is a fearsome new way for faceless strangers to inflict embarrassment, distress or shame. "If you were the worst misogynist in the world," said Mary Anne Franks, a University of Miami law professor and the president of the Cyber Civil Rights Initiative, "this technology would allow you to accomplish whatever you wanted."
Men are inserted into the videos almost entirely as a joke: A popular imitation shows the actor Nicolas Cage's face superimposed onto President Donald Trump's. But the fake videos of women are predominantly pornographic, exposing how the sexual objectification of women is being emboldened by the same style of AI technology that could underpin the future of the web.
The media critic Anita Sarkeesian, who has been assailed online for her feminist critiques of pop culture and video games, was inserted into a hardcore porn video this year that has been viewed more than 30,000 times on the adult-video site Pornhub.
On deepfake forums, anonymous posters said they were excited to confront her with the video in her Twitter and email accounts, and shared her contact information and suggestions on how they could ensure the video was easily accessible and impossible to remove.
One user on the social-networking site Voat, who goes by "Its-Okay-To-Be-White," wrote, "Now THIS is the deepfake we need and deserve, if for no other reason than (principle)." Another user, "Hypercyberpastelgoth," wrote, "She attacked us first. ... She just had to open up her smarmy mouth."
Sarkeesian said the deepfakes were more proof of "how terrible and awful it is to be a woman on the internet, where there are all these men who feel entitled to women's bodies."
"For folks who don't have a high profile, or don't have any profile at all, this can hurt your job prospects, your interpersonal relationships, your reputation, your mental health," Sarkeesian said. "It's used as a weapon to silence women, degrade women, show power over women, reducing us to sex objects. This isn't just a fun-and-games thing. This can destroy lives."
'More vulnerable'
The AI approach that spawned deepfakes began with a simple idea: Two opposing groups of deep-learning algorithms create, refine and re-create an increasingly sophisticated result. A team led by Ian Goodfellow, now a research scientist at Google, introduced the idea in 2014 by comparing it to the duel between counterfeiters and the police, with both sides driven "to improve their methods until the counterfeits are indistinguishable."
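In rough terms, that duel can be sketched in a few dozen lines of modern machine-learning code. The toy example below, written against the TensorFlow library, pits a small generator network (the "counterfeiter") against a small discriminator (the "police"); the model sizes, data and settings are illustrative stand-ins chosen for this sketch, not taken from any actual deepfake tool.

```python
# A minimal sketch of adversarial training: two networks improve by
# competing. All shapes and hyperparameters here are illustrative.
import tensorflow as tf

# Generator: maps random noise to a fake "image" (flattened to a vector).
generator = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(64,)),
    tf.keras.layers.Dense(784, activation="tanh"),
])

# Discriminator: scores whether an input looks real or counterfeit.
discriminator = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(1),  # a single logit: real vs. fake
])

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt = tf.keras.optimizers.Adam(1e-4)
d_opt = tf.keras.optimizers.Adam(1e-4)

@tf.function
def train_step(real_images):
    noise = tf.random.normal([tf.shape(real_images)[0], 64])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fakes = generator(noise, training=True)
        real_logits = discriminator(real_images, training=True)
        fake_logits = discriminator(fakes, training=True)
        # The discriminator tries to label real as 1 and fake as 0 ...
        d_loss = (bce(tf.ones_like(real_logits), real_logits)
                  + bce(tf.zeros_like(fake_logits), fake_logits))
        # ... while the generator tries to make its fakes score as real.
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)
    d_opt.apply_gradients(zip(
        d_tape.gradient(d_loss, discriminator.trainable_variables),
        discriminator.trainable_variables))
    g_opt.apply_gradients(zip(
        g_tape.gradient(g_loss, generator.trainable_variables),
        generator.trainable_variables))

# Each round, both sides get better, until the counterfeits are hard
# to tell apart. (Random vectors stand in for a batch of real photos.)
for step in range(1000):
    train_step(tf.random.normal([32, 784]))
```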
The system automated the tedious and time-consuming drudgery of making a photorealistic face-swapping video: finding matching facial expressions, replacing them seamlessly and repeating the task 60 times a second. Many of the deepfake tools, built on Google's artificial-intelligence library, are publicly available and free to use.
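The frame-by-frame loop being automated is conceptually simple. The sketch below, using the open-source OpenCV library, shows its skeleton: find the face in each frame, blend a substitute over it, and move on. The Haar-cascade detector and single blending call are crude stand-ins for the neural networks and landmark alignment real deepfake tools use, and the file names are hypothetical.

```python
# A rough sketch of the per-frame drudgery the software automates.
import cv2
import numpy as np

# Basic face detector shipped with OpenCV (a stand-in for the much
# stronger detectors and expression-matching networks in real tools).
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
replacement = cv2.imread("replacement_face.jpg")  # hypothetical input

video = cv2.VideoCapture("scene.mp4")  # hypothetical input
swapped_frames = []
while True:
    ok, frame = video.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        face = cv2.resize(replacement, (w, h))
        mask = 255 * np.ones(face.shape, face.dtype)
        center = (x + w // 2, y + h // 2)
        # Poisson blending hides the seam between pasted face and frame.
        frame = cv2.seamlessClone(face, frame, mask, center,
                                  cv2.NORMAL_CLONE)
    swapped_frames.append(frame)
video.release()
# swapped_frames now holds the altered video, ready to be re-encoded.
```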
Last year, an anonymous creator using the online name "deepfakes" began using the software to create and publish face-swapped porn videos of actresses such as Gal Gadot onto the discussion-board giant Reddit, winning widespread attention and inspiring a wave of copycats.
The videos range widely in quality, and many are glitchy or obvious cons. But deepfake creators say the technology is improving rapidly and see no limit to whom they can impersonate.
While the deepfake process demands some technical know-how, an anonymous online community of creators has in recent months removed many of the hurdles for interested beginners, crafting how-to guides, offering tips and troubleshooting advice - and fulfilling fake-porn requests on their own.
To simplify the task, deepfake creators often compile vast bundles of facial images, called "facesets," and sex-scene videos of women they call "donor bodies." Some creators use software to automatically extract a woman's face from her videos and social-media posts. Others have experimented with voice-cloning software to generate potentially convincing audio.
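Building such a faceset can be almost entirely automated. A minimal sketch of that extraction step, again using OpenCV and hypothetical file names, scans a video, detects any face in each frame and saves the crop to disk:

```python
# A toy "faceset" extractor: every detected face becomes a training image.
import os
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
os.makedirs("faceset", exist_ok=True)

video = cv2.VideoCapture("tutorial_video.mp4")  # hypothetical input
count = 0
while True:
    ok, frame = video.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        cv2.imwrite(f"faceset/face_{count:05d}.jpg",
                    frame[y:y + h, x:x + w])
        count += 1
video.release()
print(f"Extracted {count} face crops")
```

Run over even a single long video, a loop like this yields the thousands of expressions and angles that face-swapping models train on.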
Not all fake videos targeting women rely on pornography for shock value or political points. This spring, a doctored video showed the Parkland school shooting survivor Emma González ripping up the Constitution. Conservative activists shared the video as supposed proof of her un-American treachery; in reality, the video showed her ripping up paper targets from a shooting range.
But deepfakes' use in porn has skyrocketed. One creator on the discussion board 8chan made an explicit four-minute deepfake featuring the face of a young German blogger who posts videos about makeup; thousands of images of her face had been extracted from a hair tutorial she had recorded in 2014.
Reddit and Pornhub banned the videos this year, but new alternatives quickly bloomed to replace them. Major online discussion boards such as 8chan and Voat, whose representatives didn't respond to requests for comment, operate their own deepfake forums, but the videos can also be found on stand-alone sites devoted to their spread.
The creator of one deepfakes site, who spoke on the condition of anonymity for fear of judgment, said his 10-month-old site receives more than 20,000 unique viewers every day and relies on advertising to make a modest profit. Celebrities are among the biggest draws for traffic, he said, adding that he believes their fame - and the wealth of available public imagery - has effectively made them fair game.
The only rules on the site, which hosts an active forum for personal requests, are that targets must be 18 or older and not depicted "in a negative way," including in scenes of graphic violence or rape. He added that the site "is only semi-moderated," and relies on its users to police themselves.
One deepfake creator using the name "Cerciusx," who said he is a 26-year-old American and spoke on the condition of anonymity because he is afraid of public backlash, said he rejects personal requests because they can too easily spread across a school campus or workplace and scar a person's life.
Many creators fulfil such requests, though, to make a woman appear "more vulnerable" or bring a dark fantasy to life. "Most guys never land their absolute dream girl," he said. "This is why deepfakes thrive."
In April, Rana Ayyub, an investigative journalist in India, was alerted by a source to a deepfake sex video that showed her face on a young woman's body. The video was spreading by the thousands across Facebook, Twitter and WhatsApp, sometimes attached to rape threats or alongside her home address.
Ayyub, 34, said she has endured online harassment for years. But the deepfake felt different: uniquely visceral, invasive and cruel. She threw up when she saw it, cried for days afterward and rushed to the hospital, overwhelmed with anxiety. At a police station, she said, officers refused to file a report, and she could see them smiling as they watched the fake.
"It did manage to break me. It was overwhelming. All I could think of was my character: Is this what people will think about me?" she said. "This is a lot more intimidating than a physical threat. This has a lasting impact on your mind. And there's nothing that could prevent it from happening to me again."
Identity theft
The victims of deepfakes have few tools to fight back. Legal experts say deepfakes are often too untraceable to investigate and exist in a legal grey area: Built on public photos, they are effectively new creations, meaning they could be protected as free speech.
Defenders are pursuing untested legal manoeuvres to crack down on what they're calling "non-consensual pornography," borrowing strategies employed against online harassment, cyberstalking and revenge porn. Lawyers said they could invoke harassment or defamation laws, or file restraining orders or takedown notices in cases where they knew enough about the deepfake creators' identity or tactics. In 2016, when a California man was accused of superimposing his ex-wife's face onto online porn images, prosecutors there tried an unconventional tactic, charging him with 11 counts of identity theft.
Danielle Citron, a University of Maryland law professor who has studied ways to combat online abuse, says the country is in desperate need of a more comprehensive criminal statute that would cover what she calls "invasions of sexual privacy and assassinations of character." "We need real deterrents," she said. "Otherwise, it's just a game of whack-a-mole."
Google representatives said the company takes its ethical responsibility seriously, but that restrictions on its AI tools could end up limiting developers pushing the technology in a positive way.
But Hany Farid, a Dartmouth College computer-science professor who specialises in examining manipulated photos and videos, said Google and other tech giants need "to get more serious about how weaponised this technology can be."
"If a biologist said, 'Here's a really cool virus; let's see what happens when the public gets their hands on it,' that would not be acceptable. And yet it's what Silicon Valley does all the time," he said. "It's indicative of a very immature industry. We have to understand the harm and slow down on how we deploy technology like this."
The few proposed solutions so far may accomplish little for the women who have been targeted, including the woman whose images were stolen by the requester "willing to pay for good work." After watching the video, she said she was livid and energised to pursue legal action.
But her efforts to find the requester have gone nowhere. He did not respond to messages, and his posts have since been deleted, his account vanishing without a trace. All that was left was the deepfake. It has been watched more than 400 times.