The White House has said explicit doctored images of singer Taylor Swift circulating on social media are “very alarming” and urged Congress to take legislative action against the phenomenon.
The sexually explicit images depicting the pop star, which appeared to have been generated by artificial intelligence, were widely shared on social media on Thursday in one of the first cases of so-called deepfake pornography going viral.
At a news briefing on Friday, Karine Jean-Pierre, the White House press secretary, said: “This is very alarming. And so, we’re going to do what we can to deal with this issue.”
Jean-Pierre said Congress should take legislative action on the issue, adding that social media companies have an important role to play in enforcing their own rules to prevent the spread of such misinformation.
The rise of sophisticated AI image-generation tools, which can produce realistic images of celebrities and landscapes, has raised concerns that they will be used for non-consensual pornography. Sharing sexually explicit deepfake images is illegal in the UK under the newly introduced Online Safety Act.