The bill, supported by Melania Trump and Ted Cruz, mandates removal of non-consensual intimate imagery, including AI-generated deepfakes, within 48 hours.
Advocates praise the bill, but some worry about potential misuse and impacts on free expression.
The US House of Representatives today voted overwhelmingly to pass a bill aimed at cracking down on the posting of sexual images and videos of people online without their consent, including AI-generated “deepfake” nudes of real people.
The bipartisan Take It Down Act, which the Senate passed unanimously in February, now heads to the desk of President Donald Trump, who is expected to sign it into law. The bill makes it a federal crime to publish non-consensual intimate imagery, or NCII, of any person and requires online platforms to remove such imagery within 48 hours when someone reports it.
That would make it the first significant internet law of Trump’s second term and the first US law to take aim at the fast-growing problem of NCII. The bill’s passage delighted many advocates for survivors and victims of revenge porn and “sextortion” scams, while some free expression and privacy advocates said they worry it will be abused.
The legislation’s passage by a vote of 409-2 marks a victory for first lady Melania Trump, who has championed the bill as part of her “Be Best” campaign against cyberbullying. The President indicated in March that he plans to sign it – and quipped that it is a personal boon, “because nobody gets treated worse than I do online”.
Republican support for the bill was galvanised when the first lady and Senator Ted Cruz (R-Texas), who co-authored the Senate’s version of the bill with Senator Amy Klobuchar (D-Minnesota), held a Capitol Hill roundtable in March with advocates who spoke about their personal experiences with NCII. They included Elliston Berry, who was 14 when a male classmate used an artificial intelligence app to create fake pornographic images of her and posted them to Snapchat. Another speaker, South Carolina state Representative Brandon Guffey (R), recounted that his 17-year-old son killed himself in 2022 after a sextortion scammer enticed him to send nude images and then blackmailed him.
“Today’s bipartisan passage of the Take It Down Act is a powerful statement that we stand united in protecting the dignity, privacy, and safety of our children,” Melania Trump said in a statement.
Hundreds of AI “undress” apps that can forge images of real people in seconds have proliferated across the internet in recent years, harnessing the same wave of technology that has powered image-generation tools such as Dall-E and Midjourney. Some of those apps advertise on mainstream social networks such as Meta’s Instagram, despite violating those platforms’ rules.
Among the most common targets are female celebrities, including singer Taylor Swift and comedian Bobbi Althoff, both of whom were the subject of sexually explicit AI fakes that went viral on Elon Musk’s social network X in 2024. The imagery is also often used to harass, intimidate or embarrass young women and teens. Those victimised have described their efforts to get non-consensual nudes scrubbed from the internet as a nightmarish game of whack-a-mole.
“Deepfakes are creating horrifying new opportunities for abuse,” Klobuchar said in a statement today. “These images can ruin lives and reputations, but now that our bipartisan legislation is becoming law, victims will be able to have this material removed from social media platforms and law enforcement can hold perpetrators accountable.”
Unlike several previous attempts to regulate social media harms, the act gained the support of some leading tech companies, including Meta, Google and Snap, clearing its path toward passage.
The House version, co-sponsored by Representatives María Elvira Salazar (R-Florida) and Madeleine Dean (D-Pennsylvania), passed today under a process known as “suspension of the rules”, in which bills that are not expected to be controversial can pass without debate with a two-thirds supermajority.
The act has its critics, however. Among them is Lia Holland, legislative director of the digital rights group Fight for the Future, who called it “well-intentioned but poorly drafted”.
Comparing the bill to the 1998 Digital Millennium Copyright Act, which requires online platforms to remove copyrighted material whenever someone declares it is being illegally used, Holland predicted that bad actors will use Take It Down to scrub from the internet legitimate content they dislike.
“If only Senator Cruz or anyone on the Hill had taken the time to make a few minor corrections, this would be a great bill,” Holland said.
Becca Branum, director of the nonprofit Center for Democracy and Technology’s Free Expression Project, said the bill’s reliance on Trump’s “partisan” Federal Trade Commission raised concerns of “weaponised enforcement” for political ends.
Several Democratic lawmakers proposed amendments to the bill in a mark-up by the House Energy and Commerce Committee earlier this month, but the committee’s Republican majority voted them down. Meanwhile, some privacy advocates’ concerns that the bill could affect private messaging apps were eased when backers clarified that it is only intended to apply to public-facing online forums.
The bill also had supporters on the left, including several who spoke at a news conference today convened by the advocacy group Americans for Responsible Innovation. They included Fordham University law professor Zephyr Teachout, who said that unlike some past efforts to regulate social media, the Take It Down Act is well-crafted to survive First Amendment challenges in court.
Tim Wu, a Columbia Law School professor who advised former President Joe Biden on tech and antitrust regulation, also backed the bill, saying at the news conference that he hoped this is only the beginning of Congress taking a more active role in addressing social media’s harms.