“Take It Down” is an acronym for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks. The law is an attempt to mitigate the spread of nonconsensual intimate imagery and deepfake content generated with artificial intelligence (AI).
What the Law Does
The law makes it a crime to publish nonconsensual intimate imagery, sometimes called revenge porn, or AI-generated equivalents, which have become a prevalent problem as technology that can quickly create such materials has spread. AI-generated content now circulating on the internet includes videos that, while fake, often appear deceptively authentic and use the image of a real person.
The recently signed law gives online platforms a maximum of 48 hours to remove explicit content after the identifiable individual depicted in the materials submits a request.
Anyone convicted of intentionally distributing explicit images without the consent of the person whose likeness is used could face prison time.
First lady Melania Trump offered her appreciation for the bill’s passage at the time of the signing and warned about the dangers that children face online.
The Law’s History
The legislation was introduced by Sens. Ted Cruz (R-Texas) and Amy Klobuchar (D-Minn.) on Jan. 16 and picked up another boost of momentum because of Melania Trump’s public support. The first lady went as far as to tie it to her Be Best initiative, a campaign focused on the well-being of children.
Rep. Maria Salazar (R-Fla.), the bill’s sponsor in the House, said it would help to prevent cyberbullying and “suicide born out of shame.”
“It is outrageously sick to use images, the face, the voice, the likeness of a young, vulnerable female, to manipulate them, to extort them, and to humiliate them publicly, just for fun, just for revenge,” Salazar said on the House floor ahead of the vote.
Previous Immunity
Online platforms have long enjoyed immunity from repercussions of user-generated content because of Section 230 of the Communications Decency Act, which provides general immunity for computer services with regard to third-party content.

Killion pointed to the possibility that a service provider might argue it would still enjoy protection under Section 230 if it were prosecuted under the Take It Down Act, and there is some debate as to whether the law is, in essence, a rollback of part of those protections.
“[This is] a question that a court may resolve as a matter of statutory interpretation ... whether the two sets of provisions can be harmonized in a way that gives effect to each,” Killion said.
In one such case, the plaintiffs alleged that the platforms were strictly liable for failing to remove dangerous videos after notice was given, conveying harrowing stories of parents who lost children who had participated in viral challenges.
Outsiders’ Takes
Not everyone was in favor of the legislation; Rep. Thomas Massie (R-Ky.) announced on the day of the House vote that he would not support it.

“The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests,” Kelley said.
“The law’s tight time frame requires that apps and websites remove speech within 48 hours, rarely enough time to verify whether the speech is actually illegal. As a result, online service providers, particularly smaller ones, will likely choose to avoid the onerous legal risk by simply depublishing the speech rather than even attempting to verify it.”
Meta, parent company of both Facebook and Instagram, came out in support of the legislation, with spokesman Andy Stone saying in March 2025 that “having an intimate image—real or AI-generated—shared without consent can be devastating and Meta developed and backs many efforts to help prevent it.”
The National Center on Sexual Exploitation lauded the bill’s passage, saying the bill becoming law is “a testament” to what people can achieve when they “overcome political differences and work together to end sexual exploitation.”