New Legislation Targets Nonconsensual Deepfakes
US President Donald Trump has signed into law the TAKE IT DOWN Act, a bill that criminalizes the publication of nonconsensual intimate images, including artificial intelligence-generated deepfake pornography. The law, backed by First Lady Melania Trump, aims to protect individuals from harmful and harassing content.
The legislation makes it a federal crime to publish or threaten to publish nonconsensual intimate images, including deepfakes, of both adults and minors with the intent to harm or harass. Penalties for violating this law range from fines to imprisonment. Websites and online services are required to remove illegal content within 48 hours of notification and establish a formal takedown process.
During the signing ceremony at the White House Rose Garden, Trump emphasized that the law covers “forgeries generated by an artificial intelligence,” commonly known as deepfakes. Melania Trump, who actively lobbied lawmakers to support the bill, hailed the legislation as a “national victory.”
“Artificial Intelligence and social media are the digital candy of the next generation — sweet, addictive, and engineered to have an impact on the cognitive development of our children,” Melania Trump stated. “But unlike sugar, these new technologies can be weaponized, shape beliefs, and sadly, affect emotions and even be deadly.”
The bill was introduced by Senators Ted Cruz and Amy Klobuchar in June 2024 and passed both houses of Congress in April of this year. The move brings the US in line with countries such as the UK, which criminalized the sharing of deepfake pornography under its Online Safety Act in 2023.
The need for such legislation is underscored by a growing number of cases in which deepfakes have been used for harmful purposes. A notable instance was the rapid spread of sexually explicit deepfake images of pop star Taylor Swift on the platform X in January 2024, which prompted the platform to temporarily block searches for her name. Lawmakers subsequently pushed for legislation criminalizing the production of such images.
According to a 2023 report by Security Hero, a security startup, the vast majority of deepfakes posted online are pornographic, and 99% of the individuals targeted by such content are women.