Policy Alert: New Law Addresses Deepfake Abuse

CED Newsletters & Policy Alerts

Timely Public Policy insights for what's ahead


May 22, 2025

Action: The President signed S.146, the Tools to Address Known Exploitation by Immobilizing Deepfakes on Websites and Networks Act (the TAKE IT DOWN Act), which makes the online posting of non-consensual intimate images (NCII), whether authentic or AI-generated, a Federal crime. Many states already have such laws, but until now there was no Federal criminal provision. The law requires covered online platforms to remove these depictions promptly upon notice and to establish a process through which victims can report the material and request its removal.

Key Insights

  • Large technology companies have faced mounting pressure in recent years from digital safety experts, women’s rights activists, and legal scholars over the increase in NCII postings online. A July 2024 review by Meta’s Oversight Board found that the company’s policies were not explicitly clear in covering “the array of media manipulation techniques available today, especially generative AI.”
  • In April, the House passed the bipartisan TAKE IT DOWN Act nearly unanimously, 409-2. The Senate version of the bill, sponsored by Senators Ted Cruz (R-TX) and Amy Klobuchar (D-MN), was passed in February. The bill was championed by First Lady Melania Trump, who called the legislation a “national victory” that would help protect children against online exploitation.
  • The legislation received support from more than 120 organizations, including victim advocacy groups, law enforcement, and tech companies. Meta, Google, Microsoft, and Snap were among the tech giants to express support for the bill.
  • The bill does not implicate Section 230 of the Communications Decency Act, a 1996 law that protects platforms from civil liability for what is posted to them, according to Sunny Gandhi, vice president of political affairs at Encode, an AI-focused advocacy group that supported the bill. Instead, the Federal Trade Commission will enforce the law using its powers against “deceptive and unfair trade practices.”
  • In particular, the law expands legal remedies for victims of NCII sharing by giving the FTC authority to compel platforms to adhere to content removal and appeal procedures.
  • Critics of the new law say it is too broad in scope and could potentially lead to the censorship of legitimate images. The Cyber Civil Rights Initiative, a nonprofit that helps victims of online crimes and abuse, said in a statement that the takedown provisions remain “unconstitutionally vague, unconstitutionally overbroad, and lacking adequate safeguards against misuse.”
