Sen. Young & colleagues unveil bill to protect victims of deepfake revenge porn

On Tuesday, U.S. Senators Todd Young (R-Ind.), Ted Cruz (R-Texas), and a bipartisan group of senators introduced the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act to protect and empower victims of non-consensual intimate image abuse – also known as “revenge pornography.”

The bill would criminalize the publication of non-consensual intimate imagery (NCII), including AI-generated NCII (or “deepfake pornography”), and require social media and similar websites to have in place procedures to remove such content upon notification from a victim.

The bipartisan legislation is co-sponsored by Senators Amy Klobuchar (D-Minn.), Cynthia Lummis (R-Wyo.), Richard Blumenthal (D-Conn.), Shelley Moore Capito (R-W.Va.), Jacky Rosen (D-Nev.), Ted Budd (R-N.C.), Laphonza Butler (D-Calif.), Joe Manchin (I-W.Va.), John Hickenlooper (D-Colo.), Bill Cassidy (R-La.), and Martin Heinrich (D-N.M.).

The internet is awash in NCII, driven in large part by new generative artificial intelligence tools that can create lifelike, but fake, NCII depicting real people – also known as “deepfakes.” Disturbingly, this trend is increasingly affecting minors. A number of high-profile cases have involved young girls targeted by their classmates with deepfake NCII. Up to 95 percent of all internet deepfake videos depict NCII, with the vast majority targeting women and girls. The spread of these images – possibly in perpetuity if allowed to remain online – has a profoundly traumatic impact on victims.

By requiring websites that host user-generated content, including social media sites, to implement a notice and takedown process, the TAKE IT DOWN Act will ensure that, if such content is published online, victims are protected from being retraumatized again and again.


“We are increasingly seeing instances where generative AI is used to create exploitative images of an individual based on a clothed image,” Sen. Young said. “This bipartisan bill builds on existing federal law to protect Americans, particularly young women, from harmful deepfakes and establishes a requirement for websites to take down this type of explicit and disturbing material. This is a sensible step to protect Americans and establish appropriate guardrails.”

While nearly every state has a law protecting people from NCII, including Indiana and 19 other states with laws explicitly covering deepfake NCII, these state laws vary in how they classify the crime and in the penalties they impose, and criminal prosecution under them is uneven. Further, victims struggle to have images depicting them removed from websites, increasing the likelihood that the images are continuously spread and victims are retraumatized.

In 2022, Congress passed legislation creating a civil cause of action for victims to sue individuals responsible for publishing NCII. However, bringing a civil action can be incredibly impractical. It is time-consuming, expensive, and may force victims to relive trauma. Further exacerbating the problem, it is not always clear who is responsible for publishing the NCII.

The TAKE IT DOWN Act would protect and empower victims of real and deepfake NCII while respecting speech by:

  • Criminalizing the publication of NCII in interstate commerce. The bill makes it unlawful for a person to knowingly publish NCII on social media and other online platforms. NCII is defined to include realistic, computer-generated pornographic images and videos that depict identifiable, real people. The bill also clarifies that a victim consenting to the creation of an authentic image does not mean that the victim has consented to its publication.
  • Protecting good faith efforts to assist victims. The bill permits the good faith disclosure of NCII, such as to law enforcement, in narrow cases.
  • Requiring websites to take down NCII upon notice from the victim. Social media and other websites would be required to have in place procedures to remove NCII, pursuant to a valid request from a victim, within 48 hours. Websites must also make reasonable efforts to remove copies of the images. The FTC is charged with enforcement of this section.
  • Protecting lawful speech. The bill is narrowly tailored to criminalize knowingly publishing NCII without chilling lawful speech. The bill conforms to current First Amendment jurisprudence by requiring that computer-generated NCII meet a “reasonable person” test for appearing to realistically depict an individual.
