
Hickenlooper Cosponsors Bill to Crack Down on AI Deepfakes, Protect Minors

Jun 21, 2024

96% of online AI “deepfake” videos feature non-consensual pornographic content

TAKE IT DOWN Act would criminalize non-consensual publication of sexually explicit AI “deepfake” images


WASHINGTON – U.S. Senator John Hickenlooper joined a bipartisan group of Senate colleagues to introduce the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act, which would criminalize the publication of non-consensual intimate imagery (NCII), including AI-generated “deepfake pornography,” on social media and other online sites, and require social media companies to have procedures to remove such content upon notification from a victim.

“AI innovation is going to change so much about our world, but it can’t come at the cost of our children’s privacy and safety,” said Hickenlooper. “We have a narrow window to get out in front of this technology. We can’t miss it.”

New generative artificial intelligence tools can create lifelike but fake imagery depicting real people, known as deepfakes. A 2019 report by Sensity found that non-consensual deepfake pornography accounted for 96% of all deepfake videos online. Deepfakes have recently been used to target minors, including incidents in which classmates used AI tools to create sexually explicit images of other students and then shared them on social media.

The TAKE IT DOWN Act protects Americans by making it unlawful for a person to knowingly publish sexually explicit deepfake images of an identifiable individual, and by requiring social media companies and websites to remove the images.

Specifically, the TAKE IT DOWN Act would: 

  • Criminalize the publication of NCII: The bill makes it unlawful for a person to knowingly publish NCII on social media and other online platforms. NCII is defined to include realistic, computer-generated pornographic images and videos that depict identifiable, real people. The bill also clarifies that a victim consenting to the creation of an authentic image does not mean that the victim has consented to its publication.

  • Protect good faith efforts to assist victims: The bill permits the good faith disclosure of NCII, such as to law enforcement, in narrow cases.

  • Require websites to take down NCII upon notice from the victim: Social media and other websites would be required to have in place procedures to remove NCII, pursuant to a valid request from a victim, within 48 hours. Websites must also make reasonable efforts to remove copies of the images. The FTC is charged with enforcement of this section.

  • Protect lawful speech: The bill is narrowly tailored to criminalize knowingly publishing NCII without barring lawful speech. The bill respects First Amendment protections by requiring that computer-generated NCII meet a “reasonable person” test, meaning it must appear to realistically depict an individual.


Full text of the bill is available HERE.

###
