- Rep. Celeste Maloy cosponsors a bipartisan bill targeting nonconsensual deepfake pornography.
- The bill conditions platforms' liability protections on preventing stalking and abusive deepfakes.
- It clarifies Section 230 doesn't cover AI-generated content and mandates swift removal of harmful deepfakes.
SALT LAKE CITY — Utah Rep. Celeste Maloy is cosponsoring a bipartisan bill in Congress that aims to prevent the spread of nonconsensual deepfake pornography by conditioning online platforms' liability protections on their taking steps to prevent stalking and "abusive deepfakes."
The Deepfake Liability Act would also clarify that online protections for platforms established under Section 230 of the Communications Decency Act of 1996 don't apply to content generated by artificial intelligence, and would require platforms to take steps to prevent stalking, respond to victim reports and remove harmful content that violates privacy.
Deepfakes are videos or images that are manipulated using machine learning or artificial intelligence to create seemingly realistic images or sounds.
The idea for the bill came from Rep. Jake Auchincloss, D-Massachusetts.
"He knew that this was something that Utah has been really forward thinking on, and so he approached me to see if I would like to sponsor it with him, because he figured Utah was the kind of state where people care about making sure we don't use AI to ruin other people's reputations and especially protecting kids," Maloy, R-Utah, told KSL NewsRadio's "Inside Sources" Thursday.
Section 230 is frequently invoked in the debate over how social media platforms moderate the content shared online. The law protects platforms from liability over what users post. While supporters say the law protects free speech and expression online, critics contend it allows social media companies to avoid responsibility for harmful or abusive content that is shared on their platforms.
Asked if the platforms are doing enough to protect users — particularly women and teenage girls, who are the most frequent victims of deepfake pornography — online, Maloy said, "they don't have any reason to do enough" because of the immunities afforded under Section 230.
"They will still have Section 230 immunities, but under this bill, those immunities are predicated on their duty to act," she added. "So, they have to have a duty to be making sure they don't have deepfakes, revenge porn kind of stuff on their websites. And as long as they're doing that, they're still covered, but they have to make sure that they are not the platform that allows people to create deepfakes that ruin people's lives."
Maloy said the bill is meant to be narrowly tailored to crack down on harmful deepfake content without otherwise restricting what can be expressed online. The bill would allow private citizens to sue companies over illegal content and requires the timely removal of unlawful material.
The issue is also one that is being tackled by Utah Attorney General Derek Brown, who in August led a coalition of 47 attorneys general in writing to search engines and payment platforms calling for better protections against deepfake pornography.
"Utahns should not have to worry about predators stealing their images and creating deepfake pornography," Brown said at the time. "It is alarming how many high school students across the country have become victims of this exploitation and harassment."