Jan. 13 (UPI) — The U.S. Senate on Tuesday passed a bill that would allow victims of deepfake pornography to sue.
The bill, called “The Disrupt Explicit Forged Images and Non-Consensual Edits Act,” passed unanimously. If signed into law, it would give victims of the growing problem of deepfake pornography legal recourse against those who produce and distribute such content without their consent.
The bill has not advanced in the U.S. House since being referred to the Committee on the Judiciary in March.
Democratic Whip Dick Durbin, D-Ill., introduced the bill and said he worked with Rep. Alexandria Ocasio-Cortez, D-N.Y., on addressing the issue. Ocasio-Cortez is among the bill’s bipartisan group of sponsors.
“Give to the victims their day in court to hold those responsible who continue to publish these images at their expense,” Durbin said in the Senate chamber on Tuesday. “Today, we are one step closer to making this a reality.”
Deepfake pornography has grown roughly ninefold online since 2019. It has drawn renewed attention recently as Elon Musk’s AI chatbot Grok has been generating and posting sexually explicit images of women and children on X in response to user prompts.
On Thursday, X changed its policy to allow only paid users to use Grok.
Malaysia and Indonesia became the first countries to block Grok over the weekend, and the Malaysian Communications and Multimedia Commission said in a statement that the change by X does not go far enough.
