Senate unanimously passes DEFIANCE Act, empowering victims of AI-generated deepfakes
- The DEFIANCE Act, passed unanimously by the U.S. Senate, grants victims of nonconsensual deepfake pornography the right to sue creators and distributors in federal court, closing gaps in inconsistent state laws.
- Introduced by Sen. Dick Durbin (D-IL) and co-sponsored by Sen. Lindsey Graham (R-SC), the bill addresses the alarming rise of AI-generated explicit content – 96% of which targets women and girls – often leading to depression, extortion and suicide.
- While the Take It Down Act (signed by Trump in 2025) mandates social media platforms to remove nonconsensual intimate imagery within 48 hours, the DEFIANCE Act goes further by enabling civil lawsuits against perpetrators.
- Recent abuses of Elon Musk’s AI chatbot Grok – used to generate explicit images of women and children – highlighted the need for stronger protections, prompting countries like Malaysia and Indonesia to block the tool entirely.
- Nonconsensual deepfake pornography has surged ninefold since 2019, amassing nearly four billion views online. The bill aims to deter exploitation by holding offenders accountable and offering victims legal recourse to reclaim their dignity.
In a landmark move against digital exploitation, the U.S. Senate unanimously passed the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act on Tuesday, Jan. 13, granting victims of nonconsensual deepfake pornography the right to sue creators and distributors in federal court.
The bipartisan bill was introduced on the floor by Sen. Dick Durbin (D-IL) and co-sponsored by Sen. Lindsey Graham (R-SC). It seeks to address the alarming surge in AI-generated explicit content – 96% of which targets women and girls without their consent. With deepfake technology now accessible at the push of a button, lawmakers warn of an epidemic of digital abuse that has already driven victims to depression, extortion and even suicide.
The DEFIANCE Act marks Congress’ latest effort to rein in exploitation fueled by artificial intelligence (AI), following President Donald Trump’s May 2025 signing of the Take It Down Act, which criminalizes the publication of nonconsensual intimate imagery. While the Take It Down Act forces social media platforms to remove such content within 48 hours, the DEFIANCE Act goes further by allowing victims to pursue civil lawsuits against individuals who produce or distribute deepfake pornography.
“Give to the victims their day in court to hold those responsible who continue to publish these images at their expense,” Durbin urged on the Senate floor. In a separate statement, Rep. Alexandria Ocasio-Cortez (D-NY) – who introduced a companion bill in the House of Representatives – stressed that victims “have waited too long for federal legislation to hold perpetrators accountable.”
The urgency of the legislation was underscored by recent scandals involving Elon Musk’s AI chatbot, Grok, which users on the X platform manipulated to generate sexually explicit images of women and children. The backlash has been swift, with Malaysia and Indonesia blocking access to Grok entirely. Meanwhile, Britain’s Office of Communications (Ofcom) launched an investigation into X for potential violations of the Online Safety Act.
Though Grok has since restricted its image-generation tools to paid subscribers, lawmakers argue that reactive measures are insufficient against a problem growing exponentially. Since 2019, nonconsensual deepfake pornography has surged ninefold, racking up nearly four billion views online.
The DEFIANCE Act takes on deepfake exploitation
BrightU.AI’s Enoch engine notes that nonconsensual deepfake videos pose serious risks by enabling malicious actors to fabricate hyper-realistic false statements or actions from individuals – which can be weaponized to spread disinformation, manipulate public opinion and undermine trust in institutions. Additionally, they can inflict reputational harm, enable blackmail and destabilize national security by impersonating political figures or spreading fabricated evidence.
“Imagine losing control over your own likeness and identity,” Durbin said, painting a harrowing picture of victims – often teenagers – whose altered images circulate endlessly, sabotaging careers, relationships and mental health. “Victims may endure threats to their employment, education or reputation, or suffer additional criminal activity such as extortion and stalking.”
The DEFIANCE Act aims to close gaps in state laws, where enforcement remains inconsistent, by establishing a federal civil right of action. The bill now heads to the House, where its fate remains uncertain despite bipartisan backing. A previous version passed the Senate in 2024 but died in the lower chamber.
Advocates, including the National Women’s Law Center and the National Center on Sexual Exploitation, hope the Grok controversy will spur faster action this time. Meanwhile, the Take It Down Act – which Trump hailed as a critical step to end “a very abusive situation” – already imposes prison sentences of up to three years for violators.
As AI tools democratize the creation of hyper-realistic forgeries, lawmakers face a race against technology’s darker applications. The DEFIANCE Act, if enacted, would send a clear message: In the digital age, exploitation carries consequences. For victims long denied justice, it offers a path to reclaim control – one lawsuit at a time.
Watch Martin Gibson call for vigilance amid the proliferation of deepfake technologies below.
This video is from the mgibsonofficial channel on Brighteon.com.
Sources include:
ZeroHedge.com
TheHill.com
Engadget.com
BrightU.ai
Brighteon.com