National Coalition Against Deepfake Abuse
Don't Let An Algorithm Wear Your Face
NCADA tracks legislation, documents school incidents, and builds the tools and coalitions needed to protect victims of non-consensual deepfake imagery.
TAKE IT DOWN Act — Enforcement Deadline
Platforms must remove reported content within 48 hours or face FTC action.
Key figures tracked: states with deepfake laws · NCMEC reports (H1 2025) · documented incidents · share of targets who are women and girls
Partnered with & recognized by
UNODC (United Nations)
Center for AI & Digital Policy (AI Governance)
Reality Defender (Detection Technology)
Binghamton University (Hashbank R&D Partner)
U.S. Women’s Caucus (Model Legislation)
Five Rights Foundation (Children’s Rights)
SDG AI Platform (UN Sustainable Dev.)
NY Cyber Task Force (Cybersecurity Policy)
NJ Div. of Civil Rights (State Enforcement)
Understanding the Crisis
This is not a hypothetical threat
Non-consensual deepfake imagery is being created and distributed right now — in schools, workplaces, and online communities across every state in the country. Advances in generative AI have made it possible for anyone with a phone to produce realistic fake intimate images of real people. The targets are overwhelmingly women and girls.
Reports of AI-generated child sexual abuse material submitted to the National Center for Missing and Exploited Children surged from 4,700 in 2023 to over 440,000 in just the first half of 2025. Research consistently shows that approximately 99% of all deepfake pornographic content targets women and girls.
Victims describe lasting psychological trauma — anxiety, depression, social withdrawal, and in some cases suicidal ideation. For students, the harm follows them through their education and into their adult lives. Many discover that content resurfaces on new platforms years after initial takedowns.
Federal Legislation
The TAKE IT DOWN Act
Signed into law May 19, 2025 — the first federal law to criminalize non-consensual deepfake imagery.
Adult penalties: up to 2 years' imprisonment
Minor victims: up to 3 years' imprisonment
Platform compliance deadline: May 19, 2026
The legislation criminalizes the knowing distribution of non-consensual intimate imagery, including AI-generated content. It establishes platform accountability by requiring companies to remove reported content within 48 hours. The Federal Trade Commission oversees enforcement.
While the TAKE IT DOWN Act is an important first step, effective implementation requires robust enforcement, adequate FTC funding, and continued state-level action. NCADA is actively monitoring platform compliance as the May 2026 deadline approaches.
Our Work
Track. Document. Build. Advocate.
Legislative Tracking
50-state legislation database with bill-level detail, updated continuously.
Law Tracker →
Incident Database
Verified deepfake abuse cases in American schools, mapped and documented.
Incident Tracker →
Daily Intelligence
70+ sources aggregated daily — legislation, incidents, tech, and advocacy.
Daily Briefing →
Key Initiatives
Hashbank
In partnership with Dr. Yu Chen at Binghamton University's Watson College of Engineering and Applied Science, NCADA is building the only perceptual hashing infrastructure dedicated to deepfake imagery, modeled on Microsoft PhotoDNA, and is deploying it across Reddit, Discord, and Telegram.
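The core idea behind perceptual hashing can be sketched in a few lines. The toy difference-hash below is purely illustrative and assumes nothing about Hashbank's actual design: unlike a cryptographic hash, a perceptual hash changes only slightly when an image is re-encoded or lightly edited, so known abusive images can be matched by hash distance without storing the images themselves.

```python
# Illustrative difference-hash ("dHash") sketch. Real systems like
# PhotoDNA use far more robust transforms; this only shows the concept.

def dhash(pixels):
    """Hash a 2D grid of grayscale values (rows of equal length).

    Each bit records whether a pixel is brighter than its right-hand
    neighbor, so small edits flip only a few bits of the hash.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

original = [[10, 20, 30, 40],
            [40, 30, 20, 10],
            [10, 30, 20, 40]]
# A lightly altered copy (two pixels nudged in the middle row):
altered  = [[10, 20, 30, 40],
            [40, 29, 30, 10],
            [10, 30, 20, 40]]

h1, h2 = dhash(original), dhash(altered)
print(hamming(h1, h2))  # → 1 (near-duplicate: only one bit differs)
```

Matching then reduces to checking whether the Hamming distance between a reported image's hash and a known hash falls below a threshold, which is what lets platforms block re-uploads of previously removed content.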
United Nations Engagement
Co-hosting a UNODC side event at the United Nations on technology-facilitated abuse and cybercrime. NCADA's policy proposal has been received and circulated across UN agencies.
Model Legislation
NCADA's policy framework has been cited by the U.S. Women's Caucus as a model and has informed 25+ state bills addressing non-consensual deepfake imagery.
Platform Accountability Scorecard
Tracking how major platforms comply with the TAKE IT DOWN Act's 48-hour removal requirement ahead of the May 2026 enforcement deadline.
Why This Matters
Behind every statistic is a real person
Victims of non-consensual deepfake imagery frequently report experiencing anxiety, depression, social withdrawal, and lasting psychological harm. The violation of having your likeness weaponized without consent creates a form of trauma that researchers are only beginning to understand.
For students, the impact is especially severe. Academic performance suffers. Social development is disrupted. The content follows them across platforms for years. And without strong legal frameworks, survivors are left with few options for recourse.
NCADA works directly with survivors and advocacy organizations to ensure that policy solutions center the lived experiences of those most affected. Effective legislation must include accessible reporting, rapid content removal, and meaningful avenues for legal action against both perpetrators and negligent platforms.
Know your state's laws
Deepfake legislation varies dramatically. Some states have comprehensive protections. Others have none. Use our interactive tracker to understand your rights.
Open Law Tracker