Legg Middle School, Coldwater MI April 2, 2026
At least 4 identified female students
Keith Johnson, 58, a former woodshop teacher and soccer coach, was found to have used an AI application to alter images of clothed underage female students into nude photographs. This discovery was made during a prior voyeurism investigation.
Reported by Detroit News
xAI/Grok lawsuit, Tennessee TN March 16, 2026
Three Tennessee teenagers (and potential class members)
Three Tennessee teenagers filed a class action against xAI, alleging a perpetrator used a third-party app licensing the Grok Imagine API to create sexually explicit deepfake images and videos of them. The perpetrator was arrested in December 2025.
Reported by NPR
Wichita, KS March 10, 2026
1 adult male
Neal Zoglmann received threatening text messages containing AI-generated nude photos depicting him in rooms of his own home. Criminals scraped his social media photos and used AI to generate fake nudes, then demanded payment.
Reported by Wichita
Traverse City Central High School, Traverse City MI March 10, 2026
Multiple students (number undisclosed; under investigation)
A student allegedly used AI to create compromising nude images of fellow classmates. The school immediately contacted law enforcement, who are now investigating the incident.
Reported by 9&10 News (Northern Michigan)
Fairfax County Public Schools, Northern Virginia VA March 5, 2026
Multiple students (predominantly female)
FCPS School Board confirmed a disturbing trend of students using AI to create fake nude images of classmates across multiple D.C.-area school districts. Board members publicly discussed the need for consequences and education.
Reported by NBC4 Washington
Fort Dorchester High School, North Charleston SC March 5, 2026
5 underage female students (high school age)
AI-generated sexually explicit images depicting five underage female students were posted across multiple social media platforms, with each victim's name written across the altered images. Two underage suspects, a male and a female student, were identified, and devices were seized.
Reported by Live 5 News (WCSC)
Lancaster Country Day School, Lancaster PA March 2026
59 female students
Two 14-year-old boys at Lancaster Country Day School created approximately 350 AI-generated nude images of at least 59 female students. They pled guilty to 59 felony counts and received juvenile probation and restitution.
Reported by Fortune
Tennessee (class-action lawsuit vs xAI) TN March 2026
3 teenagers
Three teenagers filed a lawsuit against Elon Musk's xAI, alleging Grok tools were used to morph their real photos into sexually explicit images.
Reported by washingtonpost.com
Lake Zurich High School, Lake Zurich IL February 26, 2026
40+ girls from grades 5 through 12
Students used AI to generate sexually explicit pornographic images depicting 40+ female classmates and younger students, then distributed them among peers. Administration immediately contacted police, who opened a criminal investigation.
Reported by FOX 32 Chicago
Teaneck, NJ (police warning) NJ February 25, 2026
Not specified (general community warning)
Teaneck Police Chief Andrew McGurr issued a public letter warning that students may be using AI tools to generate explicit images of classmates. He granted an amnesty period through March 2, 2026, after which violators would face prosecution under the state's endangering-the-welfare-of-a-child statute.
Reported by Daily Voice
Plymouth-Canton Educational Park, Canton/Plymouth MI February 24, 2026
At least 8 women (former classmates)
Three former students conspired to hack into female classmates' Snapchat accounts, steal real intimate photos, and share them online. One bathing suit photo of a 17-year-old was digitally manipulated to appear nude, but the case was primarily a hacking and theft scheme.
Reported by ClickOnDetroit (WDIV)
Lebanon High School, Lebanon IN February 22, 2026
Multiple female students; at least one victim was 14–15 years old.
Joel Salinas, 18, a student, was arrested for posting AI-generated nude images of female classmates and other CSAM on X, along with their personal information. He admitted to using AI software and discussed plans to rape one victim.
Reported by FOX 59 Indianapolis
Tipton Middle School, Tipton IN February 11, 2026
At least 1 confirmed student victim (female)
Elijah Bogard, 26, a custodian and track coach, was found to have AI-generated CSAM after a Dropbox tip. He admitted to taking a student's Instagram photo and using an AI app to make it appear nude, and also possessed traditional CSAM.
Reported by WTHR (Indianapolis)
Uinta County, WY February 2026
At least 1 child (the defendant's own daughter)
The Uinta County prosecutor disclosed a case in which a defendant used AI to 'unclothe' an image of his own child to the point of lascivious display. The prosecution was brought under existing Wyoming child pornography statutes.
Reported by Wyoming News
Itasca High School, Itasca TX February 2026
Multiple students (mostly underage high schoolers) and adult staff members from Itasca ISD and neighboring districts
A former Itasca High School student (juvenile) used an AI platform to create digitally altered, sexually explicit nude images of current students and staff from Itasca ISD and neighboring districts. A fellow student discovered the images on the suspect's cellphone and reported it to authorities. The suspect had withdrawn from the district in January 2026.
Reported by NBC 5 DFW
xAI/Grok class action, South Carolina SC January 23, 2026
Anonymous South Carolina woman (and potential nationwide class)
An anonymous South Carolina woman filed a class action after Grok transformed her clothed photo on X into a revealing bikini image posted publicly. X initially refused to remove it for three days.
Reported by Bloomberg Law
Ocala, Marion County, FL January 15, 2026
Child celebrities (identities not disclosed); AI-generated depictions
David Rutter, 19, distributed 257 files of CSAM including 4 AI-generated videos of child celebrities in sexual acts plus additional AI-generated videos of early pubescent girls. Attempted to hide a device in a bag of cat food.
Reported by WCJB
Isidore Newman School, New Orleans LA January 8, 2026
Multiple teenage female students; at least 2 identified Newman students; numerous unknown girls from Instagram
A teacher and tennis coach was initially arrested after NCMEC flagged online accounts uploading CSAM tied to him. A search of his devices revealed he had taken Instagram photos of teenage girls, fed them into an AI platform, and generated nude deepfake images. He also took unauthorized photos of 5th-grade girls in his classroom.
Reported by NOLA.com / The Times-Picayune
Corbett High School, Corbett OR December 2, 2025
At least 1 confirmed student; potentially more
An AI-generated photo depicting a nude high school student was created and circulated among students. The principal investigated internally for three days before reporting to the Multnomah County Sheriff's Office.
Reported by KATU
Radnor High School, Radnor Township PA December 2, 2025
At least 5 freshman girls
Freshman girls at Radnor High School began receiving messages that a male classmate had made AI-generated pornographic videos of them, depicting them nude and engaged in sexual acts. The male student admitted to making videos but claimed they showed 'dancing in thong bikinis.' The actual videos were never recovered by police.
Reported by Philadelphia Inquirer
Washington State Patrol, Pierce County WA December 2025
1 male trooper (Collin Pearson) + 1 other trooper depicted without consent
Washington State Patrol personnel created an AI-generated deepfake video depicting Trooper Collin Pearson, an openly gay man, kissing another uniformed male trooper on a roadside. The video included a derogatory voiceover and was circulated among multiple troopers, as part of what Pearson alleges was a broader pattern of discrimination.
Reported by FOX 13 Seattle
Vandalia Christian School, Greensboro NC December 2025
Multiple students; the father of at least 2 victims spoke in court
Google flagged a sixth-grade teacher's Drive account to the National Center for Missing and Exploited Children via 7 CyberTips. Investigators found 111 CSAM images on his devices and determined he had used AI and photo-editing software to superimpose students' faces onto sexually explicit abuse imagery. Some images were reportedly created in a classroom setting.
Reported by FOX8 WGHP (Greensboro)
Corbett High School, Corbett OR December 2025
student body awareness
Students Julianne Huang and Richa Pandit presented to educators and administrators about explicit deepfakes, highlighting the need for resources and training.
Reported by katu.com
Baltimore (citywide lawsuit vs xAI) MD December 2025
school principal
A high school athletic director created and circulated a deepfake of the principal making discriminatory remarks.
Reported by Baltimore sues xAI over Grok deepfakes
Radnor, PA December 2025
multiple female students
Female students at Radnor High School victimized by classmate who allegedly altered their social media images into pornographic deepfakes.
Reported by 6abc.com
Aberdeen State Driver Licensing Office, SD October 31, 2025
At least 50 identified victims — women and girls whose driver's license photos were stolen
Mark Rathbun, a Senior Driver's License Examiner, used his access to the confidential driver's license database to steal photos, SSNs, and addresses of women and girls, then used AI to create deepfake pornographic images. Over 1 million images found on his devices.
Reported by South Dakota Searchlight
Westfield High School, Westfield NJ October 2025
A 17-year-old New Jersey girl (and other girls at Westfield High School)
A 17-year-old New Jersey girl sued the developers of ClothOff, an AI 'nudify' tool, after a classmate used it to create fake nude images of her and other girls at Westfield High School. The suit names AI/Robotics Venture Strategy 3 Ltd. and Telegram.
Reported by TechCrunch
Northview High School, Northview Public Schools MI October 2025
2 former students (at least one was a minor when the original photo was taken)
A health/PE teacher and boys basketball coach's ex-girlfriend discovered an AI-manipulated explicit image of two former students in a hidden folder on his iPad. He admitted to having 'the photograph generated' using an AI website, imposing nude bodies onto a legitimate photo to make the students appear naked.
Reported by WOOD TV8
Marion County / Eustis, FL September 29, 2025
2 juvenile girls (identifiable, known to perpetrator)
Lucius William Martin, 39, used AI nudify software to remove clothing from social media photos of two juvenile girls — a daughter of someone close to him and her friend. Attempted to destroy evidence by resetting his phone during arrest.
Reported by WCJB
Undisclosed location, Maine (children's soccer game) ME September 15, 2025
Multiple children photographed at a soccer game
A man attended a children's soccer game, photographed children playing, then used AI tools to transform those clothed photos into sexually explicit images. Police know his identity but could not charge him because AI-generated sexually explicit imagery of children was not criminalized under Maine law.
Reported by The Maine Monitor
U.S. District Court, Bangor, ME September 5, 2025
Unspecified minors
Jeffery Furlong, a former Maine state probation officer who oversaw people convicted of sex crimes, pled guilty to accessing CSAM. Investigators discovered he possessed at least one AI-generated image.
Reported by Bangor Daily News
North Polk Community School District, Polk County IA September 4, 2025
Multiple staff members
AI-generated deepfake videos depicting North Polk school staff members in inappropriate contexts were created and uploaded to TikTok. The district described the videos as threatening to safety.
Reported by Radio Iowa
Sixth Ward Middle School, Thibodaux LA August 26, 2025
At least 8 female middle school students (ages ~13) and 2 adults
Male students used AI 'nudify' tools to create sexually explicit deepfake images of female classmates, circulating them via Snapchat. A 13-year-old girl reported the images, and after seeing them on a bus, she physically assaulted the perpetrator, leading to her expulsion.
Reported by WAFB I-TEAM (Baton Rouge)
Alzada, Carter County, MT August 21, 2025
At least 1 child under 12
Shy Herbert McCutchan, 31, charged with sexual abuse of children for using AI to digitally alter an image of a Montana child taken from parents' public social media. Investigation started from NCMEC CyberTips.
Reported by Montana Department of Justice
Sixth Ward Middle School, Lafourche Parish LA August 2025
multiple female students and adults
Male student charged with 10 counts of unlawful dissemination of AI-created images of female students and adults. One victim was reportedly expelled after a physical altercation with the accused creator.
Reported by nola.com
UBIC Academy, Holly Hill FL June 20, 2025
Multiple students at UBIC Academy
David McKeown, a 47-year-old sixth-grade teacher, used AI technology to create child sexual abuse material using photos of real children, some of whom were his own students. He disseminated, downloaded, and shared the material on Discord during school hours using the school's Wi-Fi network.
Reported by FOX 35 Orlando
Batavia High School / Northern Illinois University, Batavia & DeKalb IL June 2, 2025
At least 5 identified underage female victims (ages 13–17), plus thousands of additional unidentified images.
Michael B. Erickson, 19, an NIU student and Batavia HS graduate, was charged after authorities found over 20,000 images, including AI-generated deepfake nudes of former Batavia HS classmates, on his devices. He admitted to selling the images, with one victim as young as 13.
Reported by WGN-TV
Cypress, Harris County, TX May 27, 2025
Images depicted children (specific victims under investigation)
Brian Vincent Rausch, an FBI intelligence analyst, was arrested with over 1,000 explicit images including real CSAM and AI-generated images 'indistinguishable from actual children.' Search warrant executed at his home.
Reported by KPRC 2
Milwaukee WI May 2025
1 woman (unnamed elected official)
After a brief relationship ended, a former police officer subjected an unnamed elected official to a years-long harassment campaign. He used an AI 'nudify' app to digitally remove clothing from a clothed photo of her, then texted the manipulated nude image to her.
Reported by FOX6 Milwaukee
Valley City, ND April 15, 2025
At least 5 teenage girls from the local community
Mitchell Kohler, 49, searched Instagram for images of teenage girls he knew in the Valley City community, then used AI software to generate nude deepfake images. Police found 'before and after' photographs on his devices.
Reported by Fargo
Davies High School, Fargo, ND April 2025
Many victims across multiple Fargo-area schools (predominantly female students)
North Dakota BCI launched a probe after a West Fargo middle school resource officer discovered AI-generated CSAM created using a female student's face. Investigation expanded to Davies High School where dozens of students used Snapchat to create and distribute CSAM.
Reported by Fargo
Northern Illinois University, DeKalb IL April 2025
At least 5 identified former high school classmates (ages 13-17); thousands of additional unidentified victims
An NIU student used AI 'nudification' tools from his dorm room to create nude deepfake images of female acquaintances, mostly former high school classmates who were minors. He also sold these AI-generated images via Snapchat and Telegram. The investigation was triggered by a CyberTip from the National Center for Missing and Exploited Children.
Reported by FOX 32 Chicago
Cascade High School, Cascade IA March 25, 2025
44 female students
Male students used AI 'nudify' apps to create fake nude images of 44 female classmates from their social media photos. The images were shared among students, leading victims to issue a joint public statement and demand stronger district policies.
Reported by Telegraph Herald (Dubuque)
Newtown Middle School, Newtown Township PA March 14, 2025
~11–12 female students, all middle-school age
A male student at Newtown Middle School used AI technology to create deepfake pornographic images of approximately 11-12 female classmates. The images were reported to the principal by other boys, but the school waited approximately five days to contact police. A subpoena to Snapchat identified a second boy involved and additional victims.
Reported by Yahoo News/Bucks County Courier Times
Baranoff Elementary School, Austin TX March 12, 2025
At least 2 identified 10-year-old students; potentially more from his work at ~20 schools
A Texas DPS undercover operation identified a 5th-grade teacher's computer sharing CSAM via peer-to-peer networks. A search warrant uncovered over 365,000 CSAM files, and investigators found he had taken classroom photos of his students, then used an AI image generator to digitally 'de-clothe' at least two 10-year-old students. He had also worked as a substitute at approximately 20 other Austin ISD schools.
Reported by CBS Austin
Cascade, IA March 2025
44 female students
Four male classmates created deepfake nude images of 44 female students at Cascade High School from their social media photos. The victims formed the 'Voices of the Strong 44' advocacy group; the school reportedly told victims not to discuss the incident.
Reported by telegraphherald.com
Caverna High School, Glasgow KY February 27, 2025
1 male student, age 16
Sixteen-year-old Elijah 'Eli' Heacock received a threatening text message containing AI-generated nude images purporting to be of him, accompanied by a demand for $3,000 to prevent dissemination. Heacock died from a self-inflicted gunshot wound that same night after sending a partial payment.
Reported by CBS News
Stuart, Martin County, FL February 7, 2025
AI-generated images depicting children as young as infants
Leonel Alvarado-Lizano, 29, used AI to generate over 1,000 images of child pornography depicting adults sexually abusing children as young as infants. First case under Florida's new statute 827.072 at Martin County Sheriff's Office.
Reported by West Palm Beach
Legg Middle School, Coldwater MI February 2025
Multiple female students at Legg Middle School
A woodshop teacher and soccer coach was initially investigated after hidden cameras were found in his home. While seizing his phone for that case, investigators discovered he had photographed clothed underage female students and used an AI application to alter them into nude photographs. Many victims knew him as their teacher or coach.
Reported by The Detroit News
Multiple schools in North St. Paul/Maplewood/Oakdale and Stillwater MN January 22, 2025
At least 100 child victims
William Michael Haslach, 30, a school employee in multiple districts, discreetly photographed children in his care and used those images to produce AI-generated/morphed photos of them engaging in sexually explicit conduct. He also possessed over 800 files of CSAM.
Reported by FOX 9 Minneapolis
Brownwood, TX January 8, 2025
AI-generated (no identifiable real-child victims)
Daniel Weatherly, 42, used AI to create images of minors aged 5-8 in sexually explicit conduct via text prompts. Entirely AI-generated from prompts, not deepfakes of real children.
Reported by U.S. Department of Justice
Delaware County, OH January 2025
At least 10 adult women plus children of victims
James Strahler II, 37, used AI nudify apps (ClothOff, Undress AI) to create deepfake pornographic images of at least 10 women — primarily ex-girlfriends — and sent them to victims, their families, and coworkers as harassment and sextortion.
Reported by 404 Media
Columbus, OH January 2025
Multiple minor family members
Austin Pittman, 25, a former Army soldier, used AI tools to digitally alter clothed photos of his own minor family members to make them appear nude. Federal investigators found nearly 200 images on his phone.
Reported by Columbus
Hingham Middle School, Hingham MA January 2025
At least 2 female middle school students
A male middle school student used an AI 'nudify' website to create a deepfake nude image of a female classmate, which was shared in school hallways during school hours and via text message. The student who created the image admitted to it in a text message apology. The victim's mother described the image as looking 'real and not photoshopped'.
Reported by Boston 25 News
Oldsmar, Pinellas County, FL 2025
Multiple (specific identifiable victims not detailed)
Justin Ryan Culmo, 40, pled guilty to producing, possessing, and distributing tens of thousands of CSAM files: approximately 85,000 images, 845 videos, and 8,500 AI-generated images. The material involved children as young as infants.
Reported by ice.gov
Cascade, MT 2025
~450 images of female victims aged 6-17; multiple former classmates identified
Dalten Johnson, a former Cascade High School student, used AI to create CSAM from childhood social media photos of former classmates. He produced approximately 450 manipulated images of female children aged 6-17.
Reported by Montana
Gilmer High School, Gilmer County GA December 3, 2024
Multiple minors (76-count indictment indicates a significant number)
Ronald Richardson, a 48-year-old vending machine vendor who serviced machines at Gilmer High School, was accused of taking normal, clothed photos of minors from their social media and using AI technology to make them appear nude. A student reported Richardson had asked her for pictures via social media.
Reported by WSB-TV Channel 2 (Atlanta)
Aliso Niguel High School, Aliso Viejo CA December 2024
Multiple female students
A student at Aliso Niguel High School in Orange County used AI to create deepfake nude images of female classmates. The incident was reported to school administrators and local police.
Reported by Los Angeles Times
Spring Hill, Hernando County, FL November 19, 2024
Not specified as targeting identifiable individuals
Jack Holden Boston, 29, found with child pornography including AI-generated material. First case of its kind in Hernando County. Further forensic analysis yielded additional AI-generated content.
Reported by NBC Tampa
Corinth Middle School, Corinth MS November 19, 2024
8 identified female students, ages 14-16
The school's content-monitoring app flagged a 'severe/sexual' alert on a teacher's school-issued laptop. Investigation revealed the teacher had used an AI website to create sexually explicit videos of 8 female students, ages 14-16, by inputting their still images and detailed prompts. He uploaded the content to his personal Google Drive during the school day.
Reported by WMC Action News 5
Homer Middle School, Homer AK November 4, 2024
10–11 female middle school students
Two boys at Homer Middle School used artificial intelligence to create synthetic photographs of multiple female classmates. The investigation began when one girl reported the incident, leading to device seizures and a broader discovery of distribution to other classmates.
Reported by Alaska Public Media/KBBI
Santa Clara County schools CA October 2024
Multiple female students across several schools
Multiple schools in Santa Clara County reported incidents of students using AI tools to generate deepfake nude images of classmates. The Santa Clara County DA's office confirmed investigating cases across several schools.
Reported by San Jose Mercury News
NC State University, Raleigh NC October 2024
28 female college students (sorority members)
At least 28 sorority members at North Carolina State University had their social media photos used to create AI-generated pornographic images, which were posted on the pornographic website erome.com. The images showed victims' faces artificially imposed on nude women performing sexual acts, with a folder labeled 'NC State'.
Reported by WRAL (April 2025)
Maple Dale School, Fox Point WI October 2024
2 female students, age 13
A 13-year-old male student took photos of two 13-year-old female classmates from Instagram, ran them through an AI 'nudify' app to create deepfake nude images, and shared them via Snapchat.
Reported by FOX 6 Milwaukee
NewsChannel 5, Nashville TN October 2024
1 woman (Bree Smith); additionally, viewers were targeted by scams using her likeness
Unknown external actors created AI-generated deepfake images and videos placing meteorologist Bree Smith's face onto semi-nude and nude bodies, using them on hundreds of impersonator social media accounts to run sextortion scams. When she sought help, her employer allegedly dismissed her concerns, contributing to a hostile workplace.
Reported by CBS News
North Carolina State University, Raleigh NC October 2024
28+ female sorority members
A recent NC State graduate used AI tools to generate sexually explicit deepfake images of at least 28 female sorority members. He uploaded these images to a pornography website, organizing them into folders with victims' real names. A student discovered the content and reported it to campus police.
Reported by WRAL News
New Hampshire presidential primary (statewide robocall) NH January 2024
New Hampshire voters, democratic process, President Joe Biden (impersonated)
An AI-generated robocall impersonating President Joe Biden was sent to voters in New Hampshire, urging them not to vote in the state's primary election. The call utilized deepfake audio technology to mimic Biden's voice, aiming to suppress voter turnout.
Reported by unionleader.com
Dover-Sherborn Middle School, Dover MA September 16, 2024
11 female middle school students directly informed; additional female students had photos collected
Male middle school students allegedly created and shared AI-generated explicit images of 11 female classmates, accompanied by lewd messages. An earlier investigation found four male students had collected approximately 350 photos of female students on Discord. The AI-generation element of the images is contested by district officials.
Reported by Daily Gazette/Tribune
Plymouth MA September 11, 2024
7+ women, including a university professor (primary victim) and two minors
Over 16 years, a man cyberstalked a university professor and 6 other women, using AI tools, photo-editing software, and AI-driven chatbots to create fake nude and semi-nude images. He posted thousands of AI-generated pornographic images across nearly 30 platforms, doxxed victims, and programmed chatbots to impersonate the professor.
Reported by U.S. Department of Justice
New Mexico v. Snap Inc. sextortion lawsuit NM September 4, 2024
Minors (as alleged in the complaint)
New Mexico Attorney General Raúl Torrez sued Snap Inc., alleging Snapchat's design features foster CSAM sharing and facilitate sextortion. An undercover DOJ investigation found accounts attempting to coerce a decoy minor.
Reported by CNBC
Austin, TX September 2024
11 underage girls
Jack Bullington, 19, cropped faces of 11 underage girls onto nude bodies using AI and posted the images on X/Twitter. He also sent images overseas and agreed to pay for modifications. Nearly 100 altered images found.
Reported by Austin
Washington High School, Pensacola FL September 2024
30-50 female students from multiple high schools
An 18-year-old used AI nudify apps to digitally 'undress' photos of 30-50 young women from multiple Pensacola-area high schools. Over 175 manipulated photos found on his phone. His ex-girlfriend discovered them and alerted victims.
Reported by ABC Pensacola
Joint Base Elmendorf-Richardson, Alaska AK August 23, 2024
Real children known to the perpetrator
Army Specialist Seth Herrera, 34, was charged with using AI chatbots to create CSAM depicting real children he personally knew, including infants. Thousands of illicit images were found on his phones.
Reported by DOJ
San Francisco CA August 15, 2024
General public (implied by 'nonconsensual deepfake pornography')
San Francisco City Attorney David Chiu sued operators of 16 of the world's most-visited AI 'undressing' websites, which collectively had over 200 million visits. As of June 2025, Briver LLC settled, and ten websites are offline or inaccessible in California.
Reported by SF City Attorney
Vero Beach, Indian River County, FL August 13, 2024
AI-generated CSAM (not targeting identifiable individuals)
Phillip McCorkle, 38, a theater employee, used AI image generators to create child pornography and distributed it via Kik. Arrested as part of Operation Cyberstorm, a county-wide operation targeting child pornography.
Reported by WPEC
Multiple schools across South Korea KR August 2024
Thousands of female middle and high school students.
A widespread deepfake pornography ring targeted thousands of female middle and high school students across the country. AI was used to superimpose their faces onto explicit images and videos, which were then shared on Telegram and other platforms.
Reported by npr.org
Farmington, MO July 18, 2024
AI-generated depictions of prepubescent girls and teenagers
Joel Kerbrat, 70, a registered sex offender, used an AI image generation program to create CSAM. He was a member of a Discord group promoting AI-generated pornographic images and visited a website titled 'build a preteen' hundreds of times.
Reported by Eastern District of Missouri
Virginia Beach, VA July 2024
1 female, age 18
Maggie Southall Bartz, 18, received 12 AI-generated nude images of herself via Instagram DM. The perpetrator stole photos from her childhood modeling portfolio and used AI to create realistic nude images.
Reported by Hampton Roads
U.S. Military Academy at West Point, West Point NY July 2024
1 woman (described as a 'fellow service member')
A West Point cadet used AI to alter a publicly available photo of a woman, creating a fake nude image. He texted it to the victim asking 'how accurate is this?' and 'is this you?', then threatened to publicly release the manipulated image unless she sent him explicit photos, constituting AI-enabled sextortion.
Reported by DefenseScoop
The Woodlands, Montgomery County, TX June 28, 2024
At least 1 female, age 17
Roman Shoffner, 30, used an AI nudify program to remove clothing from a photo of a 17-year-old girl (his wife's teenage relative). His wife discovered the image on his phone. First AI child pornography arrest in Montgomery County.
Reported by FOX 26 Houston
Holmen, Wisconsin WI May 20, 2024
Non-real children (as depicted in AI-generated images)
Steven Anderegg, 42, was the first person federally arrested for AI-generated CSAM, allegedly using Stable Diffusion to create over 13,000 sexually explicit images of non-real children. A federal judge dismissed the possession charge on First Amendment grounds.
Reported by Engadget
Idaho Falls, ID May 1, 2024
Thousands of AI-generated images; also secretly filmed teenage girls at mall
Lloyd Perry, 52, a registered sex offender, used an AI image-generating website to create thousands of AI-generated images depicting children ages 9-12 in sexual scenarios described as 'sadistic in nature.' Also had a hidden camera filming teenage girls at Grand Teton Mall.
Reported by Boise
Lynbrook High School, San Jose CA May 2024
Multiple female students from cross-school peer groups
A Discord server was discovered containing AI-generated deepfake images targeting students across multiple San Jose-area schools. The images were created using photos scraped from students' social media accounts.
Reported by San Jose Mercury News
Lancaster Country Day School, Lancaster PA May 2024
59+ female students under 18
Two male students at Lancaster Country Day School created approximately 350 deepfake images targeting at least 59 girls under 18, using school photos, yearbooks, Instagram, TikTok, and FaceTime screenshots.
Reported by inquirer.com
Nevada High School, Nevada IA April 24, 2024
Multiple female students (exact number undisclosed)
AI-generated explicit photographs of female students were created using their social media photos and circulated across the high school and middle school. Parents publicly identified their daughters as victims. The Story County Sheriff's Office investigated but found no clear legal basis for charges at the time.
Reported by KCRG (ABC Cedar Rapids)
Maranatha Christian Academy, Brooklyn Park MN April 17, 2024
1 confirmed victim — a 13-year-old female student
Jason Polzin, 50, a counselor and softball coach, secretly recorded a 13-year-old female student changing clothes. A police search found 52 covert photos and 165 images of the victim's face superimposed onto computer-generated nude bodies.
Reported by FOX 9 Minneapolis
Fairfax High School, Los Angeles CA April 9, 2024
Number and demographics not publicly disclosed
LAUSD announced an investigation into allegations of inappropriate photos being created and disseminated within the Fairfax High School community. A preliminary investigation found the images were created and shared on a third-party messaging app.
Reported by FOX 11 Los Angeles
Whatcom County Sheriff's Office, Whatcom County WA April 2024
1 female detective (Samantha Robinson)
A detective allegedly created an AI-generated video using a real photo of a female colleague and another officer, manipulating it to depict the colleague having sexual contact with the other officer. The detective shared the video within the department, and the victim faced retaliation after reporting it. No internal investigation was initiated by the Sheriff's Office.
Reported by Cascadia Daily News
Fairfax High School, Los Angeles CA April 2024
Potentially numerous students across the district; the district-wide alert was both a preventative and a responsive measure to a widespread threat.
Following a surge in deepfake incidents across California, LAUSD issued a district-wide alert to parents and staff regarding the creation and circulation of AI-generated intimate images of students. The district emphasized the illegality and psychological harm of such acts, providing resources and guidance for reporting.
Reported by govtech.com
St. Thomas Aquinas Catholic Secondary School, London, Ontario CA April 2024
Female students in various Ontario schools.
Incidents were reported where deepfake technology was used to create and share non-consensual explicit images of students in Ontario schools. These cases involved the manipulation of real photos to generate fake nude content. The content was shared among peers, causing significant harm.
Reported by cbc.ca
Nevada Community School District, IA IA April 2024
multiple students
Parents reported AI-generated nude images of their children circulating at Nevada High School. The school district initially said there was nothing it could do.
Reported by weareiowa.com
Laguna Beach High School, Laguna Beach CA March 25, 2024
Multiple female students
AI-generated synthetic or 'inappropriate' photos of students were created and circulated among the student body, primarily via text messages. The principal informed parents by letter, and the Laguna Beach Police Department assisted in the investigation.
Reported by CBS News Los Angeles
Beacon Christian Academy, New Port Richey FL March 19, 2024
At least 3 elementary school students
Steven Guy Houser, 67, a third-grade science teacher, used AI to morph yearbook photos of 3 students onto nude bodies. He told deputies he was 'curious about what his students looked like naked.' Investigation triggered by NCMEC CyberTipline.
Reported by Tampa Bay Times
Richmond-Burton Community High School, Richmond IL March 11, 2024
22–30 female students and 3 teachers
A student used AI 'nudify' technology to alter prom photos of female classmates into sexually explicit nude images, which were then distributed. Victim Stevie Hyder, 15, spoke out publicly after discovering deepfakes made from her photo.
Reported by Shaw Local / Northwest Herald
Laguna Beach, CA CA March 2024
multiple students
Laguna Beach High School launched an investigation into AI-generated inappropriate images of students circulating among the student body.
Reported by edsource.org
Beverly Vista Middle School, Beverly Hills CA February 21, 2024
16 eighth-grade girls, ages 13–14
Five eighth-grade boys used generative AI to superimpose 16 female classmates' faces onto AI-generated synthetic bodies. The images were shared through messaging apps, prompting a criminal investigation by police and the LA County District Attorney's Office.
Reported by NBC News
Oklahoma City OK February 2024
2 women (sisters, former coworkers)
A man used AI to generate explicit nude images of two of his former female coworkers, who are sisters, depicting them standing naked beside each other. He then posted these AI-generated images to the social media platform X (formerly Twitter).
Reported by OKCFOX / KOKH
Oklahoma City (Sebastian Gokool case) OK February 2024
A minor
An Oklahoma City man was arrested for allegedly engaging in online sextortion with a minor. He reportedly used social media to contact the victim and threatened to release explicit images if the victim did not comply with his demands. The investigation was conducted by the Oklahoma City Police Department.
Reported by okcfox.com
Beverly Vista Middle School, Beverly Hills CA February 2024
16 female students
Five 8th graders at Beverly Vista Middle School were expelled for creating and sharing deepfake nude images of 16 female classmates using AI tools.
Reported by nbcnews.com
Jensen Beach, Martin County, FL FL January 26, 2024
At least 1 young girl (neighbor)
Daniel Clark Warren, 51, took photos of a young girl neighbor and used AI to digitally remove her clothing and place her in sexual situations. He was guided by online contacts on what software to use.
Reported by West Palm Beach
Lake Zurich High School, Lake Zurich IL January 25, 2024
Female students at Lake Zurich High School
A male student at Lake Zurich High School was charged with child pornography after allegedly creating and sharing AI-generated nude images of female classmates. The images were digitally altered to depict the victims without their clothing.
Reported by fox32chicago.com
Pikesville High School, Baltimore County MD January 16, 2024
1 (Principal Eric Eiswert)
The Athletic Director used OpenAI tools to create a deepfake audio recording of Principal Eric Eiswert appearing to make racist and antisemitic comments. The recording was emailed to teachers and went viral, resulting in threats against Eiswert. The perpetrator's motive was retaliation over a contract dispute.
Reported by CNN
Nashville (Bree Smith case) TN January 2024
Numerous minors across Tennessee.
The Tennessee Bureau of Investigation (TBI) and local law enforcement agencies issued warnings about a surge in online sextortion targeting children and teenagers. Predators often use social media and gaming platforms to connect with minors, coerce them into sending explicit images, and then demand money to prevent public dissemination. This trend has led to severe emotional and psychological trauma for victims.
Reported by cbsnews.com
Pikesville High School, Baltimore County MD January 2024
Principal Eric Eiswert, Pikesville High School, Baltimore County Public Schools.
A deepfake audio recording falsely depicting the Pikesville High School principal making racist and antisemitic remarks was circulated online. The recording was digitally fabricated; the school system's athletic director was later arrested and charged with creating it.
Reported by washingtonpost.com
Aliso Viejo Middle School, Aliso Viejo CA 2024
At least 1 confirmed female victim, age 13; multiple additional victims
A student used AI software to create a synthetic photo of a 13-year-old female classmate by placing the girl's face on another body. The district's investigation found that the student had created AI-generated images of multiple student victims, and another student further shared the photos.
Reported by EdSource
Gilmer High School, Gilmer County GA 2024
Multiple minors (Gilmer High School students)
A vending machine vendor with access to Gilmer High School used AI to create images making minors appear naked. He also asked at least one student to send pictures via social media, prompting the student to report him to a school resource officer.
Reported by Yahoo News / WSB-TV Atlanta
Pinecrest Cove Preparatory Academy, Miami FL December 6, 2023
Approximately 24 boys and girls, ages 12–13
Two male students used an AI 'nudify' application to create nude deepfake images of approximately 24 classmates, both male and female, by using photos from the school's social media accounts. The fake explicit images were shared between the two boys. Administrators learned of the allegations and reported to Miami-Dade Police.
Reported by CBS News Miami
Weldon Valley High School, Weldona CO December 2023
3 underage female students
A student used generative AI to blend authentic images of three female classmates' faces and clothed bodies with computer-generated synthetic bodies. Police discovered the images on the student's school Chromebook after an automated alert.
Reported by Colorado Politics
Demopolis Middle School, Demopolis AL December 2023
6 female middle school students
Two male middle school students used an AI website to create deepfake child sexual abuse material by superimposing female classmates' faces onto pornographic images. The images were created off-campus and shared among students, leading parents to come forward publicly. School officials initially stated they could only address on-campus incidents.
Reported by WBRC Fox 6 News (Birmingham)
Lancaster Country Day School, Lancaster PA November 2023
60 total — all female; 48 were Lancaster Country Day School students and 12 were acquaintances. All but one were minors.
Two 14-year-old male students created approximately 347 deepfake nude images and videos of 60 female victims, mostly minors, using AI nudify apps. They harvested photos from social media and yearbooks, sharing the results in a private Discord chat. The school initially failed to contact law enforcement, allowing the abuse to continue for six months.
Reported by Philadelphia Inquirer
Multiple schools across Australia AU November 2023
Female students in various Australian schools.
Reports emerged of deepfake nude images of students, primarily female, being created and shared in schools across Australia. The incidents, driven by students' misuse of readily available AI tools, led to police involvement and significant disciplinary action within the school system.
Reported by cnn.com
Westfield High School, Westfield NJ October 20, 2023
30+ female students, all 10th-grade girls aged approximately 14–15
Male sophomore students at Westfield High School used the ClothOff AI app to create fake nude images of over 30 female classmates from their social media photos. The hyper-realistic images were distributed via Snapchat group chats. The incident was discovered when a boy inadvertently told one of the targeted girls.
Reported by CNN
Issaquah High School, Issaquah WA October 18, 2023
At least 7 female students (ages 14–15) and 1 adult female staff member
A 14-year-old male student used a web-based 'nudify' app to create AI-generated synthetic images of at least six female classmates and one adult female staff member. He distributed the results via Snapchat, text messages, and in person.
Reported by 404 Media
Aledo High School, Aledo TX October 2, 2023
Elliston Berry and approximately 9 other female classmates, all freshmen aged 14–15
A 15-year-old male student used an AI 'nudify' app to transform innocent Instagram photos of classmate Elliston Berry and at least eight other freshman girls into realistic-looking nude images. The deepfakes were distributed via an anonymous Snapchat account and circulated rapidly throughout the school. Berry discovered the images spreading among classmates the morning after homecoming.
Reported by WFAA
Aledo, Texas TX October 2, 2023
Elliston Berry and at least 8 other girls
Elliston Berry, 14, of Aledo, Texas, discovered a classmate used AI to create fake nude images of her and at least 8 other girls, distributed via Snapchat. Her family's efforts to remove the images were unsuccessful until Sen. Ted Cruz's office intervened.
Reported by Texas Tribune
Multiple schools across the UK UK October 2023
Female students in various schools across the UK.
Several incidents were reported where deepfake nude images of female students were created and shared within school communities. These incidents often involved male students using readily available AI tools to generate explicit content from classmates' photos. The content was typically distributed through messaging apps.
Reported by lbc.co.uk
Issaquah High School, Issaquah WA October 2023
Students across various school districts in Washington.
Washington state officials issued warnings regarding the creation and sharing of AI-generated intimate images of students. Several school districts reported incidents, leading to investigations and increased vigilance.
Reported by 404media.co
Westfield High School, Westfield NJ October 2023
multiple female students
Male students at Westfield High School distributed deepfake nude images of female classmates. Victim Francesca Mani became a national activist; no criminal charges were filed.
Reported by cnn.com
Almendralejo, Extremadura ES September 2023
At least 20 female high school students.
Deepfake nude images of at least 20 female students from a high school were created and circulated among male students via WhatsApp. The images were generated using AI from real photos of the girls without their consent. This incident caused significant distress among the victims and their families.
Reported by euronews.com
Calabasas High School, Calabasas CA August 2023
1 confirmed female student, age 16
A 16-year-old female student accused a former friend of secretly photographing her and then using AI tools to place her face onto synthetic bodies. Some of the resulting images were allegedly uploaded to an exploitative material website and circulated at the school.
Reported by ABC7 Los Angeles
Bella Vista, AR AR July 21, 2023
AI-generated images of children ages 8-13
Michael Timothy Carey, 54, was arrested after detectives discovered 420 files of CSAM. During interviews, Carey admitted he had begun creating his own CSAM images using an AI generator, with a 'preferred age of eight to thirteen.'
Reported by KNWA/NWA Homepage
Bishop Kenny High School / Fletcher High School, Jacksonville FL July 2023
1 female student, age 16
A Fletcher High School student took a photo from the Instagram account of Brooke Curry, 16, a Bishop Kenny High School student, and used AI to generate a nude deepfake, posting it to Snapchat tagged with her name. The image spread across multiple platforms for hours.
Reported by WJXT Jacksonville
Omaha, NE NE May 25, 2023
Hundreds of AI-generated CSAM images
An Omaha man, investigated by HSI Kansas City after a CyberTipline report, was found to have 26 images of child sexual abuse and several hundred AI-created CSAM images and videos on his devices.
Reported by HSI
Klein ISD, Houston TX April 13, 2023
1 female teacher
A student superimposed a teacher's face, taken from her social media, onto revealing images, then created a social media account using them to solicit money for more sexual images, attempting to monetize the deepfake content.
Reported by FOX 26 Houston
Shotwell Middle School, Aldine ISD, Houston TX April 2023
1 female teacher
Students created a deepfake pornographic video using a female teacher's face superimposed onto explicit content, then airdropped it to staff and students. The teacher identified the original video that had been altered with her face.
Reported by ABC13 Houston
Carmel High School, Putnam County NY March 2023
Students across various school districts in New York.
New York state officials, including the Attorney General, issued warnings about the creation and sharing of AI-generated intimate images of students. Several school districts reported incidents, leading to investigations and policy reviews.
Reported by washingtonpost.com
Bloom-Carroll High School, Carroll, OH OH 2023
At least 2+ female students, age ~15
A male student on the wrestling team took a beach vacation photo of 15-year-old Liz Cline from social media and used AI tools to create fake nude images of her and other female students, circulating them among peers.
Reported by Cincinnati
Bloom-Carroll High School, Carroll OH 2023
At least 1 confirmed female student (likely more)
A male student used AI-based online apps to create a fake nude image of 15-year-old sophomore Liz Cline from her social media photo, which was then circulated among peers. Liz's twin brother discovered the images.
Reported by Local 12 / WKRC (Cincinnati)
Los Angeles Police Department, Los Angeles CA September 2022
1 female LAPD Captain (Lillian Carranza)
A digitally doctored nude photograph was created to resemble LAPD Captain Lillian Carranza and was widely circulated among officers, who viewed it and made crude comments. When Carranza reported it, the Chief refused to issue a department-wide clarification, and no disciplinary action was taken. Carranza was hospitalized with severe health issues.
Reported by Police1 / LA Times
Marquette, Michigan MI March 25, 2022
Jordan DeMay (and 100+ victims including at least 11 minors targeted by the scheme)
Jordan DeMay, 17, died by suicide within six hours of being sextorted by Nigerian brothers Samuel and Samson Ogoshi, who used a hacked Instagram account. The brothers were extradited from Nigeria and sentenced to 17.5 years each.
Reported by NBC News
San Jose, California CA February 26, 2022
Ryan Last (and thousands of victims across multiple countries)
Ryan Last, 17, died by suicide after being sextorted by someone posing as a young woman. He paid $150 under threat before taking his life. Four Ivorian men were arrested in connection with the scheme in May 2025.
Reported by CBS San Francisco
Undisclosed school, Delaware DE 2022
Amelia Kramer (age 15) and over a dozen additional victims, nearly all minors
A Delaware State Trooper informed 15-year-old Amelia Kramer's family that her face had been digitally added to pornographic images spread across the internet. Over a dozen additional victims were identified alongside her, nearly all minors.
Reported by Delaware
Hallandale Beach, FL FL February 2021
1 female elected official (Sabrina Javellana, age ~23)
City Commissioner Sabrina Javellana discovered someone stole her social media photos and used deepfake technology to create nude images, posted on 4chan with misogynistic comments. Law enforcement said it was not a crime under existing FL law.
Reported by NPR South Florida
MacArthur High School, Levittown NY August 2019
~14 women, all former MacArthur High School students; many images depicted them as minors
Patrick Carey, a former student, created over 1,200 deepfake nude images of approximately 14 women who had attended MacArthur High School, some as young as 14. He posted them on a pornographic website with victims' personal information, encouraging harassment. Victims received sexually explicit voicemails and threatening messages from strangers.
Reported by NBC New York