Key Takeaways:
A disturbing case in Maine has exposed a critical gap in state law that prevents police from investigating AI-generated child sexual abuse material, even when they know the perpetrator's identity.
A Maine man went to watch a children's soccer game. He snapped photos of kids playing. Then he went home and used artificial intelligence to turn the otherwise innocuous pictures into sexually explicit images. Police know who he is, but there is nothing they can do, because the images are legal to possess under state law, according to Maine State Police Lt. Jason Richards, who is in charge of the Computer Crimes Unit.
Legal loophole creates enforcement crisis
Richards said his team must now discard any material touched by AI, a restriction that comes at a time when his unit is already overwhelmed with cases.
In 2020, Richards' team received 700 tips relating to child sexual abuse material and reports of adults sexually exploiting minors online in Maine. By the end of 2025, Richards said he expects his team will have received more than 3,000 tips. The unit can investigate only about 14% of them in any given year.
The problem stems from Maine's outdated legal definitions. While child sexual abuse material has been illegal for decades under both federal and state law, the rapid development of generative AI — which uses models to create new content based on user prompts — means Maine's definition of those images has lagged behind those of other states.
Maine stands as national outlier
Across the country, 43 states have created laws outlawing sexual deepfakes, and 28 states have banned the creation of AI-generated child sexual abuse material. According to research by Enough Abuse, as of August 2025, 45 states have enacted laws that criminalize AI-generated or computer-edited CSAM, while five states and D.C. have not.
"Maine's lack of a law at least labeling morphed images of children as child sexual abuse material makes the state an outlier," said Riana Pfefferkorn, a policy fellow at the Stanford Institute for Human-Centered AI.
Explosive growth in AI-generated abuse material
The urgency of addressing this legal gap has intensified due to an alarming surge in AI-generated child sexual abuse material nationwide. The National Center for Missing and Exploited Children (NCMEC) reports that it received 67,000 reports of AI-generated CSAM in all of 2024, and 485,000 in the first half of 2025 alone — a 624% increase.
"It's a canary in the coal mine," said Derek Ray-Hill, interim chief executive of the Internet Watch Foundation. "There is an absolute tsunami we are seeing."
Legislative response in the works
Rep. Amy Kuhn, D-Falmouth, who chairs the House Judiciary Committee, has been working to close the legal loophole. Kuhn said she plans to propose an expanded definition of sexually explicit material, mostly unchanged from her earlier version, when the Legislature reconvenes in January.
Earlier this year, Kuhn successfully passed legislation expanding Maine's pre-existing "revenge porn" statute to cover the dissemination of altered, or so-called "morphed," images as a form of harassment. But that law did not classify morphed images of children as child sexual abuse material.
"It's not what could happen, it is happening, and this is not material that anyone is OK with in that it should be criminalized," Shira Burns, the executive director of the Maine Prosecutors' Association, said.
Future legislative action expected
Both law enforcement and legal experts express confidence that the Legislature will act in 2026. "We're on the tail end of addressing this issue, but I am very confident that this is something that the judiciary will look at, and we will be able to get a version through, because it's needed," Burns said.
Come 2026, both Burns and Kuhn said they are confident the Legislature will close the loophole, because there are plenty of model policies to follow from across the country.
The case highlights the ongoing challenge law enforcement faces as technology outpaces legislation, leaving children vulnerable to new forms of digital exploitation while authorities watch helplessly from the sidelines.