Steven Anderegg allegedly used the Stable Diffusion AI model to generate the images; if convicted, he could face up to 70 years in prison
THE GUARDIAN
The FBI has charged a US man with creating more than 10,000 sexually explicit and abusive images of children, which he allegedly generated using a popular artificial intelligence tool. Authorities also accused the man, 42-year-old Steven Anderegg, of sending pornographic AI-made images to a 15-year-old boy over Instagram.
Anderegg crafted about 13,000 “hyper-realistic images of nude and semi-clothed prepubescent children”, prosecutors stated in an indictment released on Monday, many of which depicted children touching their genitals or being sexually abused by adult men. Evidence from the Wisconsin man’s laptop allegedly showed he used the popular Stable Diffusion AI model, which turns text descriptions into images.
Anderegg’s charges came after the National Center for Missing & Exploited Children (NCMEC) received two reports last year that flagged his Instagram account, which prompted law enforcement officials to monitor his activity on the social network, obtain information from Instagram and eventually secure a search warrant. Authorities seized his laptop and found thousands of generative AI images, according to the indictment against him, as well as a history of using “extremely specific and explicit prompts” to create abusive material.
Anderegg faces four counts of creating, distributing and possessing child sexual abuse material and sending explicit material to a child under 16. If convicted, he faces a maximum sentence of about 70 years in prison, with 404 Media reporting that the case is one of the first times the FBI has charged someone with generating AI child sexual abuse material. Last month, a man in Florida was arrested for allegedly taking a picture of his neighbor’s child and using AI to create sexually explicit imagery from the photo.