Repercussions come for AI-generated child porn
The case is novel. It’s the first time the federal government has brought charges over child pornography fully generated by AI. The government said that Anderegg created a trove of 13,000 fake images using the text-to-image generator Stable Diffusion, made by the company Stability AI, along with certain add-ons to the technology. This isn’t the first controversy involving Stable Diffusion, though. In December, Stanford University researchers found that LAION-5B, a dataset used to train Stable Diffusion, included 1,679 images of child sexual abuse material.
This case could set a new precedent for an open question: Is AI-generated child pornography — for all intents and purposes under the law — child pornography?