New from 404 Media: FBI arrested a man for allegedly using Stable Diffusion, a text-to-image AI model, to create thousands of images of minors. One of the first known instances of the FBI investigating someone for using AI to create child sexual abuse material https://www.404media.co/fbi-arrests-man-for-generating-ai-child-sexual-abuse-imagery/
@KSargent @josephcox https://journals.sagepub.com/doi/10.1177/00111287221115647 It seems the vast majority of sexual abusers did not use porn for that purpose. And the same probably applies to most people who consume porn.
I won't speak to this particular technology, but the bigger problem is that *censorship itself* can harm a lot of people, compared to a theoretical (or even a small concrete) risk. Offenders could also likely just substitute something else to commit their crimes. And if someone does commit a crime (and as you noted, this is a crime), they can be punished for it. Censorship is also likely to be unenforceable, although many of its harms would remain.