New from 404 Media: FBI arrests man for allegedly using Stable Diffusion, a text-to-image AI model, to create thousands of images of minors. One of the first known instances of the FBI investigating someone for using AI to create child sexual abuse material 404media.co/fbi-arrests-man-fo

@josephcox From the article: “Today’s announcement sends a clear message: using AI to produce sexually explicit depictions of children is illegal..."

But... is it? Distributing such images, or using them to lure real kids, is certainly illegal. But is drawing your own CSAM illegal? There seems to be a world of, e.g., snuff art out there that depicts illegal acts, yet the drawings themselves aren't illegal. How many people get murdered in comics or games? Is that art illegal? Should it be?


@KSargent @josephcox journals.sagepub.com/doi/10.11 The vast majority of sexual abusers did not use porn for that purpose, it seems. And the same probably applies to most people who consume porn.

I won't speak for this particular technology, but the bigger problem is that *censorship itself* can be harmful to a lot of people, compared to a theoretical (or even a small concrete) risk. It's also likely that abusers could just substitute something else to commit their crimes. And if someone does commit a crime (and as you noted, those acts are crimes), they can be punished for it. Censorship is also likely to be unenforceable, while many of its harms would remain.
