New from 404 Media: an 'AI girlfriend' site was hacked, revealing the prompts people use to get the AI to fulfill sexual fantasies. All linked to their email address
- most concerning is explicit discussion of child abuse
- other prompts are kinks etc
404media.co/hacked-ai-girlfrie

@josephcox Kinkshaming is wrong, Joseph. Who really cares what someone does with a bot?

The rationale for calling it "child abuse material" (as I've covered before, the phrase is actually an Australian coinage) is that it is supposed to refer to material in which someone is actually being abused.

Applying the term to everything dilutes it (a tactic a few far-right extremists with an agenda use to confuse people). That is just kinkshaming. We have seen you reference an associate of a far-right grifter before.

qoto.org/@olives/1132046171305
Taboo fantasies are also fairly common, and the research does not suggest that engaging in a fantasy is harmful when there is no victim.

The privacy / security issue is concerning though.
