A #Pornhub #Chatbot Stopped Millions From Searching for Child Abuse Videos - https://www.wired.com/story/pornhub-chatbot-csam-help/ interesting approach - could also be used to warn young people when they share material... #csam
@glynmoody On review, many of these do not appear to be CSAM keywords, but for content which is quite legal in the United States.
Another problem with this methodology is that a few people appeared to be using Pornhub as a "general purpose video hosting service" (or at least they appeared to be at some point prior to 2020), so overly inclusive keywords might flag searches that do not even correspond to sexual content.
You've been had. Again.
@olives an approach worth exploring, I think
@glynmoody I think messages like these can come off as overly accusatory, particularly when no actual abuse is involved.
They might not be useless, although that is one issue they tend to run into. Also, labeling all of these as "child abuse" searches is inaccurate and ripe for sensationalism.
@glynmoody In theory, something like that might be useful, although I have yet to see it executed well, so I guess it is another "it depends on the execution" for me.
@glynmoody Hmm... From what I've heard, mental health services in the U.K. seem to have been doing badly since Brexit and Covid-19. Good general mental health would probably help in a number of ways when it comes to abuse.
@olives certainly