"Meta also said it has expanded its list of terms, phrases, and emojis related to child safety and has begun using machine learning to detect connections between different search terms." https://www.theverge.com/2023/12/1/23983955/meta-child-safety-sexual-abuse-explicit-content-eu-senate
@sirius @ProstasiaInc In my experience, they often take action against non-offenders in order to justify continuing to ignore the actual abusers and abusive content
@elliot @sirius @ProstasiaInc But why do they do that?
@hinindil @sirius @ProstasiaInc Because their goal in content moderation is having a positive public image, not keeping people safe. Same reason they'll let Nazis stick around as long as it's not making the news
@elliot @sirius @ProstasiaInc But that's not what content moderation is supposed to be about!
@ProstasiaInc Big tech corporations have profited off child abuse for years, only starting to take effective measures when public pressure mounts. And when they do, they blame pedophiles for everything they themselves failed to do and, in an attempt at delayed virtue signalling, put policies into place that target and discriminate against non-offenders as well as offenders. Doesn't matter if it's Reddit, Twitter, YouTube, or now Meta, it's always the same old script.