I am scared the new #Apple #CSAM detection will produce false positives for people taking pictures of children to send to their doctor, because they got an injury at school, they have a medical condition, or any other reason that is not a sign of guilt on the patient's part. I also fear it will discourage people from photographing other parents' children as evidence when they suspect abuse, for fear of retaliation.
Are there more black children in #CSAM databases? If so, that could be bad news for black families. By training on existing data that already reflects racist bias, AI detection of CSAM could perpetuate systems of oppression.