The entire bill should be scrapped and set on fire. As it deserves. It's such a brazen abuse of state power.
https://reason.com/2023/06/30/chicago-police-raided-at-least-21-wrong-houses/
"Chicago SWAT teams relying on unverified search warrants to ransack houses; hold families, including children, at gunpoint; and, in one case, handcuff an 8-year-old child. In another case, 17 Chicago police officers burst into a family's house with their guns drawn during a 4-year-old's birthday party."
The UK government's announcement that websites will need to verify or estimate the age of users to stop kids from seeing porn ignores the risks to privacy.
Collecting large pools of biometric data without proper governance structures in place will swap one harm for another.
➡️ Our response: https://www.openrightsgroup.org/press-releases/online-safety-bill-peers-need-to-consider-privacy-risks-of-age-verification/
https://reason.com/2023/06/29/florida-cop-jails-toddler-son-for-poopy-pants/
1) Uhhh.... I'm not sure it's appropriate to use jails for family discipline.
2) Well, yeah, Snapchat shouldn't be held liable for simply providing a "disappearing message" function.
If someone abuses it, as this teacher did by sending a "sexually explicit image" to a student, she is the one responsible for doing so.
One of those "won't anyone please think of the children?" people pointed to a hundred people getting arrested on Discord for apparently being part of child porn rings and said, "See! See! This shows the abuse there is completely out of control."
Completely ignoring that Discord is a platform with tens of millions of people or more.
Sigh.
https://edri.org/our-work/the-eus-internal-market-committee-votes-for-protecting-encryption-in-the-csa-regulation/
"As stipulated by the Court of Justice, IMCO MEPs say that tools must be able to distinguish between lawful and unlawful content without the need for independent human assessment."
This encourages firms to stop doing independent human assessments and to exaggerate the accuracy rates of their algorithms...
Also, any algorithm may fall over at scale, and it's still a violation of privacy.
Noting that sinister language like "grooming" obscures what they're really doing, which is probably looking for anyone behaving inappropriately with minors.
They had one statistic where adults simply talking to minors was treated as a potential grooming event x.x
Instead of saying any of that, though, they decided to self-immolate by making it appear as if they didn't care about child abuse.
https://www.techdirt.com/2023/06/28/social-media-was-useful-for-me-as-an-ill-nerdy-teenager/ Social media restrictions might be harmful to minors with health conditions.
"The question presented is whether the First Amendment still requires proof that the defendant had some subjective understanding of the threatening nature of his statements. We hold that it does, but that a mental state of recklessness is sufficient. The State must show that the defendant consciously disregarded a substantial risk that his communications would be viewed as threatening violence."