I have concerns with this proposed bill, which tampers with the "child porn" definition. It seems to be yet another bad "deepfake" bill idea, this time in the name of "saving the children" (which is always a red flag).
First off, this appears to be one of those narrow-minded tunnel-vision bills where someone thinks of *one particular thing* but doesn't think of all the bad ways it could be applied. For instance, it uses a "reasonable person would regard it to be" test. An over-zealous prosecutor might squint at something which is quite unlikely to meet that bar (e.g. a more realistic art style) and argue that it does. The only limiting term is "computer-generated", but that doesn't even have to imply the use of "AI" at all, does it?
It is also unconstitutional, and the underlying problem could probably be dealt with in better ways, for instance, in the narrower form of a "sexual harassment" offence (which is probably what someone is thinking of here). That wouldn't involve inconvenient court battles or human rights violations. For the most part, I don't think people are really lining up to be evil for the sake of being evil, and I don't think "War on Drugs" type ideas are proportionate or effective.
So, I think this is a bad bill, and legislators should not advance it.
"The state originally pushed back against Smith's request, arguing that they did not have proper facilities and procedures to kill Smith through the experimental method. But the Supreme Court disagreed, denying cert to the state's attempt to overturn an earlier ruling allowing Smith to choose execution by nitrogen hypoxia."
"In an apparent attempt to save his life, Smith's lawyers have pivoted in recent months to instead argue that nitrogen hypoxia would lead to a tortuous death for Smith and that the experimental nature of the execution meant that the state could not guarantee a smooth execution."
This is a really strange case. Still, the state shouldn't have been engaging in this sort of torture (and it's questionable why it feels such urgency to put someone to death).
From now on, we will refer to "Wisconsin" not as "Wisconsin" but as "Child Rape Loving State Wisconsin", because they love to promote more child rape in the name of being "tough on crime".
Anytime we refer to this state in the future, we will use this name. And we will continue to shame them until they repeal any and all unconstitutional laws.
This post will be cited each time.
@klausfiend @lauren https://qoto.org/@olives/111820823469468272 I had some thoughts on that here. Random thoughts though.
@ProstasiaInc I suspect a lot of it is really just ableism, i.e. discriminating against someone on account of their mental disability.
Hyper takes, bad random takes, poor wording, jokes. Strange language or behavior getting cast as "devious". I could go on. Also, maybe trolling or messing around.
No time to get into it though.
@ProstasiaInc I'm sure there'll be a moral panic in there somewhere, and it'll probably have everything to do with online predators. However, I imagine the bigger issue is going to be things like employment and, I suppose, bullying in this case.
Since they want to grandstand in a manner which is clearly harmful to human rights, I am going to dismantle the very point they're vainly trying to grandstand on.
@freemo I think it might come off as being told off.
"In 2021, Canadian cybersecurity firm eQualitie launched a petition to have the 2024 forum in Montreal. Dozens of tech companies and civil society organizations from Canada and around the world signed on to the petition, but the Canadian government appears to have ignored the request. A spokesperson for Canadian foreign affairs minister Mélanie Joly did not return a request for comment."
#Canada seems like an alright choice for the #IGF. Granted, a few recent proposals from politicians do seem like splinternet material (that's not good), but Canada also doesn't have Saudi Arabia's human rights record.
Though, I'm sure these aren't the only two possible options.
It's an interesting article, although I'm not sure I like that headline.
https://www.wired.com/story/united-nations-igf-saudi-arabia-russia/
"THE UNITED NATIONS’ main internet governance body will host its next international forum in Riyadh, Saudi Arabia. In 2025, the UN may take its discussions on the future of an open internet to Russia. Holding the Internet Governance Forum (#IGF), back to back, in authoritarian countries notorious for their surveillance and #censorship of the internet risks making “a joke of the whole system,” one advocate says."
When thinking of which country I want to host a human rights conference, the first country which comes to mind is Saudi Arabia. You just can't find a country which cares more about human rights than them. #IGF
Sarcasm, obviously.
https://spcommreports.ohchr.org/TMResultsBase/DownLoadPublicCommunicationFile?gId=27148
Even someone from the U.N. (who have shown themselves to be useless when it comes to digital rights) wrote to the U.K. to tell them the OSB violates human rights.
https://www.ohchr.org/en/special-procedures/sr-freedom-of-opinion-and-expression/comments-legislation-and-policy The U.K. didn't even bother to respond.
Honestly though, these bills are themselves a violation of human rights, and compared to bills which kill or oppress people, I think these could have an equivalent effect, yes.
All these people think about is some nebulous notion of "safety" pushed by some morally bankrupt lobbyist (probably some "white knight"), not the vulnerable people who have to rely on the Internet (and get hit in the name of "safetyism").
@charliejane I suspect he knows exactly what he is doing. He just doesn't care.
He keeps introducing bills which are clearly harmful and unconstitutional, and he has had a long, long time to learn. Even when he was Attorney General, he seemed to act inconsistently with the Constitution.
https://www.wired.com/story/parabon-nanolabs-dna-face-models-police-facial-recognition/
"Leaked records reveal what appears to be the first known instance of a police department attempting to use facial recognition on a face generated from crime-scene DNA. It likely won’t be the last."
"Parabon’s methods have not been peer-reviewed, and scientists are skeptical about how feasible predicting face shape even is."
"“Daisy chaining unreliable or imprecise black-box tools together is simply going to produce unreliable results,” she says."
"In a controversial 2017 decision, the department published the predicted face in an attempt to solicit tips from the public. Then, in 2020, one of the detectives did something civil liberties experts say is even more problematic—and a violation of Parabon NanoLabs’ terms of service: He asked to have the rendering run through facial recognition software."
"For facial recognition experts and privacy advocates, the East Bay detective’s request, while dystopian, was also entirely predictable. It emphasizes the ways that, without oversight, law enforcement is able to mix and match technologies in unintended ways, using untested algorithms to single out suspects based on unknowable criteria."
"“It’s really just junk science to consider something like this,” Jennifer Lynch, general counsel at civil liberties nonprofit the Electronic Frontier Foundation, tells WIRED. Running facial recognition with unreliable inputs, like an algorithmically generated face, is more likely to misidentify a suspect than provide law enforcement with a useful lead, she argues. “There’s no real evidence that Parabon can accurately produce a face in the first place,” Lynch says. “It’s very dangerous, because it puts people at risk of being a suspect for a crime they didn’t commit.”"
Software Engineer. Psy / Tech / Sex Science Enthusiast. Controversial?
Free Expression. Human rights / Civil Liberties. Anime. Liberal.