This bodes well!
February 16: "A very strange conversation with the chatbot built into Microsoft’s search engine led to it declaring its love for me."
March 13: "Microsoft laid off its entire ethics and society team within the artificial intelligence organization as part of recent layoffs that affected 10,000 employees across the company."
https://www.nytimes.com/2023/02/16/technology/bing-chatbot-microsoft-chatgpt.html
https://www.platformer.news/p/microsoft-just-laid-off-one-of-its
@maxkennerly The fact that the NYT thinks language models are capable of declaring love is just sloppy reporting from people who don't understand what they're interacting with.
@LouisIngenthron This is true, but it's also emblematic of a core ethical problem: most people don't understand what language models do. They're thus prone to, e.g., inferring cognition and believing the hallucinations are factual.
@maxkennerly Yeah, and when TV/Radio came out, there was a subset of the population that believed whatever came out of those boxes was true. 🤷♂️
Familiarity will breed contempt.
@cuibonobaby @maxkennerly Some people would rather be lied to than adjust their worldview, and nobody can fix that. All you can do is wait for them to die out.
But the people who don't want to be lied to will get wise to it, yes.
@LouisIngenthron
A willingness to be lied to isn't a failure limited to any particular generation, I'm afraid.
@LouisIngenthron we live in hope.