Artificial intelligence warning over human extinction labelled ‘publicity stunt’
By Jordan Reynolds
Professor Sandra Wachter said the risk raised in the letter that AI could wipe out humanity is ‘science fiction fantasy’.
The probability of a “Terminator scenario” caused by artificial intelligence is “close to zero”, a University of Oxford professor has said.
Sandra Wachter, professor of technology and regulation, called a letter released by the San Francisco-based Centre for AI Safety – which warned that the technology could wipe out humanity – a “publicity stunt”.
Professor Wachter said the risk raised in the letter is “science fiction fantasy” and she compared it to the film The Terminator.
She added: “There are risks, there are serious risks, but it’s not the risks that are getting all of the attention at the moment.
“What we see with this new open letter is a science fiction fantasy that distracts from the issue right here right now. The issues around bias, discrimination and the environmental impact.
“The whole discourse is being put on something that may or may not happen in a couple of hundred years. You can’t do something meaningful about it as it’s so far in the future.
“But bias and discrimination I can measure, I can measure the environmental impact. It takes 360,000 gallons of water daily to cool a middle-sized data centre, that’s the price that we have to pay.
“It’s a publicity stunt. It will attract funding.