Every #tool is dangerous, and the more powerful the tool, the more dangerous it is. Of course. Is #AI as dangerous as #nuclear #weapons? Probably not. It might be in the same league as, oh, say, #internal #combustion #engines—and those have done a hell of a lot of damage. But they haven't done it by ushering in the #apocalypse. Instead, the damage comes from slow, creeping, cumulative change, where the effect of any single event is too small to measure.
So I really think the focus on world-ending scenarios distracts from the conversations we need to be having. This reminds me a lot of the simmering "how far is too far" #genetics debate, especially the kibitzing from "#ethicists" with no understanding of the #biology and an #ethical sense that isn't nearly as developed as they think it is. There are conversations on that topic I'd like to have without the constant Greek chorus of "#Frankenstein! #Gattaca! #JurassicPark!"