That said, one hardly needs a philosopher, vague concepts, or roundtables to see that this sounds like an awfully bad idea.
Building robust and reliable systems has been an aim of the field for decades, and a system that fails inexplicably at its tasks is precisely the kind of thing that should be ruled out.