I would like to see a law that makes companies responsible for any errors their AI makes. Since their key appeal is passing accountability off into a black hole, I think that accountability ought to transfer to them instead.

If they had to manage that risk I think we would see more responsible use.

arstechnica.com/science/2023/1


@Haste Wouldn't they already be held accountable for that? If a company's software makes an error, they are accountable for it.
