Allowing police officers to submit LLM-written reports reveals a remarkable misunderstanding of what LLMs do, a profound indifference to the integrity of law enforcement's communications with the justice system, or both.

Given how suggestible human witnesses, including police officers, are known to be, this is a disaster.

Yes, police reports aren't always the most accurate, but introducing an additional layer of unaccountability only makes things worse.

apnews.com/article/ai-writes-p

It's a terrifying development.

LLMs are literally designed to generate *plausible-sounding* *bullshit*.

They have no accountability and even less allegiance to truth than crooked cops—but they will be much, much better at writing the kinds of falsehoods that will bring a conviction.

@ct_bergstrom It's bad enough that juries can be led to the easiest conclusion, the one that doesn't require setting aside preconceptions and really thinking about it. LLMs do that by design; they actually can't do anything else.
