Allowing police officers to submit LLM-written reports reveals a remarkable misunderstanding of what LLMs do, a profound indifference to the notion of integrity in the communications of law enforcement with the justice system, or both.

Given how susceptible to suggestion human witnesses, including police officers, are known to be, this is a disaster.

Yes, police reports aren't always the most accurate, but introducing an additional layer of non-accountability only makes a bad situation worse.

apnews.com/article/ai-writes-p

@ct_bergstrom The problem of computer-generated evidence has been terrible, and AI just makes a bad thing worse. But the fundamental problem is that police and prosecutors are not accountable for accuracy or honesty.


@quinn @ct_bergstrom Sounds like we need a landmark decision that throws out a case because of computer-generated evidence or testimony. It might slap someone awake.

@pieist @ct_bergstrom Part of the problem is that judges overwhelmingly defer to technology as an expert, so that's hard.

@quinn @pieist @ct_bergstrom
We should have an AI produce summaries of court cases and let judges see for themselves just how unreliable those summaries are, especially when compared to the official record written up by the stenographer.

@jargoggles @quinn @pieist @ct_bergstrom A good trial lawyer will do exactly that. There is also the Daubert standard in the US for expert witnesses.

@jargoggles @quinn @pieist @ct_bergstrom

Several lawyers have submitted legal briefs written by AI; the results have not impressed the judges.

@shadowpages @quinn @pieist @ct_bergstrom
It probably would have helped if the court cases the LLM cited actually existed.
