Allowing police officers to submit LLM-written reports reveals a remarkable misunderstanding of what LLMs do, a profound indifference to the notion of integrity in the communications of law enforcement with the justice system, or both.
Given how suggestible human witnesses, including police officers, are known to be, this is a disaster.
Yes, police reports aren't always the most accurate, but adding another layer of non-accountability makes things worse.
@quinn @ct_bergstrom Sounds like we need a landmark decision that throws out a case because of computer-generated evidence or testimony. It might slap someone awake.
@quinn @pieist @ct_bergstrom
We should have an AI produce summaries of court cases so judges can see for themselves just how unreliable they are, especially compared against the official record written up by the steno.
@jargoggles @quinn @pieist @ct_bergstrom A good trial lawyer will do exactly that. Also there is the Daubert standard in the US for expert witnesses.
@jargoggles @quinn @pieist @ct_bergstrom
Several lawyers have submitted legal briefs written by AI - the results have not impressed the judges.
@shadowpages @quinn @pieist @ct_bergstrom
It probably would have helped if the court cases the LLM cited actually existed.
@jargoggles @quinn @pieist @ct_bergstrom Just a bit ...
@pieist @ct_bergstrom part of the problem is that judges overwhelmingly defer to technology as an expert. so that's hard.