Transcript
00:00 Federal judge flags AI use in police reports.
00:04 A federal judge is raising alarms over the growing use of artificial intelligence in law enforcement paperwork,
00:10 warning that the technology could lead to inaccuracies and undermine trust in police reporting.
00:16 According to the Associated Press, U.S. District Judge Sara Ellis highlighted the issue in a brief but notable footnote within a 223-page opinion released last week.
00:25 Ellis noted that at least one immigration agent reportedly asked ChatGPT to generate a use-of-force report after submitting only a short description and several photos.
00:36 Such a shortcut, she suggested, may explain inaccuracies in these reports.
00:41 She also flagged inconsistencies between body camera footage and the official narrative.
00:46 Experts say the practice is deeply concerning.
00:49 Ian Adams, an assistant criminology professor at the University of South Carolina and a member of the Council on Criminal Justice AI Task Force,
00:57 said the reported approach was the worst of all worlds.
01:01 Giving it a single sentence and a few pictures, if that's true, goes against every bit of advice we have out there.
01:09 It's a nightmare scenario, Adams said.
01:11 Very few police agencies have established clear policies governing AI use, especially for critical documents like use-of-force reports.
01:20 Adams emphasized that courts rely heavily on an officer's first-hand account when determining whether their actions were justified.
01:28 We need the specific articulated events and the specific thoughts of that officer to know if the use of force was reasonable, Adams said.
01:36 Privacy issues further complicate matters.
01:38 Katie Kinsey, chief of staff at the Policing Project at NYU School of Law, warned that uploading images or sensitive details into public AI systems could inadvertently expose data to misuse.
01:51 Agencies, she said, should first understand the risks and establish guardrails before adopting such tools.
01:57 Some companies, including Axon, now offer AI-assisted tools integrated with body cameras.
02:02 However, these systems generally focus on audio transcription rather than visual interpretation, a task experts say current AI still struggles to perform accurately.
02:14 Andrew Guthrie Ferguson, a law professor at George Washington University, cautioned that AI-generated narratives risk misrepresenting events entirely.
02:22 It's about what the model thinks should have happened, not what actually happened, Ferguson said.
02:27 You don't want that showing up in court to justify your actions.
02:31 As law enforcement increasingly considers AI adoption, experts stress the need for strict policies, transparency, and well-defined safeguards to protect both officers and the public.