A new AI system used by the Met Police to analyse existing officer data has raised concerns about privacy, trust and workplace monitoring. Staff groups say it could affect morale, while the force says it is improving standards and accountability.
Transcript
00:00 The Metropolitan Police is facing criticism over a pilot AI scheme developed by US firm
00:06 Palantir, used to organize existing data from officers' devices and flag potential misconduct.
00:12 Staff representatives have described the tool as intrusive, warning it could affect trust
00:17 and morale.
00:18 The Met says the system uses lawfully held information to identify risks and support
00:23 higher professional standards, following concerns about policing behavior and culture.
00:28 Within the first week, the tool reportedly flagged hundreds of potential breaches, including
00:34 allegations leading to arrests and suspensions, while police staff groups consider possible
00:39 legal action over privacy and data protection concerns.