In many countries, police use software that supposedly helps prevent crimes before they’re committed. Proponents say this makes cities safer. Critics say it leads to increased discrimination. How does it work?

Transcript
00:00 At the heart of predictive policing lies a big promise: to stop crime before it is even
00:06 committed.
00:07 This is done by combing through masses of data.
00:10 Some programs focus on potential crime scenes.
00:13 If they find links between previous incidents, say that they all occurred at the same time
00:17 and place, or even during the same weather conditions, they recommend that police zero
00:21 in on those locations.
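
To make the place-based approach concrete, here is a minimal Python sketch of the grid-and-count idea described above. It is an illustration only, not any vendor's actual algorithm; the grid size, time bucketing, threshold, and toy coordinates are all assumptions.

```python
# A minimal sketch of place-based ("hot spot") prediction: snap past
# incidents to a coarse spatial grid, bucket them by hour, and flag
# buckets whose counts cross a threshold. Real systems use far more
# elaborate statistical models; everything here is hypothetical.
from collections import Counter

# Each past incident: (latitude, longitude, hour_of_day) -- toy data.
incidents = [
    (52.5201, 13.4050, 22),
    (52.5203, 13.4049, 23),
    (52.5199, 13.4051, 22),
    (52.5300, 13.4200, 10),
]

def grid_cell(lat, lon, cell_size=0.001):
    """Snap coordinates to a coarse grid cell."""
    return (round(lat / cell_size), round(lon / cell_size))

# Count incidents per (cell, hour) bucket.
counts = Counter((grid_cell(lat, lon), hour) for lat, lon, hour in incidents)

# Flag buckets whose count reaches a threshold as patrol recommendations.
THRESHOLD = 2
hot_spots = [(cell, hour) for (cell, hour), n in counts.items() if n >= THRESHOLD]
print(hot_spots)  # e.g. [((52520, 13405), 22)]
```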
00:25 Other programs predict what they consider to be potential offenders.
00:29 They scan the criminal and personal histories of individuals for so-called risk factors,
00:34 like previous arrests, or even dropping out of school, and come up with a list of who
00:38 is likely to break the law.
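
The person-based approach can likewise be sketched as a weighted checklist over so-called risk factors. The factors, weights, and toy records below are hypothetical illustrations, not those of any real program.

```python
# A minimal sketch of person-based risk scoring: a hand-weighted sum over
# "risk factors", producing a ranked list of supposed potential offenders.
# All factors and weights are invented for illustration.
WEIGHTS = {
    "previous_arrests": 3.0,        # each prior arrest adds to the score
    "dropped_out_of_school": 2.0,
    "known_associates_flagged": 1.5,
}

people = [
    {"name": "A", "previous_arrests": 2, "dropped_out_of_school": 1},
    {"name": "B", "known_associates_flagged": 1},
]

def risk_score(person):
    """Weighted sum of whichever risk factors appear in the record."""
    return sum(WEIGHTS[f] * person.get(f, 0) for f in WEIGHTS)

# Rank individuals from highest to lowest predicted risk.
for p in sorted(people, key=risk_score, reverse=True):
    print(p["name"], risk_score(p))  # A 8.0, then B 1.5
```

Note that the score is only as meaningful as the recorded factors feeding it, which is exactly the problem the critics raise below.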
00:45 Law enforcement agencies deploy these programs in over 50 countries, in both Western democracies
00:51 and authoritarian states.
00:58 Proponents of predictive policing say it makes cities safer, but the approach has caused
01:03 outrage among human rights advocates.
01:07 They warn that it increases discrimination.
01:11 The programs, for instance, often flag low-income communities or minority neighborhoods as alleged
01:17 hot spots, prompting the police to patrol those areas more than others.
01:23 This in turn generates even more data and sets off a vicious cycle of discrimination,
01:28 flagging these areas over and over again.
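
This vicious cycle can be simulated in a few lines. In the sketch below (all rates hypothetical), two neighborhoods have the same true crime rate, but police only record what they observe, and they patrol wherever the recorded counts are already highest.

```python
# A minimal sketch of the feedback loop: patrols go where the data points,
# patrols record more incidents, and the new data reinforces the same flag.
# Crucially, the true underlying crime rate is identical in both areas.
import random

random.seed(0)
TRUE_CRIME_RATE = 0.3            # same in both neighborhoods
recorded = {"A": 5, "B": 4}      # historical counts; A starts slightly higher

for step in range(10):
    # Patrol the neighborhood with the most recorded incidents.
    patrolled = max(recorded, key=recorded.get)
    for hood in recorded:
        # Police mostly record crimes where they are present to observe them:
        # detection is far more likely in the patrolled neighborhood.
        detection = 0.9 if hood == patrolled else 0.1
        if random.random() < TRUE_CRIME_RATE * detection:
            recorded[hood] += 1

print(recorded)  # A pulls further ahead, despite equal true crime rates
```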
01:32 Similarly, the programs tend to single out low-income people and minorities as potential
01:37 offenders.
01:38 Existing police data, whether in New Delhi, whether in the US, whether in Germany, whether
01:44 in Australia, embed a lot of historical discrimination.
01:48 They do embed systemic problems such as racism or casteism or even sexism.
01:54 Now we're taking all of this data that is seemingly objective and correct, but is in
02:00 reality quite biased and discriminatory, especially towards minority populations.
02:05 We take that as being the ground truth on which the computer learns what patterns exist.
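
The "ground truth" problem can be shown with synthetic data: if one group was historically policed more heavily, a model fitted to arrest labels will assign that group a higher learned risk even when the true underlying rates are equal. The numbers below are invented for illustration.

```python
# A minimal sketch of biased labels becoming "ground truth": both groups
# offend at the same true rate, but group "X" was policed more heavily,
# so its members were arrested (labeled) more often. Synthetic data only.
records = (
    [{"group": "X", "arrested": 1}] * 30 + [{"group": "X", "arrested": 0}] * 70 +
    [{"group": "Y", "arrested": 1}] * 10 + [{"group": "Y", "arrested": 0}] * 90
)

# "Learning" here is just estimating P(arrested | group) from the labels.
def learned_risk(group):
    rows = [r for r in records if r["group"] == group]
    return sum(r["arrested"] for r in rows) / len(rows)

print(learned_risk("X"), learned_risk("Y"))  # 0.3 vs 0.1: the bias, learned
```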
02:12 Public awareness of the risks of predictive policing and its biases against some social groups
02:16 is growing.
02:18 In the US, amid a debate over systemic racism and how to reform policing, some police departments
02:25 stopped using facial recognition and predictive policing.
02:29 But in other countries, the technology is still in demand.
02:33 In countries that don't have strong data protection legislation or where governments are more
02:38 open to experimenting with technologies on people, we see that predictive policing is on the rise.