
Communications of the ACM

ACM TechNews

Field-Data Study Finds No Evidence of Racial Bias in Predictive Policing



A randomized controlled trial of predictive policing was conducted in three divisions of the LAPD.

Credit: LinkedIn

Indiana University-Purdue University Indianapolis (IUPUI) research suggests predictive policing does not lead police to make discriminatory arrests. George Mohler, associate professor of computer and information science in the School of Science at IUPUI, worked with researchers at the University of California-Los Angeles and Louisiana State University to conduct the study in conjunction with the Los Angeles Police Department.

Each day, a human analyst and an algorithm each used real-time field data to predict where officers should patrol. A random draw determined which set of predictions officers followed that day, and the researchers measured arrest rates across ethnic groups under each condition. "When we looked at the data, the differences in arrest rates by ethnic group between predictive policing and standard patrol practices were not statistically significant," Mohler says.
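The study's core comparison can be illustrated with a standard two-proportion z-test: given arrest counts for a group under algorithm-guided versus analyst-guided patrols, test whether the difference in rates is statistically significant. This is a minimal sketch with hypothetical counts, not the study's actual data or exact methodology:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided two-proportion z-test.

    Returns the z statistic and p-value for the difference
    between the rates successes_a/n_a and successes_b/n_b.
    """
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled proportion under the null hypothesis of equal rates.
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the standard normal CDF (using erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical illustration: arrests of one ethnic group out of total
# arrests under algorithm-guided vs. analyst-guided patrol days.
z, p = two_proportion_z(52, 200, 48, 200)
print(f"z = {z:.3f}, p = {p:.3f}")
```

With counts like these, the p-value well exceeds 0.05, i.e., no statistically significant difference between conditions, which is the kind of null result the study reports.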

However, predictive policing is a nascent field, and Mohler says police departments should continue to monitor the ethnic impact of such algorithms to check for racial bias.

From Indiana University

Abstracts Copyright © 2018 Information Inc., Bethesda, Maryland, USA


 
