
Communications of the ACM

ACM TechNews

An Algorithm That Grants Freedom, or Takes It Away


[Image: Cells in a prison. Credit: krystiano]

Software is making probation decisions in the U.S. and Europe, predicting whether teens will commit crimes. Opponents want more human oversight.


Local authorities in the U.S. and Europe use predictive governance algorithms to assess people's risk of criminality, and base probation, jail time, and other decisions on such evaluations.

Critics warn that this removes human judgment and transparency from decision-making, and that developers are not legally bound to explain how their programs work, even though gender, class, race, or geographical biases may be embedded in the algorithms.

Predictive algorithms estimate the likelihood of future behavior from historical data, using statistical risk assessment.
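To make the idea concrete, the kind of statistical risk assessment described above can be sketched as a simple logistic model that maps case features to a probability-like score. This is a minimal illustration only: the feature names, weights, and threshold below are invented for demonstration, and real tools used by courts are proprietary and far more complex.

```python
import math

# Hypothetical feature weights for an illustrative risk model.
# These values are invented for demonstration purposes.
WEIGHTS = {
    "prior_offenses": 0.45,
    "age_at_first_offense": -0.05,
    "months_since_last_offense": -0.02,
}
BIAS = -1.0

def risk_score(features: dict) -> float:
    """Combine weighted features into a probability-like score (0..1)."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic (sigmoid) function

def recommend(features: dict, threshold: float = 0.5) -> str:
    """Map the score to a coarse recommendation via a fixed cutoff."""
    return "high risk" if risk_score(features) >= threshold else "low risk"
```

Note how the critics' concern arises directly from this structure: the weights are learned from historical data, so any bias in that data (who was arrested, who was charged) is carried forward into every score the model produces.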

The University of Pennsylvania's Richard Berk, who created algorithms used by Philadelphia's criminal courts to predict recidivism for probation decisions, compared algorithms to automatic pilots. Said Berk, "Automatic pilot is reliable, more reliable than an individual human pilot. The same is going to happen here."

From The New York Times

Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA


 
