
Communications of the ACM

ACM TechNews

Algorithms to Make Virginia Judges Fairer Have Unintended Consequences


Illustration: capital dome and magnifying glass.

George Mason University's Megan Stevenson and Texas A&M University's Jennifer Doleac explored the unintended consequences of human-algorithm partnerships, analyzing algorithms that assign risk scores to defendants with the goal of reducing Virginia's prison population.

Judges were to use the scores to identify defendants least likely to reoffend, but the scores did not change incarceration rates, sentence lengths, or recidivism; defendants the algorithm labeled high-risk received longer sentences than they otherwise would have, while those labeled lower-risk received shorter ones. Stevenson said the risk score largely reflects age, with younger lawbreakers assigned higher scores than repeat adult offenders.

Younger, black defendants scored by the risk assessment tended to receive harsher sentences, and sex offenders went to jail less often and received shorter sentences overall, even though that algorithm was designed only to authorize longer-than-baseline sentences. The researchers think that in such cases the algorithm may shield judges from backlash if an offender commits a new crime.

From The Washington Post
View Full Article – May Require Paid Subscription


Abstracts Copyright © 2019 SmithBucklin, Washington, DC, USA


 
