
Communications of the ACM

ACM TechNews

Many Facial Recognition Systems Are Biased, Says U.S. Study



Credit: teguhjatipras/Pixabay

A study by researchers at the U.S. National Institute of Standards and Technology (NIST) found that most commercial facial recognition algorithms are biased, falsely identifying African-American and Asian faces 10 to 100 times more often than Caucasian faces.

Error rates were highest for Native Americans, and the algorithms misidentified women more often than men.

The researchers tested 189 algorithms from 99 developers on a law enforcement dataset containing over 18 million photos of 8.5 million people.

The facial-matching algorithms used in law enforcement had the highest error rates for African-American women.

NIST scientist Patrick Grother said he hopes these findings will encourage developers to consider ways to recognize such bias and work to remedy it.

Carnegie Mellon University's Maria De-Arteaga said, "We have to think about whether we really want these technologies in our society."

From The New York Times

Abstracts Copyright © 2019 SmithBucklin, Washington, DC, USA


 
