A study by researchers at the U.S. National Institute of Standards and Technology (NIST) found that most commercial facial recognition algorithms are biased, falsely identifying African-American and Asian faces 10 to 100 times more frequently than Caucasian faces.
Error rates were highest for Native Americans, and women were falsely identified more often than men.
The researchers tested 189 algorithms from 99 developers on a law enforcement dataset containing over 18 million photos of 8.5 million people.
The facial-matching algorithms used in law enforcement had the highest error rates for African-American women.
NIST scientist Patrick Grother said he hopes the findings will encourage developers to recognize such bias and work to remedy it.
Carnegie Mellon University's Maria De-Arteaga said, "We have to think about whether we really want these technologies in our society."
From The New York Times
Abstracts Copyright © 2019 SmithBucklin, Washington, DC, USA