
Communications of the ACM

ACM TechNews

How Social Bias Creeps Into Web Technology


Illustration credit: The Wall Street Journal

Predictive and decision-making Web technologies can absorb the unconscious social biases of their designers and programmers, reflecting those prejudices in how they function. One example is Google's ad-targeting system, which a recent study found was more likely to show ads for high-paying jobs to male users than to female users. "Computers aren't magically less biased than people, and people don't know their blind spots," says data scientist Vivienne Ming.

Machine-learning software is especially vulnerable to bias, according to Andrew Selbst, co-author of an upcoming study on the phenomenon. Such programs learn from a limited set of training data and then refine their knowledge from real-world data and experience, adopting and often amplifying biases present in either data set. Selbst says the proprietary nature of most software and the complexity of the underlying algorithms compound the difficulty of tracing bias back to its source so it can be corrected.
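To make that mechanism concrete, the sketch below shows how a classifier trained on historically skewed data reproduces the skew. It is a minimal illustration only, not the study's method or Google's system; the feature names, probabilities, and use of scikit-learn's LogisticRegression are all assumptions made for the example.

```python
# Illustrative sketch only: shows how a model trained on skewed historical
# decisions learns to reproduce them. All names and numbers are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical training data: a gender flag (0 = female, 1 = male), one
# neutral covariate, and historical ad-serving decisions that favored men.
gender = rng.integers(0, 2, size=n)
experience = rng.normal(10, 3, size=n)
shown_high_pay_ad = rng.random(n) < np.where(gender == 1, 0.6, 0.3)

X = np.column_stack([gender, experience])
model = LogisticRegression().fit(X, shown_high_pay_ad)

# The model absorbs the historical disparity: otherwise identical users get
# different predicted probabilities depending only on the gender feature.
p_male = model.predict_proba([[1, 10.0]])[0, 1]
p_female = model.predict_proba([[0, 10.0]])[0, 1]
print(f"P(high-pay ad | male)   = {p_male:.2f}")
print(f"P(high-pay ad | female) = {p_female:.2f}")
```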

Data scientists suggest that software bias can be minimized by building more diversity into statistical models. Meanwhile, Selbst and other researchers are leading efforts to localize and mitigate the underlying causes of software bias.
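One possible reading of "building more diversity into statistical models" is rebalancing the training data so underrepresented groups carry equal weight before a model is fit. The sketch below is an assumption-laden illustration of that idea, not the researchers' specific technique; the synthetic data and group labels are invented for the example.

```python
# Illustrative only: oversample an underrepresented group so both groups
# contribute equally to training. Data and labels are synthetic assumptions.
import numpy as np
from sklearn.utils import resample

rng = np.random.default_rng(1)

# Synthetic dataset in which group B is badly underrepresented.
group = np.array(["A"] * 900 + ["B"] * 100)
features = rng.normal(size=(1000, 3))

idx_a = np.where(group == "A")[0]
idx_b = np.where(group == "B")[0]
idx_b_up = resample(idx_b, replace=True, n_samples=len(idx_a), random_state=1)

balanced_idx = np.concatenate([idx_a, idx_b_up])
X_balanced = features[balanced_idx]

# Both groups now appear 900 times in the balanced training set.
print(np.unique(group[balanced_idx], return_counts=True))
```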

From The Wall Street Journal

Abstracts Copyright © 2015 Information Inc., Bethesda, Maryland, USA

