
Communications of the ACM

ACM News

Untold History of AI: Algorithmic Bias Was Born in the 1980s


St. George's Hospital Medical School in London, where an algorithm to screen student applications for admission was found to be biased.

A medical school in the U.K. thought a computer program would make the admissions process fairer, but it did just the opposite.

Credit: Matthew Fearn/PA Images/Getty Images

The history of AI is often told as the story of machines getting smarter over time. What's lost is the human element of the narrative: how intelligent machines are designed, trained, and powered by human minds and bodies.

In this six-part series, we explore that human history of AI—how innovators, thinkers, workers, and sometimes hucksters have created algorithms that can replicate human thought and behavior (or at least appear to). While it can be exciting to be swept up by the idea of superintelligent computers that have no need for human input, the true history of smart machines shows that our AI is only as good as we are.

Part 5: Franglen's Admissions Algorithm

In the 1970s, Dr. Geoffrey Franglen of St. George's Hospital Medical School in London began writing an algorithm to screen student applications for admission.

At the time, three-quarters of St. George's 2,500 annual applicants were rejected by the academic assessors on the basis of their written applications alone, and never reached the interview stage. About 70 percent of those who did make it past the initial screening were ultimately offered places at the medical school. So that initial "weeding out" round was crucial.

Franglen was the vice-dean of St. George's and himself an admissions assessor. Reading through the applications was a time-consuming task that he felt could be automated. He studied the processes by which he and his colleagues screened students, then wrote a program that, in his words, "mimic[ked] the behavior of the human assessors."

While Franglen's main motivation was to make admissions processes more efficient, he also hoped that it would remove inconsistencies in the way the admissions staff carried out their duties. The idea was that by ceding agency to a technical system, all student applicants would be subject to precisely the same evaluation, thus creating a fairer process.

In fact, it proved to be just the opposite.
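The mechanism behind that reversal is easy to sketch. The following Python fragment is a hypothetical illustration, not Franglen's actual program: all names, fields, and weights are invented. It shows how a screening rule built to replicate past human decisions turns any bias in those decisions into explicit, consistently applied penalties.

```python
# Hypothetical sketch (not the actual St. George's program): a rule
# fit to mimic historical assessors reproduces their biases exactly,
# and applies them uniformly to every applicant.

def assessor_score(applicant):
    """Score a written application the way past assessors did."""
    score = applicant["exam_points"]
    # Invented weights for illustration. If past assessors tended to
    # mark down women or applicants with non-European-looking names,
    # a program that mimics them encodes the same penalties as rules.
    if applicant["sex"] == "F":
        score -= 3
    if not applicant["name_looks_european"]:
        score -= 15
    return score

def shortlist(applicants, interview_slots):
    """Keep only the top-scoring applicants for interview."""
    ranked = sorted(applicants, key=assessor_score, reverse=True)
    return ranked[:interview_slots]

applicants = [
    {"id": "A", "exam_points": 60, "sex": "M", "name_looks_european": True},
    {"id": "B", "exam_points": 70, "sex": "F", "name_looks_european": False},
]
# Applicant B has stronger exam results (70 vs. 60) but scores
# 70 - 3 - 15 = 52, so applicant A fills the single interview slot.
print(shortlist(applicants, 1)[0]["id"])
```

The point of the sketch is that "precisely the same evaluation" is not the same as a fair one: automation made the assessors' habits consistent, including the discriminatory ones.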

 

From IEEE Spectrum
View Full Article

 


 
