Access Denied: Faulty Automated Background Checks Freeze Out Renters


Facing algorithmic bias in rental decisions.

Automated reports on potential renters are usually delivered to landlords without a human ever glancing at the results to see if they contain obvious mistakes, according to court records and interviews.

Credit: Andrea Ucini

Burglary and domestic assault in Minnesota. Selling meth and jumping bail in Kentucky. Driving without insurance in Arkansas. Disorderly conduct. Theft. Lying to a police officer. Unspecified "crimes." Too many narcotics charges to count.

That's what the landlord for an apartment in St. Helens, Ore., saw when he ran a background check for Samantha Johnson, a prospective tenant, in 2018.

But none of the charges were hers.

The growing data economy and the rise of American rentership since the 2008 financial crisis have fueled a rapid expansion of the tenant screening industry, now valued at $1 billion. The companies produce cheap and fast—but not necessarily accurate—reports for an estimated nine out of 10 landlords across the country.

The automated background check for Johnson cast a wide net, looking for negative information in criminal databases even in states where she had never lived and pulling in records for women whose middle names, races, and dates of birth didn't match her own. It combined the criminal records of five other women: four Samantha Johnsons and a woman who had used the name as an alias, even though the screening report itself listed that woman as an "active inmate" in a Kentucky jail at the time.

 

From The Markup
View Full Article

 


 
