People may be more willing to trust a computer program than their fellow humans, especially as a task becomes more difficult, according to researchers at the University of Georgia.
"It seems like there's a bias towards leaning more heavily on algorithms as a task gets harder and that effect is stronger than the bias towards relying on advice from other people," says Eric Bogert, a Ph.D. student and author with Professor Richard Watson and Assistant Professor Aaron Schecter of "Humans Rely More on Algorithms Than Social Influence as a Task Becomes More Difficult," published in the journal Scientific Reports.
The study is part of a larger research program into human-machine collaboration, funded by a $300,000 grant from the U.S. Army Research Office. "The eventual goal is to look at groups of humans and machines making decisions and find how we can get them to trust each other and how that changes their behavior," Schecter says.
From University of Georgia