
Communications of the ACM

ACM TechNews

Should Your Driverless Car Hit a Pedestrian to Save Your Life?


A pedestrian crosses in front of a vehicle at Mcity, a test site for driverless vehicles on the University of Michigan campus in Ann Arbor.

A new research study indicates that what people really want to ride in is an autonomous vehicle that puts its passengers first.

Credit: Paul Sancya/Associated Press

Most people believe self-driving vehicles should ultimately put their passengers' lives first, according to a new study, a finding that poses an ethical dilemma for developers of autonomous cars, who must encode moral decisions in a machine.

The study polled U.S. residents last year on how they thought autonomous vehicles should behave. Although respondents generally felt such cars should make decisions for the greatest good, when presented with scenarios in which they had to choose between saving themselves or saving pedestrians, they chose to save themselves.

"One missing component has been the empirical component: what do people actually want?" says Massachusetts Institute of Technology researcher Iyad Rahwan.

One of the six polls conducted found respondents generally hesitant to accept government regulation of artificial intelligence algorithms, even if it could address the driver-versus-pedestrian dilemma.

The researchers say the study could present a legal and philosophical morass for autonomous-vehicle makers, especially concerning the issue of accountability. "If a manufacturer offers different versions of its moral algorithm, and a buyer knowingly chose one of them, is the buyer to blame for the harmful consequences of the algorithm's decisions?" the researchers ask.

From The New York Times


Abstracts Copyright © 2016 Information Inc., Bethesda, Maryland, USA

