
Communications of the ACM

ACM TechNews

Helping Autonomous Vehicles Make Moral Decisions by Ditching the Trolley Problem


Considering the trolley problem in a modern context.

Dario Cecchini of North Carolina State University said drivers face more realistic moral decisions every day than the trolley problem presents: "Should I drive over the speed limit? Should I run a red light? Should I pull over for an ambulance?"

Credit: Cynthia Kumaran

In an effort to gather data to train autonomous vehicles to make "good" decisions, North Carolina State University (NC State) researchers are working to collect more realistic data on moral challenges in low-stakes traffic situations.

The researchers developed seven driving scenarios, then created eight versions of each by varying the combination of agent (the driver's character or intent), deed (the action taken), and consequence (the outcome), and programmed them into a virtual reality environment.
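The article does not spell out how the eight versions are derived, but eight versions over three factors (agent, deed, consequence) is consistent with two levels per factor. A minimal sketch of that assumed factorial design:

```python
from itertools import product

# The study builds eight versions of each driving scenario from combinations
# of agent (character/intent), deed (action), and consequence (outcome).
# Eight versions over three factors suggests two levels per factor
# (e.g. "good"/"bad") -- an assumption; the article does not specify levels.
FACTORS = ["agent", "deed", "consequence"]
LEVELS = ["good", "bad"]

def scenario_versions():
    """Enumerate all factor-level combinations for one driving scenario."""
    return [dict(zip(FACTORS, combo))
            for combo in product(LEVELS, repeat=len(FACTORS))]

versions = scenario_versions()
print(len(versions))  # 8 versions per scenario
```

Each of the seven base scenarios would then be paired with these eight variants, and each participant sees one version per scenario.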

Said NC State's Dario Cecchini, "The goal here is to have study participants view one version of each scenario and determine how moral the behavior of the driver was in each scenario, on a scale from one to 10. This will give us robust data on what we consider moral behavior in the context of driving a vehicle, which can then be used to develop [artificial intelligence] algorithms for moral decision making in autonomous vehicles."

From NC State University News

 

Abstracts Copyright © 2023 SmithBucklin, Washington, D.C., USA


 

