
The Road to Regulating Self-Driving Cars Is Long, Winding



A 2012 Lexus RX450h SUV that Google was using to test its self-driving technology, after its run-in with a bus.

Credit: NBCNews.com

As automobile accidents go, a 2-mile-per-hour sideswipe typically would not reach the level of having an insurance company pay the repair bill, let alone set off a policy discussion in the highest reaches of government. Yet a recent "fender bender" between an autonomous vehicle being tested by Google and a municipal bus in Mountain View, CA—the first incident in which the autonomous vehicle itself was acknowledged as being at fault for the mishap—has re-ignited the debate over how quickly these vehicles should be deployed in real-world traffic, how best to evaluate when the vehicles will be ready for that move, and whether or not U.S. federal regulators have enough expertise or resolve to lead those evaluation efforts.

According to one member of the scientific community familiar with autonomous vehicle technology, the amorphous situation surrounding exactly what autonomous vehicle developers will have to do to prove their technology is safe enough for real-world deployment is just one example of a regulatory ecosystem sorely lacking in expertise. Mary "Missy" Cummings, director of the Humans and Autonomy Lab at Duke University, said the U.S. National Highway Traffic Safety Administration (NHTSA), which is tasked with developing national guidelines for the cars, lacks technical leadership.

"What concerns me is they don't have the people who even would understand whether or not the car companies or even a third-party testing agency was doing it the right way," Cummings said, adding she has similar concerns about technical staff at other U.S. federal agencies, including the Federal Aviation Administration and the Department of Defense.

"It really speaks to an overall concern with the government," she said. "If we don't have enough people in the government who can tell you how a machine learning algorithm works, if you can't even find somebody there who can explain to you the basics behind that and can speak coherently with the architect of that kind of system, how will they know whether the testing is correct or not?"

Who leads?

California Department of Motor Vehicles (DMV) spokeswoman Jessica Gonzalez, while recognizing federal standards on testing and deployment would supersede state regulations, said the state's pioneering work in establishing autonomous vehicle standards could be taken as a possible way forward on a national scale. California, which has been working on autonomous vehicle regulation since 2012, currently holds the chair of the American Association of Motor Vehicle Administrators. Gonzalez said, "We are currently working with other states to create some guidelines for NHTSA to look at. California being so far ahead in this, we do hope NHTSA looks at us and says 'it's a good model.'"

One of the California draft requirements for autonomous vehicle deployment calls for both manufacturer and third-party testing of the vehicles. Gonzalez said a possible way forward for third-party testing bodies is that they be fully independent of both the DMV and the developers, with a presence in California, and that more than one body may be contracted to do the testing.

Tim Austin, president of the National Society of Professional Engineers (NSPE), has offered his organization's expertise in that area at both state and national levels since, as Austin said, the vehicles' capabilities must be tested within the context of the "overall transportation system. As long as we've had automobiles, we've had engineers who are designing bridges, road systems, safe zones, traffic control, and things of that nature that help those vehicles perform safely."

Regarding federal regulators, Cummings said, "There's no question they're going to have to farm out some of the testing. They simply don't have the people or the resources in place to be able to conduct testing. And I'm not necessarily even advocating they need to have the internal resources; what concerns me is they don't have the people who even would understand whether or not the car companies, or even the third-party testing agency, was doing it the right way."

Defining 'driver'

A central point of debate in the future of autonomous vehicles is exactly what kind of safety fallback mechanism will be expected to take control of the car if the first-line onboard navigation and control technology fails. Google, which has been leading autonomous vehicle testing in the U.S., has lobbied strongly for fully autonomous vehicles, in which no human-controlled technology such as a steering wheel or brake pedal is provided. However, the California draft regulations explicitly require a licensed driver "capable of taking control in the event of a technology failure or other emergency" be present in the vehicle—and the definition of what constitutes a "driver" will undoubtedly be the subject of discussion in coming months.

"Last December, we were disappointed that California released draft regulations for operation of autonomous vehicles that specifically excluded fully self-driving cars, despite strong public support for this technology, particularly from the disability community," Chris Urmson, director of Google's self-driving car program, testified at a March 15 hearing before a U.S. Senate commerce committee (Google representatives had no further comment for this story).

Santa Monica, CA-based advocacy group Consumer Watchdog has taken the lead in demanding autonomous vehicle developers make all video and technical data public and transparent, whether the data is controlled by neutral agencies or by the technology developers themselves. John Simpson, who leads the group's efforts on autonomous vehicle regulation, said, "I think California has it exactly right" in requiring a licensed operator.

Simpson has spared no effort in demanding NHTSA demonstrate stronger leadership in establishing testing and deployment standards in full public view. For instance, in a March 3 letter to the agency, Consumer Watchdog, along with Consumers Union, the Center for Auto Safety, Consumers for Auto Reliability and Safety, and former NHTSA Administrator Joan Claybrook called for NHTSA "to commit to maximum transparency and public involvement."

NHTSA, in turn, has responded in recent weeks with a flurry of activity: it announced a day-long public meeting on autonomous vehicle technology in Washington, DC, on April 8, and another to be held in California at an unspecified date. In a letter to Simpson, NHTSA administrator Mark Rosekind said, "We are enhancing our public engagement strategy and will announce specific details in the coming weeks."

NHTSA also released an initial review of current Federal Motor Vehicle Safety Standards (FMVSS) that identifies key challenges in full deployment of automated vehicles, in scenarios that include both a human driver capable of taking control and a fully autonomous situation. In essence, the document's authors said there are few barriers for automated vehicles to comply with FMVSS as long as the vehicle does not significantly diverge from a conventional vehicle design, while automated vehicles that begin to push the boundaries of conventional design (such as alternative cabin layouts or omission of manual controls) would be constrained by the current FMVSS or may conflict with policy objectives of the FMVSS.

Cummings and Simpson cautioned there remain many undefined x-factors to be addressed, regardless of autonomous vehicle design. In her testimony in the Senate hearing, Cummings mentioned vehicle operation in bad weather, including standing water on roadways, drizzling rain, sudden downpours, and snow.

"These limitations will be especially problematic when coupled with the inability of self-driving cars to follow a traffic policeman’s gestures," she said.

The Mountain View crash was a prime example of such an amorphous situation: the Google car had halted for construction and, when it attempted to re-enter traffic, assumed the bus would yield to it because the car had reached the merge point first. The bus did not.

Google addressed the incident in its monthly project report for February and said its engineers had "refined" the car's software in response:

"This is a classic example of the negotiation that’s a normal part of driving—we’re all trying to predict each other’s movements. In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.

"We’ve now reviewed this incident (and thousands of variations on it) in our simulator in detail and made refinements to our software. Our cars will more deeply understand that buses and other large vehicles are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future."

Simpson emphasized that introducing autonomous vehicles on a wider scale, through both subsequent testing and eventual deployment, will entail a lengthy period in which autonomous vehicles and those under human control must coexist safely in conditions similar to those leading up to the accident, and that none of the resulting data should be withheld as confidential.

"The point is, they are using our public highways as essentially their private laboratory, and that puts special responsibility on Google and their supporters," Simpson said.

Gregory Goth is an Oakville, CT-based writer who specializes in science and technology.


 
