

ACM News

Rethinking Autonomous Vehicles


U.S. National Transportation Safety Board investigators examine a self-driving Uber vehicle involved in a fatal accident in Tempe, AZ, in March. (Credit: National Transportation Safety Board)

Nearly three-quarters of Americans are afraid to ride in self-driving cars, according to the latest survey by the American Automobile Association.


There is bad news ahead for the many automobile and technology companies currently developing and road-testing self-driving cars: many people are too frightened to ride in driverless vehicles.

The American Automobile Association's (AAA) May consumer trust survey on autonomous vehicles (AVs) found that 73% of U.S. adults now fear traveling in an AV, up from 63% just six months before. In addition, the survey found that two-thirds of millennials, a supposedly tech-loving generation, are also too fearful to ride in self-driving cars.

The AAA survey even carries unwelcome news from pedestrians and cyclists: nearly two-thirds say they do not trust AVs enough to share roads and sidewalks with them.

Consumer reactions like these leave nascent driverless automakers and operators like Waymo, Lyft, Uber, and GM Cruise facing a crisis of credibility.

It was not supposed to be this way. The promise was that AVs peppered with cameras, LiDAR, radar, and ultrasound sensors, and running artificial intelligence (AI) to decipher it all, would predict and avoid crashes, providing the key to reducing the 1.25 million fatalities that occur each year on the world's roads, as well as saving nations the billions of healthcare dollars spent annually on patching up those injured in auto-related accidents.

AVs are also aimed at improving urban mobility, especially for the elderly and people with disabilities, and at putting an end to traffic jams and the need for parking meters.

So what's to be scared of?

In a word: death.

A driverless car accident in March, and others involving semi-automated cars already on the market, have tainted consumer sentiment regarding autonomous vehicles. The fatal accident took place in Tempe, AZ, where a woman was run down and killed by a self-driving Uber Technologies vehicle as she pushed her bicycle across a road. The sensors on the car, a Volvo XC90 sport-utility vehicle (SUV) modified to run Uber's in-development self-driving software, had actually detected the victim six seconds before impact, but neither the safety driver monitoring the trip from inside the vehicle nor Uber's software reacted to brake the vehicle.

While the investigation by the U.S. National Transportation Safety Board (NTSB) continues, the agency has reported that emergency braking maneuvers were not "enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior."

The industry needs to address some fundamental issues to boost consumer confidence in autonomous vehicles, says Martyn Thomas, an engineer specializing in safety-critical systems at Gresham College in the U.K., and a director of the Health and Safety Executive, the British government's safety watchdog.

While the NTSB has yet to produce a final report on the Tempe crash, Thomas says the fact that the SUV's autonomous emergency braking (AEB) system was disabled to provide a smoother ride is of serious concern. "It points to a weakness in the technology, suggesting that the self-driving car can be pushed into repeatedly going into braking mode by other road users who can cut in on them all the time to their own advantage, giving riders uncomfortable, jerky journeys."

Thomas suggests driverless car developers must agree on precise requirements for the ways autonomous vehicles should behave, so vehicles from different automakers will smoothly and safely interact with each other and other road users. Only then, he says, can engineers begin to assure the safety of those in the vehicle, and those outside it.

The AAA, meanwhile, wants to see the industry come up with "a common-sense classification system" for driverless car functions, so regulators can usefully compare—and test—the performance characteristics of future autonomous vehicles.

Thomas also questions how safety drivers, and the drivers of any future AVs in which they may be expected to intervene ahead of full autonomy, can be expected to muster the necessary focus on the controls, and sufficient situational awareness, when suddenly called upon to do so. "The notion that you can have bored drivers sitting there, monitoring the technology, and then taking control whenever the system gets into a situation it cannot handle, is never going to succeed. It does not work in the aviation industry, and there's no reason to believe it will work in cars," he says.

A 2017 study by the University of Southampton in the U.K. backs that view: researchers found that, in an AV simulator, the amount of time needed to reassume manual control of an automated vehicle varied too widely to guarantee safety, with different people taking anything from two to 26 seconds to assume control.

Even partial automation is dangerous.

It is this issue of driver attention—or lack of it—that is also boosting the consumer fear factor measured by the AAA: in two horrific accidents, the drivers of partially automated Tesla vehicles were both killed.

The first fatality occurred in May 2016, when Joshua Brown of Canton, OH, died after his Tesla Model S, traveling in the car's semi-automated 'Autopilot' mode near Williston, FL, crashed into a semi-trailer truck that neither he nor the car's sensors had detected crossing its path.

Then, in March of this year, Tesla Model X driver Walter Huang was killed in a 70-m.p.h. Autopilot-controlled crash when the vehicle steered into a lane-dividing barrier on U.S. Highway 101 in Silicon Valley, according to a preliminary NTSB report.

Autopilot performs functions for the driver including lane-keeping, lane-changing, adaptive cruise control, and autonomous emergency braking, but the driver has to be ready to reassume control at all times, keeping their hands on the steering wheel (with escalating audible and visual alarms sounding if they take their hands off the wheel).

"The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of," said a Tesla spokesperson.

However, some safety watchdogs think even partial automation may encourage some drivers to believe their cars are fully autonomous. In early June, the U.K.-based automotive crash-test lab Thatcham Research, which is funded by the U.K. auto insurance industry, issued a report criticizing advanced driver assistance systems (ADAS) like Tesla's Autopilot and Nissan's ProPilot, which are advertised as allowing cars to "do more and more driving on behalf of motorists."

Thatcham says software products like Autopilot and ProPilot have "deeply unhelpful" names, since they do not actually 'pilot' a car autonomously for the driver. "Absolute clarity is needed here, to help drivers understand when and how these technologies are designed to work and that they should always remain engaged in the driving task," says Matthew Avery, Thatcham's head of research.

"We are starting to see real-life examples of the hazardous situations that occur when motorists expect the car to drive and function on its own, but motorists may not be sufficiently aware that they are still required to take back control in problematic circumstances," Avery says.

Tesla disagrees: "We have always been clear that Autopilot doesn't make the car impervious to all accidents, and the issues described by Thatcham won't be a problem for drivers using Autopilot correctly," says the firm's spokesperson.

For safety-critical systems engineers like Thomas, however, the notion that drivers can be ready to jump back into full control of a vehicle, with full situational awareness, is a vain hope. "This is why some companies are already saying that they want to jump straight to making completely autonomous cars, rather than get involved in partial automation at all," he says.

Yet there are technical issues to be overcome before fully autonomous cars can be safe enough to start reducing the AAA's fear figures. For instance, driving is turning out to be far from a straightforward "narrow AI" task, says Thomas. "It's narrower than general AI, but it's a very tricky machine learning task nonetheless. Some of the problems it will face are currently unsolved, such as having autonomous cars recognize hand gestures from a police officer."

Object recognition is also an ongoing challenge. "Adversarial AI" researchers have already shown that simple arrangements of stickers placed on roadside stop signs can game how the signs are recognized, making them appear to pattern-recognizing deep neural networks to be, for example, a speed limit sign.
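The principle behind such attacks can be sketched in a few lines of code. Below is a minimal, illustrative example (assuming Python with the PyTorch and torchvision libraries) of a purely digital adversarial perturbation using the Fast Gradient Sign Method; the sticker attacks described above are physical-world variants of the same idea, and the model choice, pixel range, and step size here are assumptions for illustration only.

    # A minimal sketch of a digital adversarial perturbation using the
    # Fast Gradient Sign Method (FGSM). Illustrative only: this is not
    # the physical sticker attack described above, and the model and
    # epsilon value are assumptions chosen for the example.
    import torch
    import torch.nn.functional as F
    from torchvision import models

    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.eval()

    def fgsm_perturb(image, label, epsilon=0.03):
        """Shift every pixel slightly in the direction that increases the
        classification loss, so a near-identical image is misclassified."""
        image = image.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(image), label)
        loss.backward()
        # One signed-gradient step per pixel; small enough that a human
        # observer sees essentially the same image.
        adversarial = image + epsilon * image.grad.sign()
        return adversarial.clamp(0.0, 1.0).detach()

    # Usage (hypothetical inputs): x is a 1x3x224x224 image tensor with
    # values in [0,1], y_true its correct class index as a LongTensor.
    # x_adv = fgsm_perturb(x, y_true)

The unsettling point, for AV developers, is that the perturbation is computed directly from the network's own gradients: the very structure that lets a deep network recognize a sign also tells an attacker how to change the fewest pixels to make it misrecognize one.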

Such unsolved problems have some in the auto industry urging caution about promising the arrival of fully driverless vehicles any time soon.

"We're still very much in the early days of making self-driving cars a reality," says Bryan Salesky, CEO of Pittsburgh, PA-based Argo AI, a Ford-backed maker of computer vision and machine learning systems for AVs.  "Those who think fully self-driving vehicles will be ubiquitous on city streets months from now, or even in a few years, are not well-connected to the state of the art or committed to the safe deployment of the technology," Salesky writes in a blog post explaining the tough sensing and software challenges autonomous cars still face.

To continue to jump the gun and launch AVs for consumers before they are fully vetted would be unwise, says Greg Brannon, AAA's director of Automotive Engineering, because "Any incident involving an autonomous vehicle is likely to shake consumer trust, which is critical to their widespread acceptance."

The public expects AVs to be demonstrably four to five times safer than a human-driven car before they are launched, according to a recent study by the Society for Risk Analysis.

So, when might we actually get to buy, rent, or hail a fully driverless car?

Tom Flischer, a spokesman for Thatcham Research, says that based on the intelligence his firm gathers from automakers on behalf of the car insurance industry, the "early to mid-2020s" is currently the most likely timeframe for AVs to start hitting the market, with full autonomy likely not arriving until late in that period.

Cars, however, are unlikely to be the first autonomous vehicles we will see, as trials of 18-wheel autonomous trucks have begun in the U.S. and will begin soon in Europe. The task of ferrying freight between suburban and rural loading yards should be far easier to negotiate than urban traffic, so we can probably expect to see big rigs going driverless long before consumer vehicles.

Paul Marks is a technology journalist, writer, and editor based in London, U.K.


 
