NTSB: Distracted Driver Caused Fatal Uber Crash

The board determined that the probable cause of the crash was the backup driver failing to monitor the road because she was distracted by watching a TV show.

This file image made from video March 18, 2018, of a mounted camera provided by the Tempe Police Department shows an exterior view moments before an Uber SUV hit a woman in Tempe, Ariz. The chairman of the National Transportation Safety Board says Uber had an ineffective safety culture when one of its autonomous test vehicles ran down and killed a pedestrian last year in Tempe, Arizona. Robert Sumwalt said at a hearing Tuesday, Nov. 19, 2019, on the March 2018 crash that Uber didn’t continually monitor its operations and it had de-activated its Volvo SUV’s automatic emergency braking system. Uber’s own system also didn’t have the ability to brake automatically, relying on a human backup driver to do the braking.
Tempe Police Department via AP, File

The National Transportation Safety Board on Tuesday condemned the lack of state and federal regulation for testing autonomous vehicles before finding that a distracted human safety driver was the main cause of a fatal 2018 Arizona crash involving an Uber vehicle.

The board criticized the National Highway Traffic Safety Administration, the government’s road safety agency, for failing to lead in regulating tests on public roads. But it also said states need to adopt their own regulations.

“In my opinion they’ve put technology advancement here before saving lives,” NTSB member Jennifer Homendy said of NHTSA, after NTSB staff members called self-regulation inadequate. “There’s no requirement. There’s no evaluation. There’s no real standards issued.”

NHTSA has issued voluntary guidelines that ask autonomous vehicle companies to file safety assessment reports, but only 16 companies have done so, the NTSB said, even though 62 companies hold permits to test in California. The agency has avoided regulations in favor of allowing the technology to move forward on the grounds that it has tremendous life-saving potential.

NTSB staffers told the board that NHTSA has no mechanism to evaluate the companies’ safety reports, and since they aren’t mandatory, few are submitting them.

The board voted to recommend that NHTSA require companies to turn in the reports and set up a process for evaluating them. NHTSA should also verify that companies have proper safeguards in place, including showing that they monitor vehicle operators to make sure they are paying attention during the tests, the NTSB said.

In a statement, NHTSA said it welcomes the NTSB report “and will carefully review it and accompanying recommendations.”

NHTSA said its investigation into the Tempe crash, which killed a pedestrian, is ongoing and a report will be made public when it’s finished.

The NTSB also recommended that states, including Arizona, require autonomous-vehicle companies to submit applications to test vehicles on public roads that, at a minimum, include a plan to manage risk and operator inattentiveness. The plans also should set out countermeasures to prevent crashes or mitigate their severity.

The NTSB investigates transportation accidents but has no regulatory authority. In highway crashes, it can only make recommendations to NHTSA.

The Uber crash was the first fatality involving an autonomous test vehicle, and it reverberated through the auto industry and Silicon Valley. It forced other companies to slow what had been a fast march toward autonomous ride-hailing services on public roads.

NTSB Chairman Robert Sumwalt said at the hearing that Uber had an ineffective safety culture before the March 2018 crash on a darkened street in Tempe that killed Elaine Herzberg, 49.

Sumwalt said Uber didn’t continually monitor its operations and it had de-activated its Volvo SUV’s automatic emergency braking system. Uber’s own system also didn’t have the ability to brake automatically, relying on a human backup driver to do the braking.

He said all companies that test autonomous vehicles on public roads need to study the crash to prevent future accidents.

The board determined that the probable cause of the crash was the backup driver failing to monitor the road because she was distracted by watching a TV show on her mobile phone. Uber’s inadequate safety procedures and ineffective oversight of drivers contributed to the cause, as well as Herzberg being impaired by methamphetamines and crossing the road away from an intersection, the NTSB said.

Also contributing was the Arizona Transportation Department’s insufficient oversight of autonomous vehicle testing, the NTSB determined.

A spokesman for Arizona Gov. Doug Ducey said the state appreciates the NTSB’s work and will review the recommendations.

San Francisco-based Uber said in a statement that it deeply regrets the crash and is committed to improving safety.

The Uber system detected Herzberg 5.6 seconds before the crash. But it failed to determine whether she was a bicyclist, pedestrian or unknown object, or that she was headed into the vehicle’s path, the NTSB said. The system also did not include a provision for detecting jaywalking pedestrians, the agency said.

Instead, Uber relied on the human operator to stop the vehicle to avoid a crash. But the operator was looking down just before the crash. The NTSB said Uber had cameras monitoring drivers, but it didn’t do spot checks to make sure they were paying attention.

It said Uber cooperated in the investigation, listened to criticism and has made many safety improvements since the crash, including activating the braking systems, better training of human backup drivers, adding a second driver and hiring a safety director.
