![](https://d15shllkswkct0.cloudfront.net/wp-content/blogs.dir/1/files/2018/03/uberdeath.png)
A final report by the National Transportation Safety Board into the fatal crash involving an Uber Technologies Inc. self-driving vehicle in Tempe, Arizona, in March 2018 has found that the safety driver was primarily at fault but also criticized other aspects of the company’s self-driving car testing program.
Interim findings leaked earlier this month found that Uber, despite pushing a narrative that the victim had been killed after walking into the path of the vehicle, was deficient because its software had not been programmed to detect jaywalkers. The final findings, presented by the NTSB at a hearing Tuesday, found that the technology played a role, but so did the driver, the victim and even government regulation of autonomous vehicles.
In a 3-0 vote, the NTSB ruled that the most probable cause of the accident was the failure of the back-up safety driver to monitor the driving environment “because she was visually distracted throughout the trip by her personal cell phone.” It had previously been found that the driver had been streaming “The Voice” on her phone at the time of the crash. The NTSB found that she had spent 34% of her time in the vehicle glancing at her phone and last did so six seconds before the crash.
Uber itself also came in for criticism for having “inadequate safety risk assessment procedures.” The NTSB found that Uber did not have a dedicated safety manager in place at the time of the accident and that its monitoring of vehicle operators was ineffective. Uber was also called out for not addressing “automation complacency” wherein safety drivers were not paying attention because they became complacent while operating the vehicles.
The deficiencies in Uber’s technology were cited as well. The NTSB found that the automated system had not been programmed to detect jaywalkers and precluded emergency braking even when a crash was imminent. The board also found that Uber’s decision to deactivate the forward-collision warning and automatic emergency braking systems that came standard on the Volvo test vehicle played a role.
The victim, 49-year-old Elaine Herzberg, was also said to have played her part, since she had methamphetamine in her system when she walked into the path of the Uber vehicle.
The state of Arizona and the National Highway Traffic Safety Administration came in for criticism too. The NTSB found that neither had sufficient guidelines or policies to adequately regulate self-driving vehicles on public roads. The report called on the NHTSA to require companies testing autonomous vehicles to submit a safety assessment report to the agency, as well as to implement a plan for evaluating those reports.
“The collision was the last link of a long chain of actions and decisions made by an organization that unfortunately did not make safety the top priority,” NTSB Chairman Robert Sumwalt said. “The inappropriate actions of both the automatic driving system as implemented and the vehicle’s human operator were symptoms of a deeper problem: the ineffective safety culture that existed at the time [at Uber].”