By The Washington Post · Michael Laris
The NTSB said federal officials should require, then evaluate, safety assessments from companies that are testing, or considering testing, self-driving systems on public streets. Submission of such assessments is now voluntary.
Those and other recommendations came 20 months after a self-driving Uber Volvo XC90 struck and killed 49-year-old Elaine Herzberg as she walked a bicycle across a street in Tempe, Arizona.
NTSB board member Jennifer Homendy, who was appointed to the independent agency by President Donald Trump, said the federal government has at times taken a "laughable" approach that relies on companies electing to hand over safety information to federal regulators.
The National Highway Traffic Safety Administration has "put technology advancements here before saving lives," Homendy said, mocking voluntary federal autonomous-vehicle guidance, once dubbed "A Vision for Safety," as "a vision for lax safety."
"There's a big difference between the words 'should,' 'encourage' and 'shall,'" Homendy said. "There's a major failing on the federal government's part."
In a statement, NHTSA said it welcomes the analysis and will review the NTSB's recommendations. "While the technology is rapidly developing, it's important for the public to note that all vehicles on the road today require a fully attentive operator at all times," the agency said.
The NTSB also recommended that Arizona officials "require developers to submit an application" for testing cars with automated driving systems. Such an application should "detail a plan to manage the risk associated with crashes and operator inattentiveness and establish countermeasures to prevent crashes or mitigate crash severity" within testing plans, the NTSB said.
It said other states should do the same.
At the federal level, the NTSB said U.S. officials should evaluate safety assessments once they are required. NHTSA should "determine whether the plans include appropriate safeguards for testing a developmental automated driving system on public roads, including adequate monitoring of vehicle operator engagement, if applicable."
The NTSB said the probable cause of the March 2018 crash that killed Herzberg was the failure of Uber's backup driver, Rafaela Vasquez, "to monitor the driving environment and the operation of the automated driving system because she was visually distracted throughout the trip."
Also contributing to the crash, the NTSB said, were several major shortfalls on Uber's part, including the company's inadequate assessment of risk and its ineffective oversight of backup drivers who were susceptible to becoming distracted and over-reliant on imperfect technologies.
Vasquez's smartphone was streaming NBC's "The Voice," and she had looked down inside the vehicle numerous times before the SUV struck Herzberg.
Other contributing factors, the NTSB said, included the fact that Herzberg was impaired, with high levels of methamphetamine in her blood, as she crossed the road; and the "insufficient oversight" of automated vehicles testing by state officials in Arizona.
NTSB investigators also described widespread problems with Uber's technology at the time of the crash, saying the act of pushing a bike across the road essentially stumped Uber's system.
NTSB investigators found that Uber's system classified Herzberg as a vehicle, a bike and "an other," but not as a person walking in the road. That meant the system failed to predict "her goal as a jaywalking pedestrian."
A Volvo safety system that would have braked for Herzberg was disabled by Uber because of a technical conflict with its own system. The automated system did not act to slow or avoid her. Vasquez began trying to turn to avoid Herzberg "0.02 seconds prior to impact, and initiated braking 0.72 seconds after impact," according to the NTSB.
Uber said it has made far-reaching management and technological fixes in the intervening months to address problems identified by the NTSB. The NTSB, in turn, praised the company for what it called Uber's transparency and for adding numerous safety checks in the development of its self-driving program.
Uber said it has increased the rigor of the company's track testing; improved training for "mission specialists" whose job it is to be the human backup when automation fails; formed an independent safety review board to identify "potential risks"; and launched an overall safety management system.
The NTSB recommended that Uber "complete the implementation of a safety management system for automated driving system testing that, at a minimum, includes safety policy, safety risk management, safety assurance, and safety promotion."
The head of safety for Uber's self-driving division, former NHTSA official Nat Beuse, said the company deeply regrets Herzberg's death, and "we remain committed to improving the safety of our self-driving program."
Beuse said the company has provided the NTSB "complete access to information about our technology and the developments we have made since the crash. While we are proud of our progress, we will never lose sight of what brought us here or our responsibility to continue raising the bar on safety."
Earlier this year, a local Arizona prosecutor found "no basis for criminal liability for the Uber corporation" from the crash.
Tempe police investigated Vasquez and recommended she be charged with manslaughter. A spokeswoman for the Maricopa County attorney's office said prosecutors have not decided whether to bring charges.
NTSB chairman Robert Sumwalt said the independent agency is seeking to spread the lessons of Tempe broadly before additional deaths occur.
"If your company tests automated driving systems on public roads, this crash was about you. If you use roads where automated driving systems are being tested, this crash was about you," he said. "If your work touches on automated driving systems at the federal or state level of government, this crash was about you."
Sumwalt also took a jab at Tesla chief executive Elon Musk and said he appreciated Uber's openness with the NTSB.
"I did notice when I talked to their CEO he did not hang up on me," Sumwalt said, referring to Uber.
"We were dealing with another automobile manufacturer who wasn't necessarily following the rules" regarding the NTSB's investigative process, Sumwalt said. "We had to remove that other organization" as a party, Sumwalt said.
While Sumwalt did not cite Tesla or Musk by name, he was clearly making a reference to a highly unusual public dispute last year between the safety agency and the electric-car builder.
The dispute centered on the investigation of a crash in which Tesla owner Walter Huang, 38, was killed when his 2017 Model X, running in semi-automated "Autopilot" mode, hit a concrete median near Mountain View, California.
The NTSB said Tesla released information publicly, violating the agency's investigative rules. Tesla disputed that it had done anything wrong, and said the NTSB itself had released "partial bits of incomplete information."
A Tesla official did not respond to a message seeking comment.