
Deadly Tesla crash tied to technology and human failures, NTSB says

Both car and driver contributed to the 2018 crash of a Tesla in California that left a father of two dead, and federal regulators have shown a "lack of leadership" in addressing safety problems with partly automated vehicles, investigators said Tuesday. 

The investigation underscored questions about the safety and marketing of Tesla's "Autopilot" system.

Robert L. Sumwalt III, chairman of the National Transportation Safety Board, pointed to a "lack of system safeguards to prevent foreseeable misuses of technology." 

"Industry keeps implementing technology in such a way that people can get injured or killed," Sumwalt said. "If you own a car with partial automation, you do not own a self-driving car. Don't pretend that you do."

The NTSB also cited "shortfalls" in how the U.S. Department of Transportation has overseen partial automation in cars made by Tesla and other manufacturers. NTSB officials said that although the National Highway Traffic Safety Administration has numerous open investigations into Tesla vehicles, the regulator's approach has been "misguided, because it essentially relies on waiting for problems to occur rather than addressing safety issues proactively."

NHTSA said it is reviewing the findings. The agency has long recommended that technology developers use "appropriate driver-vehicle interaction strategies in deployed technology," it said, and has offered resources to make it easier for them to do so.

The March 23, 2018, crash that killed Walter Huang after he dropped off his son at preschool was just one of 36,560 road deaths in the United States that year.

But it drew broad - and, to Tesla, unwelcome - attention to the company and potential problems with the high-end technology that is so central to the Silicon Valley electric car pioneer's brand. After the crash, Tesla placed blame on Huang, an Apple engineer, saying "the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road."

But days before Huang's Tesla drove off Highway 101 and into a barrier in Mountain View, the car's Autopilot had made a "left steering movement toward" the same area, the NTSB found. Huang caught it in time, investigators said, and he told relatives the same problem had occurred in the same place multiple times before.

Despite that history, Huang was over-reliant on the Autopilot system, the NTSB found, noting that he used his iPhone 8 while behind the wheel and that a strategy game called Three Kingdoms was "active during his commute to work."

The five-member NTSB board on Tuesday unanimously found the crash was caused by "system limitations" in Tesla's "Autopilot" feature, "and the driver's lack of response due to distraction likely from a cellphone game application and overreliance on the Autopilot partial driving automation system."

The NTSB also said Tesla's "ineffective monitoring of driver engagement" - the car gauges the torque a driver puts on the steering wheel as a proxy for whether the person is paying attention while using Autopilot - contributed to the crash and "facilitated the driver's complacency." 
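In essence, a torque-based check is a timer that resets whenever the driver applies measurable force to the wheel; if the timer runs too long, the car warns and eventually disengages. The short sketch below illustrates that logic in Python. It is a hypothetical illustration only: the threshold, the timings and the function names (read_torque_nm, warn, disengage) are invented for this example and are not drawn from Tesla's software.

    import time

    # Invented values for illustration; a real system would tune these carefully.
    TORQUE_THRESHOLD_NM = 0.3   # minimum steering torque counted as "hands on wheel"
    WARNING_AFTER_S = 30.0      # seconds without torque before a warning
    DISENGAGE_AFTER_S = 60.0    # seconds without torque before disengaging

    def monitor(read_torque_nm, warn, disengage, now=time.monotonic):
        """Poll steering torque; warn, then disengage, if the wheel stays untouched."""
        last_torque_time = now()
        while True:
            if abs(read_torque_nm()) >= TORQUE_THRESHOLD_NM:
                last_torque_time = now()   # any detectable torque resets the timer
            idle = now() - last_torque_time
            if idle >= DISENGAGE_AFTER_S:
                disengage()                # hand control back to the driver
                return
            elif idle >= WARNING_AFTER_S:
                warn()                     # e.g., flash "hold steering wheel"
            time.sleep(0.1)                # poll roughly 10 times per second

A check built this way can confirm only that a hand is on the wheel, not that the driver's eyes are on the road - the gap the NTSB said "facilitated the driver's complacency."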

And it said Huang likely would not have been killed if California state transportation officials had repaired a crash "attenuator," a device meant to cushion the impact when a car hits a highway barrier; it had been damaged in an earlier crash and had not yet been fixed.

Tesla's collision avoidance systems "were not designed to, and did not, detect the crash attenuator," so there was no warning to Huang as his car headed toward the barrier and there was no automatic emergency braking, the NTSB said. 

The car veered off the road "due to limitations of the Tesla Autopilot vision system's processing software to accurately maintain the appropriate lane of travel," the NTSB said. And the company did not address an earlier recommendation that it should restrict use of Autopilot to the particular conditions it is designed to handle, investigators said.

Tesla executives did not respond to questions about the technological problems or what had been done to fix them.

The company, in a letter to Sen. Edward J. Markey (D-Mass.) in December, provided data indicating that its customers who use Autopilot are significantly less likely to crash and said "making sure the driver is attentive and able to take over at any time is a cornerstone of our feature development and validation, and something we continue to improve . . ."

In a 2018 statement, Tesla said, "according to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location."

Huang's family has sued Tesla, saying that "based on Tesla's advertising and promotional material" Huang "reasonably believed the 2017 Tesla Model X vehicle was safer than a human-operated vehicle because of Defendant's claimed technical superiority regarding the vehicle's autopilot system."

"The vehicle should not leave a marked travel lane and accelerate, without the input of the operator, in such a way as to cause damage, harm or injury," according to the lawsuit filed in Santa Clara County Superior Court. 

Among its recommendations, the NTSB called on NHTSA to evaluate whether Tesla's partially automated systems "pose an unreasonable risk to safety" and to use NHTSA's "applicable enforcement authority to ensure that Tesla Inc. takes corrective action" if they do. It also called on phone and other electronics makers, including Apple, to install a "lockout mechanism" that would disable distracting functions when a car is moving, among other measures. 

Tesla executives have bristled at the NTSB scrutiny, which led to an unusual public dust-up and flashes of mutual frustration.

In a rare move, Tesla was removed as a party to the investigation. The NTSB, an independent federal body that investigates airplane, boat and highway crashes, generally brings manufacturers, government officials and others together for the painstaking process of deciphering the probable causes of major crashes. That process can take a year or two, an eternity in the tech world, but the results are widely trusted.

Tesla's party status was revoked after the automaker released information about the crash. Tesla, for its part, said it chose to withdraw and accused the NTSB of being "more concerned with press headlines than actually promoting safety."

At a November hearing on the federal investigation into the fatal crash of a self-driving Uber SUV, Sumwalt pointed to what he called Uber's openness with investigators, and contrasted it with the approach of Tesla chief executive Elon Musk.

"I did notice when I talked to their CEO he did not hang up on me," Sumwalt said, referring to Uber.

"We chose to withdraw from the agreement and issued a statement to correct misleading claims that had been made about Autopilot - claims which made it seem as though Autopilot creates safety problems when the opposite is true," Tesla said in a 2018 statement.

The company also noted that NHTSA, which is part of the U.S. Department of Transportation, regulates automobiles, adding that the company has "a strong and positive relationship" with NHTSA.

"Lot of respect for NTSB, but NHTSA regulates cars, not NTSB, which is an advisory body. Tesla releases critical crash data affecting public safety immediately & always will. To do otherwise would be unsafe," Musk tweeted in April 2018.

Huang's crash was not the first time a Tesla driver was killed while using Autopilot, nor would it be the last, and NTSB officials voiced frustration that the company and others had not made sufficient changes.

In 2016, a speeding Tesla driver in Williston, Fla., crashed into a truck that turned in front of him. At the time, Tesla said "neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied."

Musk later said that it was "very likely" a new radar-based braking system would have prevented the crash.

In 2017, NHTSA said it had found no defects in the version of Autopilot in use at the time of that crash. But the regulator's broader review of dozens of Autopilot crashes pointed to industry-wide challenges.

"Many of the crashes appear to involve driver behavior factors, including traveling too fast for conditions, mode confusion, and distraction," the investigators wrote. Mode confusion is the idea that it is unclear who is in control, the car or the driver, and occurred "during attempted Autopilot activations" and "after inadvertent overrides."

In 2019, a driver using Autopilot in Delray Beach, Fla., crashed into a tractor-trailer that emerged from a private driveway and slowed in the middle of a highway, blocking the Tesla's path, the NTSB said.

Neither the Tesla driver nor the Autopilot system "executed evasive maneuvers," and the car's roof sheared off as it went under the truck, according to the NTSB.

 
