FRIDAY, March 29, 2024

Tesla Autopilot faces U.S. safety regulators scrutiny after crashes with emergency vehicles


The nation's top auto safety watchdog has launched a formal investigation of Tesla's driver-assistance system after nearly a dozen crashes involving parked emergency vehicles occurred while Autopilot was engaged.

The National Highway Traffic Safety Administration inquiry, which was opened Friday and detailed in a document made public Monday, covers 765,000 Teslas - models Y, X, S and 3 - produced from 2014 to 2021. In 11 crashes recorded since 2018, one person was killed and 17 people injured.

The Palo Alto, Calif.-based automaker, which has disbanded its public relations department, did not respond to an emailed request for comment.

The investigation comes as the auto industry races toward a driverless future. Tesla's driver-assistance suite - starting with Autopilot and then its "Full Self-Driving" package - is among the most ambitious in its efforts to complete many driving tasks automatically. Waymo, owned by Google parent company Alphabet, is also working on a self-driving car, along with an array of competitors sporting names such as Aurora, Cruise and Zoox.

Although regulators and industry experts caution that driver-assistance features do not make a vehicle autonomous, Tesla has sought to capitalize on the perception that such technology will one day enable its vehicles to drive themselves.

The investigation creates a potential wrinkle in those plans. Tesla has put "Full Self-Driving" in hundreds of vehicles with the caveat that drivers still must be alert while their cars are in motion. Regulators, so far, have taken a relatively hands-off approach; experts say they are wary of any perception they might be stifling innovation.

The inquiry may also signal a new regulatory environment. In June, authorities started requiring manufacturers, including Tesla, to report crashes involving such technology within a day of learning of the incident. On Monday, an NHTSA spokeswoman told The Post that the preliminary investigation will focus on Tesla's Autopilot system and "technologies and methods used to monitor, assist, and enforce the driver's engagement with driving while Autopilot is in use."

Sens. Richard Blumenthal, D-Conn., and Edward Markey, D-Mass., who serve on the Senate Commerce, Science and Transportation Committee, said NHTSA is "rightly investigating" Tesla's Autopilot after a series of crashes.

"This probe must be swift, thorough, and transparent to ensure driver and public safety," they said in a joint statement. "It should inform the agency's recommendations on fixes the company must implement to improve the safety of its automated driving and driver assistance technology and prevent future crashes."

Tesla shares tumbled following word of the federal probe, falling 4.3% to close Monday at $686.17.

The agency is examining two systems: Tesla Autopilot and the Traffic Aware Cruise Control. Both are "partial" self-driving car systems designed to recognize and avoid oncoming traffic while a human driver retains control of the vehicle. Neither qualifies as fully autonomous, and drivers are expected to be at the ready even when the systems are turned on.

The investigation has several elements that suggest a more sophisticated, more focused approach to ensuring self-driving and semiautonomous cars are safe. The difference, experts say, is that authorities now appear to be taking a hard look at how human drivers interact with these new automated systems.

According to the announcement, investigators will look into the so-called "operational design domain," a term that refers to the range of places and situations in which Autopilot can be turned on. Under Tesla's current design, the driver can turn on Autopilot essentially anywhere. Investigators will also look into aspects of the vehicle's "object and event detection and response" technology, which refers to how the vehicle understands and reacts to its surroundings.

Probing both concepts will bring NHTSA - a relatively obscure unit of the Department of Transportation whose work typically involves more mundane matters like parts recalls - into deeply technical debates about how drivers interact with automated technology.

The crashes that are the basis for the federal investigation bear some common threads, even beyond the presence of first-responder vehicles. Most, for example, occurred late at night.

In one incident from late February, a Tesla rear-ended a police cruiser that was conducting a traffic stop after 1 a.m., according to the local ABC affiliate. Local news reports describe a dramatic chain reaction crash that totaled two police cars; five officers and a police dog sustained minor injuries. One officer who had been underneath the vehicle at the time of the collision grabbed onto the car and was pulled along. A person who had been standing on the shoulder of the road when the crash happened was taken to the hospital in critical condition, ABC reported at the time.

The earliest crash cited by NHTSA occurred in January 2018 in Culver City, Calif. Robin Geoola, a carwash owner, had been driving his Tesla Model S to work with Autopilot on when it slammed into a firetruck.

"All I remember seeing is just my car stopped and my windshield was shattered, and I didn't know what happened," Geoola recalled in a recent interview. "After I became a little bit more aware, I opened the door. I came out I saw my car was under a firetruck."

Authorities concluded that both driver and vehicle were at fault. The National Transportation Safety Board investigated the crash and concluded that Geoola's "inattention and overreliance" on Autopilot, as well as Tesla's design of the system, caused the crash. Geoola said he wasn't ticketed: "I didn't break any law."

Geoola had to replace his vehicle after the crash and chose another Tesla, crediting his Model S with saving his life.

"Same color, same year," he said. But there was one key difference: The new one wasn't capable of using Autopilot.

In a May 2018 crash in Laguna Beach, Calif., David Scott Key recalled in a recent interview that he was driving through an area where he likes to go mountain biking and was looking out at the trails. His 2015 Tesla Model S was on Autopilot.

"All of a sudden I slammed into a police car," Key said.

The Laguna Beach Police Department concluded that Key caused the crash by driving too quickly, according to the report of its investigation, but a department spokesman said he wasn't issued a ticket.

Key, an engineer, said he remains confident in the vehicle's Autopilot system despite the crash.

"It's much, much safer to be on autopilot than it is to be a human driving, however that does not mean there won't be accidents," he said.

Ed Niedermeyer, communications director for the nonprofit Partners for Automated Vehicle Education, said it's critical for drivers to understand that these systems cannot drive on their own.

He said his organization "welcomes regulators' engagement with these serious issues and affirms the importance of clear communication about the human role in any driving automation system."
