Tesla is putting 'self-driving' in the hands of drivers amid criticism the tech is not ready

Oct. 22, 2020
Inside the Tesla Model 3, the dashboard is mostly contained in a touchscreen in the center of the front console. There's no information directly behind the steering wheel. MUST CREDIT: Washington Post photo by Jhaan Elker.

By The Washington Post · Faiz Siddiqui · NATIONAL, BUSINESS, TECHNOLOGY, TRANSPORTATION, US-GLOBAL-MARKETS 

SAN FRANCISCO - In a year when Tesla might have been forgiven for extending its timeline on a key initiative, Elon Musk is forging ahead with a vision for what he calls "Full Self-Driving."

This week, a group of drivers was selected to receive a software update that downloaded automatically into their cars, enabling the vehicles to better steer and accelerate without human hands and feet. According to Tesla, hundreds of thousands of its cars will be able to drive themselves as soon as this year, probably making them the first large fleet of consumer-owned vehicles billed as autonomous.

Tesla is pressing forward despite skepticism among some safety advocates about whether Tesla's technology is ready - and whether the rest of the world is ready for cars that drive themselves. An industry coalition consisting of General Motors' Cruise, Ford, Uber and Waymo, among others, this week criticized the move by Tesla, saying its vehicles are not truly autonomous because they still require an active driver.

Self-driving is lightly regulated in the United States, and Tesla does not need permission to launch the new feature.

A point of contention among Tesla's critics is that the company is moving ahead without a key piece of hardware. Nearly all self-driving carmakers have embraced lidar sensors, which are placed on the outside of vehicles and can detect the precise size, shape and depth of objects in real time, even in bad weather.

Instead, Tesla is trying to achieve full self-driving with a suite of cameras and a type of radar, all constantly connected to an advanced neural network. Tesla's technology can detect vehicles and pedestrians in the road and some objects such as trees, but it cannot always see the true shape or depth of the obstacles it encounters, according to some safety experts. That might not allow the car to distinguish between a box truck and a semi as it approaches the rig from behind, for example.

Tesla CEO Elon Musk has decried lidar as "expensive," redundant and "a fool's errand," calling anyone who relied on it "doomed."

In response to an analyst question during the company's earnings call on Wednesday, Musk said he would not equip the company's vehicles with lidar even if it were "totally free."

In addition, unlike autonomous vehicle companies such as Waymo and Cruise, which have been testing their self-driving cars in controlled pilot programs, Tesla has decided to put its self-driving technology into the hands of consumers. That means the risks of a malfunction will be absorbed by ordinary drivers.

Tesla did not respond to requests for comment. The company has said it will not activate full self-driving until it receives regulatory approval, though it remains unknown exactly what certification would be needed. Musk said on Twitter the self-driving beta rollout would be "extremely slow & cautious, as it should."

The company reported its quarterly earnings Wednesday afternoon, posting a $331 million profit. Tesla sold $397 million worth of regulatory credits to other automakers in the quarter, similar to the pattern that has generated its past few quarters' gains. It also touted its full self-driving rollout to the select group of users this week, which it said "will allow the remaining driving features" of its system "to be released." 

Tesla added that "as we continue to collect data over time, the system will become more robust."

"We're starting very slow, and very cautiously, because the world is a complex and messy place," he said. "We'll see how it goes and probably release it to more people this weekend or early next week and then just gradually step it up until we have hopefully a wide release by the end of this year."

The company's stock rose nearly 3% in after-hours trading to $435.50.

Demonstrating the challenges of putting the features into users' hands, in one recent update, some Tesla cars could detect red lights and stop signs but would not proceed through the intersection until the driver confirmed via the accelerator or a steering-wheel stalk that the traffic light was green, according to Tesla.

"The fundamental challenge of neural nets is achieving sufficient reliability to use in a safety-critical system," said Edward Niedermeyer, communications director for the Partners for Automated Vehicle Education (PAVE) campaign, a coalition of nonprofits seeking to help the public better understand driverless technology.

"I'm puzzled as to where the confidence came from almost four years ago that they'd be able to do this," said Niedermeyer, who wrote the 2019 book "Ludicrous: The Unvarnished Story of Tesla Motors." "The reason you do these things is because it's an extremely hard problem, and it's not realistic to solve this problem with some cameras."

Silicon Valley regards autonomous vehicles as the holy grail of transportation's future. In Tesla's case, the vision is that customers could deploy their cars as driverless robotaxis, earning their owners money even when the vehicles would typically be parked in the garage. Autonomy could also shrink the cost of an Uber or Lyft trip to just cents on the mile by eliminating the need to pay a driver.

Several companies are making slow but steady progress on that goal, too. Waymo announced this month it would be launching driverless vehicles in the Phoenix metro area, becoming the first entity to bring the vision of fully autonomous cars to consumers as part of a dedicated ride-hailing service. Last week, Cruise said it would launch driverless cars in San Francisco, becoming the first company to debut unmanned vehicles in such a complex city environment and the country's second-densest metropolis.

And on Wednesday, Cruise announced it was seeking the federal government's permission to put its dedicated driverless vehicle, called the Origin, into use - ushering in the era of self-driving vehicles without steering wheels or pedals.

Tesla's public timeline has been rapid. Musk promised in 2019 that Tesla would have 1 million robotaxis on the road by 2020, a reference to the company's full self-driving ambitions.

Musk said on the call with analysts that Tesla aims to make the feature widely available to owners by the end of the year.

The company's self-driving technology will make use of the eight surrounding-view cameras attached to its cars. Those cameras collect critical data on how to navigate chaotic freeways, labyrinthine city streets and dense traffic.

Musk has said the new software being delivered this week will better capture the view outside the cars and more seamlessly integrate the footage Tesla collects, creating a kind of stitched-together, multidimensional view. It will collect data that the company's engineers can label to help the computers better interpret their surroundings. The cameras would replicate a core function of lidar: seeing what is happening around the cars.

In essence, Tesla is aiming to compensate for its hardware limitations by supercharging its software, almost to create a virtual lidar using Tesla's existing suite of cameras, said Eshak Mir, a former Tesla Autopilot engineer who reviewed and worked with data aimed at training Tesla's neural network.

"They're trying to combine all the feeds from the cameras into one full video and label it in real time," Mir said. "With that, you'll be able to pick up a full sense of depth."

There is no true industry hardware standard for a self-driving car. But before Tesla came along, there was little question that a sophisticated sensor in the vein of lidar was necessary for the redundancy and complex image processing required of self-driving vehicles. Some experts continue to hold that view.

Overcast skies, rain, snowstorms and especially bright sunlight can all challenge cameras' perception. "In normal daylight conditions, the cameras work perfectly fine," Mir said.

"Just from my experience, cameras are very dependable, but at the same time there can be a challenge when there's harsh conditions," added Mir, who supports Tesla's current approach.

But safety advocates objected to Tesla's rollout of features that are still in testing, saying it is dangerous to blur the line between driver assistance and autonomy.

"Public road testing is a serious responsibility and using untrained consumers to validate beta-level software on public roads is dangerous and inconsistent with existing guidance and industry norms," the PAVE campaign said in a statement issued through Niedermeyer. "Moreover, it is extremely important to clarify the line between driver assistance and autonomy. Systems requiring human driver oversight are not self-driving and should not be called self-driving."

For the broader, lidar-equipped fleets, safety setbacks - including a fatal 2018 crash in which a pedestrian was struck by a self-driving Uber - have led to delays and slower timelines for autonomous vehicles as a whole. And some are questioning whether truly driverless vehicles are possible.

"They say that it's just around the corner, but you don't realize that the effort to get just around the corner gets more and more and more [complicated] as you get closer to the corner," said Ted Pavlic, an Arizona State University assistant professor in the School of Computing, Informatics, and Decision Systems Engineering, who works in robotics and autonomous systems.

Companies developing dedicated robocars for ride-hailing purposes, such as Waymo, Amazon-acquired Zoox, Uber and Cruise, all use lidar in their vehicles. They consider lidar a critical element of redundancy capable of making rapid-fire observations in all manner of conditions, filling in gaps where the cameras fall short.

On a recent autonomous vehicle trip in downtown San Francisco, for example, the lidar sensor spotted vehicle traffic over a steep hill before the camera suite or view out the windshield showed it, and the car began making adjustments earlier than a human driver might have. Most testing of autonomous vehicles has been with lidar.

Waymo conducted 1.45 million miles' worth of autonomous vehicle testing in California last year, the company reported to the state Department of Motor Vehicles. Tesla vehicles drove a total of 12.2 autonomous miles, all to record what it called a "demo run" around its Palo Alto headquarters. Tesla has argued that it "has a fleet of hundreds of thousands of customer-owned vehicles that test autonomous technology in 'shadow-mode' during their normal operation," constantly improving through billions of miles of real-world driving. Shadow mode allows it to test some of those automated features without actually activating them in the real world.

Still, Tesla has been dogged by safety concerns, including regulatory investigations and multiple crashes involving Autopilot that have resulted in fatalities and injuries. The National Highway Traffic Safety Administration has said it is looking into more than a dozen incidents involving the Autopilot software. Tesla has also faced lawsuits from the families of victims in Autopilot-related crashes.

Tesla has repeatedly defended the Autopilot system, saying it is merely there to assist the driver, who is ultimately responsible for the safe operation of the car.

Autopilot, Tesla's driver-assistance system that operates like an advanced cruise control function, has been criticized for giving users an exaggerated view of its cars' capabilities. At this stage, the cars are capable of highway driving from on-ramp to off-ramp, self-parking and summoning - where they can navigate to the driver in a crowded parking lot, for example. In cities, Tesla's vehicles can detect traffic lights and stop signs. It is not autonomous, however, and Tesla has faced criticism for giving users the impression the system is capable of driving the car itself - without supervision.

"Autopilot is not an autonomous system and does not make our vehicles autonomous," the company noted in a disclosure to the state DMV. California's vehicle codes, for example, state that autonomous test vehicles must be capable of "performing the dynamic driving task on a sustained basis without the constant control or active monitoring of a natural person."

Pavlic, who works with autonomous systems, recently purchased a Tesla Model 3 but didn't opt for the $8,000 "Full Self-Driving" package.

He said Tesla risks giving users an exaggerated impression of the cars' capabilities with the over-the-air updates, various iterations of Autopilot and "Full Self-Driving" marketing.

"It requires you to be very educated to be able to parse these things," he said. "I would say I can definitely see how someone might think that Autopilot did more than it did . . . as they're rolling out these new features."

Tesla owners are no strangers to the challenges, observing how new and previously unencountered sights can leave their cars confused.

Zlatko Unger, a 36-year-old Model 3 owner who lives in Redwood City, Calif., recalled taking his car to a horse park he frequents on a weekend in late July, when his car detected a hazard it displayed on its info screen.

"I noticed it picked up the piles of poop as [traffic] cones, and I was like, 'Hey, this is not right,' " he said.

 
