Tesla Autopilot probe looks at putting 'innovation over safety'

Autopilot is an advanced driver-assistance feature whose current version does not render vehicles autonomous, Tesla says on its website.

Robin Geoulla had doubts about the automated driving technology in his Tesla Model S when he bought the electric car in 2017.

"It was a little scary to, you know, rely on it and to just, you know, sit back and let it drive," he told a US investigator about Tesla's Autopilot system, describing his initial feelings about the technology.

Geoulla made the comments to the investigator in January 2018, days after his Tesla, with Autopilot engaged, slammed into the back of an unoccupied fire truck parked on a California interstate highway. Over time, Geoulla's initial doubts about Autopilot softened, and he found it reliable when tracking a vehicle in front of him. But he noticed the system sometimes seemed confused when faced with direct sunlight or when a vehicle ahead changed lanes, according to a transcript of his interview with a National Transportation Safety Board investigator.

He was driving into the sun before he rear-ended the fire truck, he told the investigator.

Autopilot's design allowed Geoulla to disengage from driving during his trip, and his hands were off the wheel for almost the entire period of roughly 30 minutes when the technology was activated, the NTSB found.

The US agency, which makes recommendations but lacks enforcement powers, has previously urged regulators at the National Highway Traffic Safety Administration to investigate Autopilot's limitations, potential for driver misuse and possible safety risks following a series of crashes involving the technology, some of them fatal.

"The past has shown the focus has been on innovation over safety and I'm hoping we're at a point where that tide is turning," said the NTSB's new chair, Jennifer Homendy.

She said there is no comparison between Tesla's Autopilot and the more rigorous autopilot systems used in aviation, which involve trained pilots, rules addressing fatigue, and testing for drugs and alcohol. Tesla did not respond to written questions for this story.

Autopilot is an advanced driver-assistance feature whose current version does not render vehicles autonomous, the company says on its website. Tesla says that drivers must agree to keep hands on the wheel and maintain control of their vehicles before enabling the system.

Geoulla's 2018 crash is one of 12 accidents involving Autopilot that NHTSA officials are scrutinizing as part of the agency's farthest-reaching investigation since Tesla Inc introduced the semi-autonomous driving system in 2015.

A front crash prevention test is demonstrated on a Tesla Model 3 at the research center for the US Insurance Institute for Highway Safety and the Highway Loss Data Institute.

Questions raised

Most of the crashes under investigation occurred after dark or in conditions creating limited visibility such as glaring sunlight, according to an NHTSA statement, NTSB documents and police reports.

That raises questions about Autopilot's capabilities during challenging driving conditions, according to autonomous driving experts.

"NHTSA's enforcement and defect authority is broad, and we will act when we detect an unreasonable risk to public safety," an NHTSA spokesperson said.

Since 2016, US auto safety regulators have separately sent 33 special crash investigation teams to review Tesla crashes involving 11 deaths in which advanced driver assistance systems were suspected of being in use. NHTSA has ruled out Autopilot use in three of those non-fatal crashes.

The current NHTSA investigation of Autopilot in effect reopens the question of whether the technology is safe. It represents the latest significant challenge for Elon Musk, the Tesla chief executive whose advocacy of driverless cars has helped his company become the world's most valuable automaker.

Tesla charges customers up to US$10,000 for advanced driver assistance features such as lane changing, with a promise to eventually deliver autonomous driving capability to their cars using only cameras and advanced software. Other carmakers and self-driving firms use not only cameras but more expensive hardware including radar and lidar.

Musk has said a Tesla with eight cameras will be far safer than human drivers.

But the camera technology is affected by darkness and sun glare as well as inclement weather conditions such as heavy rain, snow and fog, experts say.

NHTSA documents show regulators also want to know how Tesla vehicles attempt to see flashing lights on emergency vehicles.

"Today's computer vision is far from perfect and will be for the foreseeable future," said Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University.
