Los Angeles Times
Russ Mitchell

Most driver-assist crashes involved Teslas, new data show. But questions abound

How safe are automated driving systems? Are some safer than others?

Seven years after Tesla began selling cars equipped with what it calls Autopilot, auto safety regulators are still unable to answer these basic and vital questions.

But they took a step toward being able to do so Wednesday with the National Highway Traffic Safety Administration's first report on crashes involving advanced driver assistance systems.

The numbers are suggestive, with Tesla accounting for 70% of all crashes involving "Level 2" driving systems, which include adaptive cruise control plus automated lane-keeping and can encompass more advanced features, such as automatic lane changing. That figure is sure to provide ammunition for critics who say Elon Musk's company has taken a reckless approach to rolling out unproven technology.

But far more detail and context are required before regulators can say definitively whether such systems can outperform human drivers, or one another.

"The data may raise more questions than they answer," NHTSA head Steven Cliff told reporters.

In June 2021, the agency required carmakers to report serious crashes involving Level 2 systems. The numbers released Wednesday reflect crashes that occurred from that point through May 15 of this year.

Of all the crashes involving all vehicles over that period, automakers reported that 392 involved automated driver-assist systems.

Of those, 273 were reported by Tesla, 90 by Honda and 10 by Subaru, with other automakers reporting serious crashes in the single digits.

"These data provide limited insight into hundreds of crashes," said Bryant Walker Smith, a professor who specializes in automated vehicle law at the University of South Carolina School of Law. "But in the same period there were literally millions of other crashes."

But no one should conclude that Level 2 systems are safer than cars operated by human drivers alone, he said. They might be; they might not. The NHTSA data are far too broad to support any such conclusion, he said.

The data don't include the number of automated systems each company has on the road or the total vehicle miles traveled with Level 2 systems engaged. NHTSA had no comment on how thorough each company's reporting procedures might be. The agency plans monthly reports.

Crashes that were prevented by automated systems "are obviously unreported to the extent that they did not occur," Smith said. A deep look into the cause of reported crashes — the roles played by the system, by the driver, by the system's driver monitoring system, and other conditions on the roadway — would help safety regulators reach firm conclusions, he said.

Last year's crash-data reporting order marked NHTSA's first attempt to fill a deep deficit in knowledge about the real-life safety implications of automated vehicle technology on public roads.

Any vehicle maker's automated system could be safer than human drivers. Or less safe. Data rich enough to reach sound conclusions are scant. Crash data collection systems in the U.S. are decades old, inconsistent, still paper-based at many police departments, and utterly unequipped to determine the role automated systems play in preventing or causing crashes.

"One would have hoped that NHTSA would 'do the work' to make the numbers they publish in summaries really be comparable," Alain Kornhauser, head of the driverless car program at Princeton University, said in an email.

Apart from collecting crash data, NHTSA is investigating why Tesla's cars have been crashing into emergency vehicles parked by the roadside, often with their emergency lights flashing.

The probe was prompted by 11 crashes that led to 17 injuries and one death, including three crashes in Southern California. The number of such crashes has since grown to 16. The technology in about 830,000 cars — all Tesla cars sold in the U.S. between 2014 and 2022 — is under investigation.

As part of that investigation, regulators will be looking into the performance of Tesla's automatic emergency braking systems. As the Times reported last year, Tesla drivers report emergency braking problems at a rate far higher than drivers of other makes.

The emergency vehicle probe grew more serious earlier this month, when NHTSA upgraded its status to "EA," for engineering analysis. That designation means investigators will take a deeper look into the technical design and performance of Autopilot. Once an investigation reaches the EA stage, a recall becomes more likely.

Meanwhile, the California Department of Motor Vehicles continues to investigate whether Tesla is falsely marketing its Full Self-Driving feature, a $12,000 option. Experts in the field overwhelmingly agree that the system doesn't come close to being able to drive itself safely.

The DMV review, however, is more than a year old, and the DMV won't say when it might be completed.

State legislators are increasingly concerned about the DMV's seemingly lax approach to Tesla. In December, the chair of the California Senate Transportation Committee, Lena Gonzalez, asked the DMV to provide crash and safety information to the committee. The DMV said it would look into it, and is still looking.

The DMV appears to be allowing Tesla to test self-driving cars on public highways without requiring the company to report crashes or system failures, as is required of competitors such as Waymo, Cruise, Argo and Zoox. DMV head Steve Gordon has declined all media requests to discuss the subject since May 2021.
