Barring a last-minute settlement, a jury in San Jose, Calif., will start hearing evidence in the latest lawsuit to challenge Tesla’s deployment of “Autopilot,” the name Elon Musk chose for technology that allows a car to handle routine braking, steering and turning – but falls short of fully automated driving.
The litigation centers on a fatal 2018 crash that the plaintiffs charge was caused by Tesla engineering that failed to properly account for the challenges of getting a human driver to retake control of a car after delegating driving tasks to software and sensors. Tesla has won such cases in the past by blaming drivers for not heeding instructions to pay attention, no matter what.
Earlier Tuesday, the Insurance Institute for Highway Safety weighed in on the debate over the proper design for so-called “advanced driver assistance systems” like Autopilot. The IIHS tested 14 ADAS systems and gave 11 of them – including Autopilot – “poor” overall safety ratings. Only one system, the Lexus Teammate, deployed at very low volume on the Lexus LS hybrid, earned an “acceptable” score.
The IIHS study concludes that there is no evidence that ADAS systems improve safety. Crash data shows no difference in accident rates between vehicles equipped with ADAS and those without, the IIHS said.
The IIHS report, the ongoing federal investigations of Autopilot and the civil litigation attacking Tesla’s deployment of the technology underscore the problems with the way ADAS systems are being rolled out to consumers.
The IIHS evaluated ADAS systems using safety metrics. But some automakers, such as General Motors, say they consider their ADAS systems to be convenience features, not safety features.
Then there are the names. Autopilot. Super Cruise. ProPILOT Assist. Active Driving Assistant Pro. Blue Cruise. Distronic with Active Steering Assist.
These disparate marketing labels all describe technology that falls short of fully automated driving. Drivers cannot safely engage any of them and check out by taking a nap or playing a video game. But each has different levels of driver monitoring, different behavior when disengaging and different types of warnings to drivers who aren’t paying attention.
The Society of Automotive Engineers pleaded four years ago for standard naming conventions. No such luck.
It’s been 20 years this month since the U.S. Defense Department launched its DARPA Grand Challenge to inspire American engineers to develop self-driving vehicles.
That military science project gave birth to an industry gold rush. It started with visions of widespread deployment of driverless cars by early in this decade. That didn’t happen. Instead, Tesla began charging thousands extra for partial automation technology and rival automakers followed suit. The IIHS and the Tesla trials signal a new phase of tough questions about the way autopilots-for-cars are being sold to the public.
What’s next? Look for calls on U.S. regulators to impose order. European regulators are already steps ahead.