An employee drives a Tesla Motors Inc. Model S electric automobile, equipped with Autopilot hardware and software, hands-free on a highway. (Jasper Juinen | Bloomberg | Getty Images)
Two consumer safety groups are calling for federal and state investigations of Tesla’s semi-autonomous technology in the wake of several fatal crashes linked to the system earlier this year.
The investigations could pose a major threat to the California electric vehicle maker, as Tesla CEO Elon Musk has promised that a fully autonomous version of the company’s technology, called Autopilot, will be released this year. He said during a conference call this week that Tesla expects to generate significant revenue from fleets of “robotaxis” it intends to roll out in 2020 using Autopilot.
“We feel Tesla violates the laws on deceptive practices, both at the federal and state level,” said Jason Levine, the head of the Washington, D.C.-based Center for Auto Safety, one of two groups that have called for an investigation of both the Autopilot system and Tesla’s promotion of the technology.
The CAS, along with California’s non-profit Consumer Watchdog, pointed to a number of crashes, injuries and deaths that have occurred over the last several years involving Tesla vehicles operating in Autopilot mode. That includes one in May in which a Model S sedan slammed into a parked police car. Two months earlier, a driver was killed when his Model 3 sedan ran under a semi-trailer in Delray Beach, Florida, shearing off the car’s roof.
First introduced in October 2015, Autopilot is what the industry calls an Advanced Driver Assistance System, or ADAS. A number of other manufacturers have launched similar technologies, such as Cadillac’s Super Cruise and Audi’s Traffic Jam Pilot. These systems can, under very limited circumstances, let a driver briefly take their hands off the wheel, but they always require the motorist to be ready to retake control immediately in an emergency.
But a recent study by the Insurance Institute for Highway Safety found that a sizable percentage of owners misunderstand the capabilities of these systems, especially their limitations. The survey of 2,000 owners found that to be particularly true of Autopilot: fully half of the respondents thought Autopilot allowed a driver to take their hands completely off the wheel. For similar systems, the figure ranged from 20% to just over 30%.
The Autopilot name itself is misleading, the IIHS said in its analysis, because it “signals to drivers that they can turn their thoughts and their eyes elsewhere.”
In the March 1 crash in Florida, the National Transportation Safety Board determined the driver switched on Autopilot 10 seconds before impact and didn’t have his hands on the wheel for the final eight seconds.
The agency has made similar findings in other crashes, several of them also fatal.
For its part, Tesla has defended Autopilot. In a statement released in May, it said, “As our quarterly safety reports have shown, drivers using Autopilot register fewer accidents per mile than those driving without it.”
That claim has not been borne out by independent research, however, and Tesla has had to walk back assertions that its safety record was supported by the National Highway Traffic Safety Administration.
The automaker also said in a statement that there is nothing about the name Autopilot that should mislead consumers.
“Presumably they are equally opposed to the name ‘Automobile,’” the statement suggested. The company also argued that it has gone to great lengths to make consumers aware of the limits of the system, in its owner’s manuals, on its website and elsewhere.
CAS’s Levine dismissed such claims as “legalese,” citing the many ways Tesla and Musk have promoted the system. That includes pictures released soon after Autopilot debuted showing Musk and his then-wife driving off with their hands waving out the windows of a Tesla vehicle. Musk also appeared to imply the system could work hands-free during a December 2018 interview on the CBS newsmagazine “60 Minutes.”
“They can say they’ve written language to cover their liabilities but their actions portray a desire to deceive consumers,” said Levine, in an interview.
Together with Consumer Watchdog, the Center wants both the Federal Trade Commission and the California Department of Motor Vehicles to launch immediate probes. The groups contend the automaker violated Section 5 of the FTC Act, as well as California consumer law, arguing that the way Tesla markets Autopilot is “materially deceptive and … likely to mislead consumers into reasonably believing that their vehicles have self-driving or autonomous capabilities.”
Despite such concerns, Tesla has been working to update the Autopilot system, and earlier this month Musk repeated his promise to introduce a “full self-driving” version before the end of the summer. Last year, the company rolled back an update that was supposed to allow true hands-free operation amid criticism that it fell short of expectations.
But, during a conference call with analysts and investors on Tuesday, Musk said the upcoming upgrade “will be quite compelling.”
The CEO has promised to put as many as 1 million robotaxis on the road by 2020, a direct challenge to such ride-sharing services as Uber and Lyft that are working on their own self-driving technologies.
Musk has indicated that the robotaxi service would provide a new source of revenue for the company. On Wednesday, Tesla posted an adjusted loss of $1.12 a share for the second quarter, far wider than the 40-cent loss analysts surveyed by Refinitiv were expecting, and shares have fallen sharply since the report. Anything that disrupts the robotaxi program could complicate Tesla’s struggle to turn its finances around.
“There is no question the (Autopilot) technology is impressive,” said CAS chief Levine. But Tesla’s continued reliance on what he called “hyperbolic statements,” he argued, misleads consumers and poses serious safety risks.