Fatal 'Driverless' Tesla Crash Raises Questions About Company's Autopilot System

The National Highway Traffic Safety Administration has sent a Special Crash Investigation team to the site. “We are actively engaged with local law enforcement and Tesla to learn more about the details of the crash and will take appropriate steps when we have more information,” a spokesperson for the agency told CR in an email. Herman told CR that the National Transportation Safety Board (NTSB) also will investigate, and that investigators have requested access to the Tesla’s event data recorder, also known as a “black box,” which can offer more information about how fast the vehicle was traveling, which systems were engaged, and whether a driver was in the front seat.

"We hope to get all that through search warrants and subpoenas,” he told CR. “It would include black box data and any relevant cloud data that Tesla has."

Tesla has indicated that it uses the driver-facing in-cabin camera in its Model 3 and Model Y vehicles to monitor drivers and to ban them from using certain vehicle systems if the camera detects abuse, but the Model S lacks an in-car camera. Funkhouser points out that banning a driver after the fact does not stop them from abusing the system in the first place—when it could save lives.

“If there is any possibility of a driver misusing a system, auto manufacturers need to put in safeguards,” she says. “At a minimum, the systems should be able to detect if there is a human in the driver’s seat, that they are awake, and that they are looking forward.”

Joe Young, spokesman for the Insurance Institute for Highway Safety (IIHS), told CR that all vehicles that incorporate automation should include driver monitoring. But he also said that the very name “Autopilot” may mislead drivers: IIHS research has shown that the names automakers give advanced driving systems play a big part in what drivers believe those systems can do. “They need to be named properly, otherwise people may think they're something they're not,” he tells CR.

Sam Abuelsamid, principal analyst at Guidehouse Insights, a market research firm, agrees. He points to videos of Tesla CEO Elon Musk doing TV interviews in moving Tesla vehicles without his hands on the wheel, videos that Musk retweeted of Tesla fans not paying attention while using Autopilot, and frequent promises that Tesla vehicles would be capable of full autonomy and used by owners as “robotaxis.”

“While the Tesla fine print says the driver is responsible and must keep hands on the wheel and eyes on the road, the actions of the CEO have repeatedly contradicted this,” he told CR. “Combined with mainstream media repeating his claims without any question and calling Teslas ‘self-driving’ or ‘autonomous’ in headlines, many ordinary people have the impression that the vehicles are more capable than they are.”

Vehicle automation experts took to Twitter in the immediate aftermath of the crash to voice their concerns about Autopilot.

Missy Cummings, a former fighter pilot who now directs the humans and autonomy lab at Duke University in Durham, N.C., called the crash “inevitable,” and suggested that simple technology could prevent the problem. “If we can sense whether there is weight in the front right seat and turn off the airbag, we can sense when no weight is in the driver seat & stop the car,” she wrote.

Carnegie Mellon professor Costa Samaras, who researches autonomy and climate change, suggested that Tesla change the name of Autopilot to avoid misleading drivers. “This tragedy was potentially avoidable. These aren't driverless vehicles. Driverless vehicles do not yet exist,” he wrote.

Last year, the NTSB warned that drivers are placing too much trust in Autopilot, and that federal regulators aren’t doing enough to make sure automakers are deploying their systems safely. In its report, the NTSB argued that Tesla has not taken adequate action to prevent drivers from abusing Autopilot, and that NHTSA—which writes and enforces vehicle safety regulations—must set standards that could prevent fatalities from happening instead of just investigating them afterward.

“The NTSB, CR, and other safety experts have been really clear about the hazards and how to address them. It’s been several years. Tesla still hasn’t put needed safeguards in place and NHTSA hasn’t forced the company to do so,” says William Wallace, manager of safety policy at CR. “Every day without action is another day we might see a preventable crash and loss of life. It's incredibly hard to trust Tesla at this time to put safety first, so it’s especially urgent for NHTSA to take action now.”
