Self-driving cars are not especially scary. They don’t suddenly turn on their headlights and accelerate into teenagers à la Christine, and outside of a few pilot programs in places like San Francisco, Phoenix, and Pittsburgh, they don’t really roam the streets. Instead, autonomously driven cars remain an abstraction. There are only hundreds of them across the country, and we mostly hear about them when they crash or cause a fatality — something that has happened four times in the U.S. over the past three years. Nonetheless, people are afraid: a survey conducted last year by AAA found that 71 percent of respondents were fearful of fully self-driving cars.
A publicly broadcast board meeting of the National Transportation Safety Board (NTSB) on Tuesday gives some explanation as to why. The meeting concerned a March 2018 collision in the San Francisco Bay Area between a Tesla Model X SUV and a highway barrier. The crash resulted in the death of the Tesla’s driver, a married father of two named Walter Huang, who worked as an engineer at Apple. According to the NTSB’s investigation, Huang appears to have been playing a video game on his phone at the time of the crash, and his Tesla’s so-called “Autopilot” feature was engaged. The NTSB’s conclusion was that while the driver bore some responsibility for the accident, Tesla was also culpable. In his opening remarks, NTSB chairman Robert Sumwalt singled out Tesla’s unwillingness to work with regulators, alongside consumers’ blithe and irresponsible use of self-driving features in their cars.
“If you own a car with partial automation, you do not own a self-driving car. So don’t pretend that you do,” Sumwalt said. “This means that when driving in the supposed self-driving mode you can’t sleep. You can’t read a book. You can’t watch a movie or TV show. You can’t text. And you can’t play video games. Yet that’s precisely what we found that this driver was doing.” He further noted that of the six companies asked to respond to NTSB recommendations on self-driving safety last year, Tesla was the only one not to respond.
CEOs and tech visionaries once promised that we’d all be riding in fully driverless Ubers by 2030, but that timeline can now be considered pretty much discredited. What is taking place instead is a more gradual adjustment, with the introduction of technologies that take progressively more tasks away from the human driver. Though fears of self-driving cars have prompted legitimate bigger-picture questions about labor, data collection, and public mobility, those will matter more in the coming decades, not years. The most immediate concerns, as evidenced by the NTSB’s findings and Sumwalt’s remarks, are about safety, the technologies already being put into our cars, and, most critically, how people use them.
Part of the problem is that Autopilot has helped set the standard for advanced driver assist tools, even when they aren’t branded in the extreme way that Autopilot has been. Just as carmakers were pushed by Tesla into developing more all-electric vehicles (and into rolling out huge, distracting touchscreen consoles on dashboards), so too have they been pushed to introduce advanced driver assistance technologies. Subaru, Nissan, Ford, and Chevrolet all offer some form of adaptive cruise control, for example, a tool that can self-modulate speed and braking in highway traffic. On an interstate highway drive last summer, in a car with such a feature, I easily drove over 60 miles without my foot once touching the gas or brake pedals. The car braked smoothly when I came up behind someone going more slowly, and then it sped back up once it had some distance. I didn’t have to do a thing, besides keep my hands on the wheel. Put simply, it rocked.
Sumwalt, to his credit, did not simply lay the blame on consumers. He also chided the National Highway Traffic Safety Administration (NHTSA) for failing to lay out clear rules on implementing and using tools like Tesla Autopilot and the many other advanced driver assistance technologies now offered in new cars from all the major carmakers. One NTSB board member noted that the NHTSA appeared more focused on getting innovative technologies into the driveways of consumers as quickly and cheaply as possible than on safety.
Since it was introduced in 2014, Tesla’s Autopilot feature has offered a taste of what self-driving cars might one day accomplish. Prospective Tesla buyers have the option of paying additional money for a car pre-installed with more advanced “Full Self-Driving Capability,” referring to hardware components that will one day be used when the software of automated driving catches up. (This perceived future advantage is, when you boil it down, also the strongest part of a flimsy case for Tesla’s sky-high stock price.) Though Autopilot ostensibly requires drivers to keep their hands on the wheel at all times, it can be gamed fairly easily, as evidenced by the many viral videos of people (including Elon Musk himself) using it improperly.
This mixed messaging about how Autopilot should be used (it is, after all, literally called Autopilot) also came under scrutiny at the Tuesday meeting. “It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars,” Sumwalt said.
The mass adoption of tools like adaptive cruise control, however, does not rock. Another study from AAA, released last year, found that drivers using such systems were “nearly twice as likely to engage in distracted driving.” That is, they were lulled into a false sense of security, and perhaps began to play Candy Crush or figure out which podcast they should queue up next.
Car owners themselves praise such features, saying they have prevented accidents, and the self-driving carmakers (especially Tesla) have long claimed that these advanced tools reduce collisions in aggregate. In the view of Congressional Democrats and the NTSB (which can only issue policy recommendations for the NHTSA to adopt), this amounts to using humans as “test dummies” for autonomous driving technology, in the words of Sen. Ed Markey, who has called for a ban on Autopilot until Tesla figures out a way to stop people from tricking the system into thinking they’re focused on the road.
Predictably, Tesla disagrees. “Autopilot can greatly enhance occupant safety, but… the driver is ultimately responsible for the safe operation of his vehicle,” Tesla responded to Markey in a statement to VICE last month. The company has long cited figures showing that the collision rates of drivers using Autopilot are lower than the norm, but there are good reasons to be skeptical of Tesla’s internal data.
Many of the new driver assist technologies being introduced, including adaptive cruise control, can absolutely make the road a safer place. They can also make driving, which is often quite unpleasant, a little less unpleasant. Automatic parallel parking, automatic detection (and correction) of drifting into other lanes — these are relatively benign and obvious improvements. But as the NTSB determined on Tuesday, competing corporate incentives have gotten in the way.
At the end of the Tuesday meeting, Sumwalt called on NTSB staff to issue the nine recommendations from their report on the Bay Area Tesla crash. They were mostly requests for Tesla, specifically, to comply with NTSB requests for cooperation, and for the NHTSA to expand its regulation of self-driving cars and of the ways autonomous features are introduced. There are no penalties the NTSB can issue for non-compliance. And if the recent past is any indication of the near-ish future, there’s not much reason to expect either to happen.