For most people, self-driving cars are not real. They are theoretical devices hyped in the media and by investors hoping to cash in at some future date when the technology matures, a date which continues to be indefinitely postponed.
But for Brian Wiedenmeier, the distinction between self-driving fiction and reality is not so stark. "I encounter an autonomous vehicle on the street just about every day," Wiedenmeier told Motherboard. "I see them walking to the grocery store, biking to the park for some exercise."
Wiedenmeier lives in San Francisco, where hundreds of self-driving cars are being tested on the city's roads with the permission of state regulators. Currently, three companies do most of the testing: Waymo, owned by Google's parent company Alphabet; Cruise, owned by General Motors; and Amazon's Zoox. In San Francisco, all three companies keep safety drivers in the cars, although observers outside the vehicle, like Wiedenmeier, can never tell whether the human or the computer is driving at any given moment.
Wiedenmeier has a unique perspective on the self-driving car issue in part because of where he lives, but also because of what he does. As the executive director of the San Francisco Bicycle Coalition, he leads an advocacy group fighting to make streets safer for pedestrians and cyclists. He says all three of the major testers have come to him asking for feedback on how their vehicles deal with vulnerable road users like bicyclists and pedestrians. This puts Wiedenmeier at the nexus of the promise of self-driving cars versus the reality.
The question of whether autonomous vehicles (AVs) can help make roads safer, especially for pedestrians and cyclists, is a deeply divisive one. On the one hand, safety has always been a key plank in the popular case for self-driving cars. The National Highway Traffic Safety Administration (NHTSA) estimates there were 38,680 deaths on U.S. roads last year, up from the roughly 36,000 deaths NHTSA has regularly reported in previous years. Of the 2020 estimate, 6,236 were pedestrians and 891 were cyclists.
This vast annual toll on human life tends to be attributed to "human error," an attribution that predates the self-driving vehicle craze but is nevertheless tailor-made to hype it. NHTSA's own website on "Automated Vehicles for Safety" claims one of the main "Benefits of Automation" is safety, because "Automated vehicles' potential to save lives and reduce injuries is rooted in one critical and tragic fact: 94% of serious crashes are due to human error."
The problem with this statistic is that it's a misrepresentation at best and flat-out wrong at worst. As Streetsblog previously reported, the number comes, ironically, from a 2015 NHTSA paper that surveyed crash reports from 2005 to 2007. The 94 percent figure refers to the percentage of crashes for which the driver was assigned the "critical reason" for the crash. But as the report itself states, this merely means the driver's action was "the last failure in the causal chain of events leading up to the crash" and "is not intended to be interpreted as the cause of the crash." Anyone who has ever been in a serious car crash knows they are often complex chains of events in which the last thing to happen is rarely the most important.
For example, consider some recent cyclist deaths on San Francisco streets. In March 2019, Tess Rothstein swerved to avoid an opening passenger door on Howard Street when she was hit and killed by a box truck. In May 2020, Devlin O'Connor was doored, thrown from his bicycle, and then run over and killed by a truck. In each of those cases, any reasonable observer would likely conclude the person carelessly opening a door into an oncoming cyclist was the main cause of the crash, but under the "critical reason" definition, it would not be. Instead, the "critical reason" would be whatever failure caused the driver of each truck not to avoid Rothstein or O'Connor. And while human carelessness was a key cause of both crashes, bad road design, in which car doors open into active bike lanes, is the main culprit.
This kind of overly simplistic logic—insensitive to the vagaries of traffic violence and to street designs that turn predictable human carelessness into life-ending tragedy—is exactly what leads safe streets advocates and AV skeptics to reject the idea that self-driving cars are a solution to traffic violence, especially given current technological limitations.
"While it is theoretically possible for AVs to be safer for vulnerable road users, we just are not there yet with the technology," said Missy Cummings, professor of engineering at Duke University. "I actually bike quite a bit, and even though humans are horrible drivers, I know what to expect from this set of bad drivers. At the present time, I have no idea what AVs might do and this is where the real problem lies."
But it is precisely in the ways humans are horrible drivers that Waymo engineers see the benefit of their technology. "There's no drunk driving, there's no distracted driving, your autonomous vehicle isn't going to text and drive at the same time," said Waymo engineer Anne Dorsey. These behaviors certainly lead to erratic driving even the most seasoned cyclist couldn't anticipate. Yet some crashes are avoided by pedestrians or cyclists yelling at drivers to warn them, making eye contact to signal intentions, and other human ways of communicating or offering visual cues, something autonomous vehicles aren't capable of.
This suspicion of self-driving cars is only heightened by the fact that the only known fatality from an autonomous vehicle was a pedestrian. In 2018, a carelessly engineered Uber test vehicle, overseen by an inattentive safety driver, struck and killed Elaine Herzberg in Tempe, Arizona.
Herzberg's death was a reckoning of sorts for the self-driving car industry; Uber gradually backed away from its efforts and ultimately sold the subsidiary off for a fraction of what it had poured into it. In the years since, the industry has rapidly consolidated, and Waymo is widely regarded as the industry leader. It is currently the only company running a fully driverless taxi service, although the service is limited to suburban-style roads outside Phoenix.
Wiedenmeier says he has noticed a big change in how AV companies approach their testing since Herzberg's death. He recalls being horrified by Uber's AV test vehicles, which often made rudimentary errors like blowing through red lights, making illegal turns, and failing to yield to pedestrians at crosswalks. But in recent years, he hasn't experienced anything like that with the current crop of AV testing, nor has he heard reports of anything similar in San Francisco. He describes the AVs on the road today as "more on the cautious side." Wiedenmeier thinks the main motivation for this caution is the billions of dollars at stake during this nascent period, when public opinion about AVs is still being formed.
What little information the public has on AV performance seems to back up Wiedenmeier's observations. In October 2020, Waymo published road safety performance data from its trips in Arizona. In that report, Waymo disclosed just one event involving a "vulnerable road user": a pedestrian walking at 2.7 miles per hour bumped into the side of a stationary Waymo vehicle, and no injuries were reported. But the report doesn't include any data from outside the Phoenix area, where there might be more pedestrians and cyclists, and Waymo declined to provide any update to the report.
But it's still not clear that even supervised autonomous vehicles are safer than human drivers. As dangerous as human drivers are, fatal crashes occur approximately once every 100 million miles traveled on the types of urban roads where vehicles and vulnerable road users typically interact, according to NHTSA. Waymo, which has done the most on-road testing of any autonomous vehicle company, has driven approximately 20 million autonomous miles, meaning that, simply by the law of averages, we still have a long way to go before learning whether the "Waymo Driver," as the company calls the software that controls its vehicles, is any better than the average human driver in terms of fatality rates. (Despite its marketing practices, Tesla does not offer a self-driving product.)
Indeed, much of the divisiveness and suspicion around AVs revolves around the fact that everything we know about their safety is reported by the AV companies themselves. And the little data they're required to submit to regulators isn't particularly insightful. The California Department of Motor Vehicles requires all AV testing companies to submit annual reports about each time the autonomous system was disengaged for safety reasons, but each incident is self-reported and says nothing about what actually happened. For example, Waymo describes several disengagements as "Disengage for unwanted maneuver of the vehicle that was undesirable under the circumstances." Others are simply described as "Disengage for incorrect behavior prediction of other traffic participants." Cruise and Zoox use similarly unhelpful descriptions. In the absence of any neutral third party auditing safety records or taking other steps to certify their safety, people are left to their intuitions about how to feel about AVs.
As a result, Waymo appears to be engaged in a years-long campaign to win the trust of these vulnerable road users, not just through press releases, but through each vehicle interaction. Dorsey said Waymo engineers don't just think about how to pass cyclists safely, but how to do so in a way that makes cyclists feel comfortable. When she moved to the Bay Area, Dorsey said she stopped biking regularly because the road design coupled with driver behavior made her uncomfortable. That experience made her realize that it's not just safety that matters, but comfort too.
"When we go past a cyclist, we might give them a little extra buffer than we really truly need to be physically safe. We might add some extra space in there because it might make them feel more comfortable and feel a little safer," Dorsey said. "We will choose where we drive relative to the cyclists to ensure that we're totally visible to them, that they're aware of our intention and they know what we want to do." By way of example, Dorsey talked about how Waymos are programmed not just to avoid right hooks—when a car goes past a cyclist only to make a right turn across their path—but to avoid positioning itself so cyclists are worried the car may execute a right hook.
I must admit, I am not an unbiased observer in this debate. My bicycle has been my main mode of transportation since I moved to Brooklyn in 2014. Since then, I have been doored once, hit twice, run off the road four times, and had more near-misses and close calls than I can possibly recount, a testament to the fact that the fatality data will only get you so far in measuring safety on U.S. roads. Drivers have told me to get onto the sidewalk where I "belong" (which would be illegal) and that designated bike lanes are in fact mislabeled parking lanes (they are not). One man threatened to kill me for "almost" touching his Mercedes with my handlebars. Another told me, after he slammed on the brakes to avoid hitting me while I crossed the street when I had the light, that next time he wouldn't be so "kind." He was holding his cell phone in one hand as he told me this, the screen still glowing from a half-composed text.
Which is to say, I am reminded on a daily basis of the fallibility at best and malice at worst that human drivers exhibit. I know the first, best, and quite probably only solution to this pervasive problem is a robust and extensive network of protected bike lanes that separate me and other cyclists from multi-ton chunks of metal capable of accelerating to 60 miles per hour in the time it takes me to cross an intersection.
But I also know that is only a realistic possibility in dense urban areas and town centers. Pedestrians and cyclists are regularly hit and killed all over the country, including in areas where the problem is not a lack of protected bike lanes, but roads without sidewalks or crosswalks for miles where cars are allowed to go 50 or 60 miles per hour. Indeed, road safety is not just an urban problem. Approximately 45 percent of road fatalities last year occurred in rural areas (NHTSA doesn't break down pedestrian and cyclist deaths by area type). I thought about all this while Dorsey talked about the Waymo driver giving cyclists a little extra space for comfort. If nothing else, I thought that sounded nice.
I told Wiedenmeier about how conflicted I am about the potential for AVs to improve road safety combined with my skepticism that potential will ever be realized. So I asked him: if forced to choose between biking next to an autonomous vehicle right now or a randomly selected licensed human driver, which would he pick?
Wiedenmeier sighed, the kind only issued by people who have just been asked a question they hate. "If I was forced to be biking next to a vehicle, yeah, I would, I guess, I would slightly prefer an autonomous vehicle that's being tested," he concluded, "not because I trust the technology at this point. But because I know that the companies who have invested dollars into this testing and deployment are being extra careful, because all it would take is one collision."