Human drivers are to blame for most serious Waymo collisions
I took a careful look at Waymo’s latest crash data.
It’s Autonomy Week! This is the second of five articles exploring the state of the self-driving industry.
I’m a journalist with a master’s degree in computer science.
On a Friday evening last November, police chased a silver sedan across the San Francisco Bay Bridge. The fleeing vehicle entered San Francisco and went careening through the city’s crowded streets. At the intersection of 11th and Folsom streets, it sideswiped the fronts of two other vehicles, veered onto a sidewalk, and hit two pedestrians.
According to a local news story, both pedestrians were taken to the hospital with one suffering major injuries. The driver of the silver sedan was injured, as was a passenger in one of the other vehicles.
No one was injured in the third car, a driverless Waymo robotaxi. Still, Waymo was required to report the crash to government agencies. It was one of 22 crashes with injuries that Waymo has reported through June. And it’s the only crash Waymo has classified as causing a serious injury.
Twenty-two injury crashes might sound like a lot, but Waymo’s driverless cars have traveled more than 22 million miles. So driverless Waymo taxis have been involved in about one injury-causing crash for every million miles of driving—a much better rate than a typical human driver.
On Thursday, Waymo released a new website to help the public put statistics like this in perspective. Waymo estimates that typical drivers in San Francisco and Phoenix—Waymo’s two biggest markets—would have caused 63 injury-causing crashes over 21.2 million miles.1 So Waymo vehicles get into injury-causing crashes about one-third as often, per mile, as human-driven vehicles.
Waymo claims an even more dramatic improvement for crashes serious enough to trigger an airbag. Driverless Waymos have experienced just five crashes like that, and Waymo estimates that typical human drivers in Phoenix and San Francisco would have experienced 31 airbag crashes over 21.2 million miles. That implies driverless Waymos are one-sixth as likely as human drivers to experience this type of crash.
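For readers who want to check the arithmetic, here is a short back-of-the-envelope script using the figures above. It is my own illustrative sketch, not Waymo’s methodology; note that one of the 22 injury crashes happened in Los Angeles and is excluded from the San Francisco/Phoenix comparison (see the footnote at the end of this article).

```python
# Back-of-the-envelope comparison using the numbers quoted in this article.
# The crash counts and the 21.2 million-mile figure come from Waymo's analysis;
# this script only illustrates the arithmetic.

WAYMO_MILES = 21_200_000  # driverless miles in San Francisco and Phoenix

comparisons = {
    # crash type: (Waymo crashes, estimated human-driver crashes over the same miles)
    "injury": (21, 63),  # the 22nd injury crash happened in Los Angeles and is excluded
    "airbag": (5, 31),
}

for crash_type, (waymo, human) in comparisons.items():
    waymo_rate = waymo / WAYMO_MILES * 1_000_000  # crashes per million miles
    human_rate = human / WAYMO_MILES * 1_000_000
    reduction = 1 - waymo / human
    print(f"{crash_type}: Waymo {waymo_rate:.2f} vs humans {human_rate:.2f} per million miles "
          f"({reduction:.0%} fewer crashes)")
```

Running this prints roughly 67 percent fewer injury crashes and 84 percent fewer airbag crashes, the same figures discussed below.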
The new data comes at a critical time for Waymo, which is rapidly scaling up its robotaxi service. A year ago, Waymo was providing 10,000 rides per week. Last month, Waymo announced it was providing 100,000 rides per week. We can expect more growth in the coming months.
So it really matters whether Waymo is making our roads safer or more dangerous. And all the evidence so far suggests that it’s making them safer.
It’s not just the small number of crashes Waymo vehicles experience—it’s also the nature of those crashes. Out of the 25 most serious Waymo crashes, 17 involved a human driver rear-ending a Waymo. Three others involved a human-driven car running a red light before hitting a Waymo. There were no serious crashes where a Waymo ran a red light, rear-ended another car, or engaged in other clear-cut misbehavior.
Digging into Waymo’s crashes
In total, Waymo has reported nearly 200 crashes, which works out to about one crash every 100,000 miles. Waymo says 43 percent of crashes across San Francisco and Phoenix had a delta-V of less than 1 mph—in other words, they were very minor fender-benders.
But let’s focus on the 25 most severe crashes: those that either caused an injury, caused an airbag to deploy, or both. These are good crashes to focus on not only because they do the most damage, but because human drivers are more likely to report these types of crashes, making it easier to compare Waymo’s software to human drivers.
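The filtering step itself is simple. The sketch below shows the kind of filter I mean; the file name and column names are hypothetical and do not correspond to the actual field names in Waymo’s federal crash reports.

```python
import csv

# Hypothetical sketch: pull out the most severe crashes, defined as in this
# article (any reported injury, an airbag deployment, or both). The column
# names and file name are illustrative only.
INJURY_LEVELS = {"minor", "moderate", "serious", "fatal"}

def is_severe(row: dict) -> bool:
    injury = row.get("highest_injury_severity", "").strip().lower()
    airbag = row.get("airbag_deployed", "").strip().lower() == "yes"
    return injury in INJURY_LEVELS or airbag

with open("waymo_crash_reports.csv", newline="") as f:  # hypothetical file
    severe = [row for row in csv.DictReader(f) if is_severe(row)]

print(f"{len(severe)} severe crashes out of roughly 200 reported")
```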
A large majority of these—17 crashes in total—involved another car rear-ending a Waymo. Some were quite severe: three triggered airbag deployments and one caused a “moderate” injury. One driver rammed the Waymo a second time while fleeing the scene, prompting Waymo to sue the driver.
There were three crashes where a human-driven car ran a red light before crashing into a Waymo:
One was the crash I mentioned at the top of this article. A car fleeing the police ran a red light and slammed into a Waymo, another car, and two pedestrians, causing several injuries.
In San Francisco, a pair of robbery suspects fleeing police in a stolen car ran a red light “at a high rate of speed” and slammed into the driver’s side door of a Waymo, triggering an airbag. The suspects were uninjured and fled on foot. The Waymo was thankfully empty.
In Phoenix, a car ran a red light and then “made contact with the SUV in front of the Waymo AV, and both of the other vehicles spun.” The Waymo vehicle was hit in the process, and someone in one of the other vehicles suffered an injury Waymo described as minor.
There were two crashes where a Waymo got sideswiped by a vehicle in an adjacent lane:
In San Francisco, a Waymo was stopped at a stop sign in the right lane when another car hit it while passing on the left.
In Tempe, Arizona, an SUV “overtook the Waymo AV on the left,” then “initiated a right turn,” cutting the Waymo off and causing a crash. A passenger in the SUV said they suffered moderate injuries.
There were two crashes where another vehicle turned left across the path of a Waymo vehicle:
In San Francisco, a Waymo and a large truck were approaching an intersection from opposite directions when a bicycle behind the truck made a sudden left in front of the Waymo. Waymo says the truck blocked the Waymo’s view of the bicycle until the last second. The Waymo slammed on its brakes but wasn’t able to stop in time. The San Francisco Fire Department told local media that the bicyclist suffered only minor injuries and was able to leave the scene on their own.
A Waymo in Phoenix was traveling in the right lane. A row of stopped cars was in the lane to its left. As the Waymo approached an intersection, a car coming from the opposite direction made a left turn through a gap in the row of stopped cars. Again, Waymo says the row of stopped cars blocked it from seeing the turning car until it was too late. A passenger in the turning vehicle reported minor injuries.
The final crash involved a Waymo vehicle being hit on the passenger side as it made an unprotected left turn. Waymo says that the oncoming vehicle had been driving in a bike lane. It had been obscured by a line of cars in the main travel lane to the left of the bike lane, so the Waymo didn’t see it coming until it had already begun its left turn.
It’s conceivable that Waymo was at fault in these last three cases—it’s impossible to say without more details. It’s also possible that erratic braking by Waymo contributed to a few of those rear-end crashes. Still, it seems clear that a non-Waymo vehicle bore primary responsibility for most, and possibly all, of these crashes.
“About as good as you can do”
One should always be skeptical when a company publishes a self-congratulatory report about its own safety record. So I called Noah Goodall, a civil engineer with many years of experience studying roadway safety, to see what he made of Waymo’s analysis.
“They've been the best of the companies doing this,” Goodall told me. He noted that Waymo has a team of full-time safety researchers who publish their work in reputable journals.
Waymo knows exactly how often its own vehicles crash because its vehicles are bristling with sensors. The harder problem is calculating an appropriate baseline for human-caused crashes.
That’s partly because human drivers don’t always report their own crashes to the police, insurance companies, or anyone else. But it’s also because crash rates differ from one area to another. For example, there are far more crashes per mile in downtown San Francisco than in the suburbs of Phoenix.
Waymo tried to account for these factors as it calculated crash rates for human drivers in both Phoenix and San Francisco. To ensure an apples-to-apples comparison, Waymo’s analysis excludes freeway crashes from its human-driven benchmark, since Waymo’s commercial fleet doesn’t use freeways yet.
Waymo estimates that human drivers fail to report 32 percent of injury crashes; the company raised its benchmark for human crashes to account for that. But even without this under-reporting adjustment, Waymo’s injury crash rate would still be roughly 51 percent below that of human drivers. The true number is probably somewhere between the adjusted number (67 percent fewer crashes) and the unadjusted one (51 percent fewer crashes). It’s an impressive figure either way.
Waymo says it doesn’t apply an under-reporting adjustment to its human benchmark for airbag crashes, since humans almost always report crashes that are severe enough to trigger an airbag. So it’s easier to take Waymo’s figure here—an 84 percent decline in airbag crashes—at face value.
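Here is the arithmetic behind those two injury-crash figures, spelled out as a short script. The 32 percent under-reporting estimate is Waymo’s; the crash counts are the ones used throughout this article.

```python
# The under-reporting adjustment, spelled out. Waymo's adjusted benchmark of
# 63 human injury crashes already assumes 32% of such crashes go unreported;
# multiplying by 0.68 recovers a benchmark based only on reported crashes.

UNDERREPORT_RATE = 0.32
waymo_injury_crashes = 21   # San Francisco and Phoenix, per this article's recount
adjusted_benchmark = 63     # Waymo's estimate, scaled up for under-reporting

unadjusted_benchmark = adjusted_benchmark * (1 - UNDERREPORT_RATE)  # about 42.8

print(f"vs adjusted benchmark:   {1 - waymo_injury_crashes / adjusted_benchmark:.0%} fewer crashes")
print(f"vs unadjusted benchmark: {1 - waymo_injury_crashes / unadjusted_benchmark:.0%} fewer crashes")
# Prints roughly 67% and 51%, the two ends of the range discussed above.
```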
Waymo’s benchmarks for human drivers are “about as good as you can do,” Goodall told me. “It's very hard to get this kind of data.”
When I’ve talked to other safety experts, they’ve been equally positive about the quality of Waymo’s analysis. For example, last year I asked Phil Koopman, a professor of computer engineering at Carnegie Mellon, about a previous Waymo study that used insurance data to show its cars were significantly safer than human drivers. Koopman told me Waymo’s findings were statistically credible, with some minor caveats.
Similarly, David Zuby, the chief research officer at the Insurance Institute for Highway Safety, had mostly positive things to say about a December study analyzing Waymo’s first 7.1 million miles of driverless operations.
I found a few errors in Waymo’s data
If you look closely, you’ll see that one of the numbers in this article differs slightly from Waymo’s safety website. Specifically, Waymo says that its vehicles get into injury crashes 73 percent less often than human drivers, while the figure I use in this article is 67 percent.
This is because there were four apparent classification mistakes in the raw data Waymo used to generate its statistics.
Each time Waymo reports a crash to NHTSA, it records the severity of injuries caused by the crash. This can be fatal, serious, moderate, minor, none, or unknown.
When Waymo shared an embargoed copy of its numbers with me early last week, it said that there had been 16 injury crashes. But when I looked at the data Waymo had submitted to federal regulators, it showed 15 minor injuries, two moderate injuries, and one serious injury, for a total of 18.
When I asked Waymo about this, the company said it found a programming error. Waymo had recently started using the “moderate” injury category, and had not updated the code that generated its crash statistics to count these crashes. Waymo fixed the error quickly enough that the published version of Thursday’s report showed 18 injury crashes.
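To see how easily that kind of error creeps in, consider a deliberately simplified, hypothetical example (this is not Waymo’s actual code): a tally that hard-codes the injury categories it knows about will silently skip crashes once a new “moderate” category starts appearing in the data.

```python
# Hypothetical illustration of the counting bug described above.
reports = ["minor"] * 15 + ["moderate"] * 2 + ["serious"]

# Buggy tally, written before the "moderate" category was in use: it only
# counts the categories it already knows about.
buggy_count = sum(1 for severity in reports if severity in {"minor", "serious", "fatal"})

# Safer tally: count everything that isn't explicitly a non-injury.
fixed_count = sum(1 for severity in reports if severity not in {"none", "unknown"})

print(buggy_count, fixed_count)  # 16 vs 18, matching the discrepancy described above
```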
But as I continued looking at the data, I noticed another apparent mistake: two crashes had been put in the “unknown” injury category, yet the narrative for each crash indicated an injury had occurred. One report said “the passenger in the Waymo AV reported an unspecified injury.” The other stated that “an individual involved was transported from the scene to a hospital for medical treatment.”
After initial publication of my article, the French writer Edouard Hesse discovered two more crashes where Waymo initially reported that there were no injuries, but subsequently amended its report to say a passenger had claimed injuries of unknown severity. These incidents were not included in Waymo's count of injury crashes.
We’ve notified Waymo about these apparent mistakes and they say they are looking into them. As I write this, the website still claims a 73 percent reduction in injury crashes. But it seems likely that those four “unknown” crashes were, in fact, injury crashes. So all of the statistics in this article are based on the full list of 22 injury crashes.
I think this illustrates that I come by my generally positive outlook on Waymo honestly: I probably scrutinize Waymo’s data releases more carefully than any other journalist, and I’m not afraid to point out when the numbers don’t add up.
Based on my conversations with Waymo, I’m convinced these were honest mistakes rather than deliberate efforts to cover up crashes. I was only able to identify these mistakes because Waymo went out of its way to make its findings reproducible. It would make no sense to do that if they were simultaneously trying to fake their own statistics.
Could there be other injury or airbag-triggering crashes that Waymo isn’t counting? It’s certainly possible, but I doubt there have been very many. You might have noticed that I linked to local media reporting for some of Waymo’s most significant crashes. If Waymo deliberately covered up a serious crash, there’d be a big risk that a crash would get reported in the media and then Waymo would have to explain to federal regulators why it wasn’t reporting all legally required crashes.
So despite the screwups, I find Waymo’s data to be fairly credible, and those data show that Waymo’s vehicles crash far less often than human drivers on public roads.
September 25, 2024: This article was updated to reflect two additional injury crashes that were discovered by Edouard Hesse after the article was initially published.
1. Waymo has traveled 22 million miles total, but 800,000 of those miles (and one injury crash) were in Los Angeles. Because 800,000 miles isn’t enough for statistically significant comparisons, Waymo focused on crashes in San Francisco and Phoenix and excluded the LA crash and miles from its analysis. This article originally stated that Waymo had driven 22 million miles in just San Francisco and Phoenix during this time period.