Very few of Waymo’s most serious crashes were Waymo’s fault
I looked at 45 major Waymo crashes—most were human error.
Everything was fine until the wheel came off.
At 1:14 AM on May 31, a Waymo was driving on South Lamar Boulevard in Austin, Texas, when its front left wheel detached. The bottom of the car scraped against the pavement as it skidded to a stop, and the passenger suffered a minor injury, according to Waymo.
Among the 45 most serious crashes Waymo experienced in recent months, this was arguably the one most clearly Waymo’s fault. And it was a mechanical failure, not an error by Waymo’s self-driving software.
On Tuesday, Waymo released new safety statistics for its fleet of self-driving cars. Waymo had completed 96 million miles of driving through June. The company estimates that its vehicles were involved in far fewer crashes than you’d expect of human drivers in the same locations and traffic conditions.
Waymo estimates that typical human drivers would have gotten into airbag-triggering crashes 159 times over 96 million miles in the cities where Waymo operates. Waymo, in contrast, got into only 34 airbag crashes—a 79 percent reduction.
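That 79 percent figure is simple arithmetic on the two counts above:

$$1 - \frac{34}{159} \approx 1 - 0.214 \approx 0.786,$$

or roughly a 79 percent reduction.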
Similarly, Waymo estimates that its vehicles get into injury-causing crashes (including that detached-wheel crash) 80 percent less often than human drivers. Crashes that injure pedestrians were 92 percent less common, and crashes that injure cyclists were 78 percent less common, relative to typical human drivers.
Can we trust these statistics? In the past, experts have told Understanding AI that Waymo’s methodology is credible. Still, it’s worth being skeptical any time a company publishes research about the performance of its own product.
So for this story I wanted to judge the safety of Waymo’s vehicles in another way: by looking at the crashes themselves. Waymo is required to disclose every significant crash to the National Highway Traffic Safety Administration (NHTSA). This week, the agency published a new batch of crash reports that cover crashes through August 15. For this story, I took a careful look at every Waymo crash between mid-February (the cutoff for our last story) and mid-August that was serious enough to cause an injury or trigger an airbag.
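Roughly, that filtering step looks like the sketch below, assuming NHTSA’s Standing General Order incident data has been downloaded as a CSV. The file name, column names, and values here are hypothetical placeholders rather than NHTSA’s actual schema, so adjust them to match the real export.

```python
# Minimal sketch of the filtering described above. Every file name and
# column name here is a made-up placeholder, not NHTSA's real schema.
import pandas as pd

reports = pd.read_csv("nhtsa_sgo_incidents.csv", parse_dates=["incident_date"])

# Keep only Waymo reports inside the (approximate) mid-February to
# mid-August window covered by this story.
waymo = reports[
    (reports["reporting_entity"] == "Waymo")
    & (reports["incident_date"] >= "2025-02-15")
    & (reports["incident_date"] <= "2025-08-15")
]

# "Serious" for this story means the crash caused an injury or
# triggered an airbag.
serious = waymo[
    (waymo["highest_injury_severity"] != "No Injuries")
    | (waymo["airbag_deployed"] == "Y")
]

print(len(serious))  # 45 for the period covered in this story
```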
Waymo’s vehicles were involved in 45 crashes like that over those six months. A large majority of these crashes were clearly not Waymo’s fault, including 24 crashes where the Waymo wasn’t moving at all and another seven where a moving Waymo was rear-ended by another vehicle.
For example, in April, a Waymo was stopped at a traffic light in Los Angeles behind a pickup truck that was being towed. The pickup truck came loose, rolled backwards, and hit the Waymo. It seems unlikely that Waymo’s self-driving software could have done anything to prevent that crash.
Over this six-month period, there were 41 crashes that involved some kind of driver error—either by Waymo’s software or a human driver. As we’ll see, Waymo’s software was clearly not at fault in a large majority of these crashes—in only a handful of cases did flaws in Waymo’s software even arguably play a role.
There were four other crashes that didn’t involve driving mistakes at all. I’ve already mentioned one of these—a Waymo crashed after one of its wheels literally fell off. The other three were cases where an exiting Waymo passenger opened a door and hit a passing bicycle or scooter. There may be steps Waymo could take to prevent such injuries in the future, but these clearly weren’t failures of Waymo’s self-driving software.
Waymo’s safety record since February
Of the 41 crashes that involved driving mistakes, 37 seemed to be mostly or completely the fault of other drivers.
24 crashes happened when another vehicle collided with a stationary Waymo, including 19 rear-end collisions. A representative example: a Waymo “came to a stop for a red light at the intersection” at 1:24 AM on March 16 in Los Angeles. The car behind the Waymo didn’t slow in time and “made contact” with the Waymo. A passenger in the other car suffered a “minor injury,” according to Waymo. Non-rear-end crashes in this category include the pickup truck that rolled backwards, and a case where the car ahead of a Waymo turned left, got hit by another vehicle, and was pushed back into the stationary Waymo.
7 crashes involved another vehicle rear-ending a Waymo in motion. In 5 of these, the Waymo might as well have been stopped: it was traveling at less than 3 miles per hour as it approached an intersection. The Waymo was moving faster in the other 2 crashes, but in both cases the driver approaching from behind appears mostly or completely responsible.
In 4 crashes, another car entered the Waymo’s right of way. There were 2 cases of turning vehicles cutting the Waymo off. In the other 2 cases, another car hit a different object (a parked car in one case, a streetlight in the other) and then careened into the Waymo’s path.
2 crashes happened when a bike or car hit a Waymo in an intersection. For instance, a cyclist in San Francisco ran into the side of a Waymo as the vehicle passed through a four-way stop. The cyclist had been riding on the sidewalk, which was blocked from the AV’s view by vegetation. The cyclist fell to the ground and suffered what Waymo described as a “minor injury.”
3 crashes involved another vehicle sideswiping a Waymo as it drove past:
In Atlanta in June, a Waymo came to a stop to yield to a heavy truck traveling in the opposite direction. The road had “vehicles parked at both curbs,” which was apparently too tight a squeeze. The truck hit the Waymo as it passed by.
Similarly, in San Francisco in August, a Waymo approached a “narrow portion” of a roadway with vehicles “parked at the curb in both directions.” Again, the Waymo yielded to a heavy truck traveling in the opposite direction. As it passed the stopped Waymo, the truck started to turn left and clipped the Waymo.
A Waymo in San Francisco wanted to merge into the right lane, but a car ahead in that lane was stopped. So the Waymo stopped “partially between two lanes.” A car behind the Waymo tried to drive around it, scraping the Waymo’s side in the process.
In all three of these cases, a passenger in the robotaxi informed Waymo of a “minor injury.” In the second case, the Waymo passenger was taken to the hospital by ambulance. In all three cases, it seems clear that the other driver bore some responsibility, but perhaps Waymo’s self-driving software could have handled the situation better.
The most ambiguous case occurred in Phoenix in May, when a cat crossed a street ahead of a Waymo. The Waymo braked, but couldn’t avoid hitting the cat, “which subsequently ran away.” The sudden braking caused several cars to rear-end the Waymo and each other. There were no (human) injuries, but an airbag was triggered.
Should Waymo have detected the cat earlier, allowing it to stop more gently? It’s hard to say given the information in Waymo’s report.
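For readers keeping score, the tallies above reconcile with the headline numbers:

$$\underbrace{24 + 7 + 4 + 2}_{\text{other driver at fault}} = 37, \qquad 37 + \underbrace{3}_{\text{sideswipes}} + \underbrace{1}_{\text{cat crash}} = 41, \qquad 41 + \underbrace{4}_{\text{non-driving}} = 45.$$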
It’s worth mentioning one other crash: Last Sunday, a motorcyclist in Tempe, Arizona, died after rear-ending a Waymo and then being struck by another car. We’re excluding this crash from the tallies above because it happened after August 15, which means we don’t yet have Waymo’s report to NHTSA. But Brad Templeton, a Forbes columnist and former Waymo advisor, wrote that Waymo was “apparently not at fault.”
Waymo had three more “dooring” injuries
A dooring incident occurs when a passenger opens a door into the path of another vehicle, usually a bicycle. Of the 45 most serious crashes, three were dooring incidents, but they accounted for two of the seven crashes that caused significant injuries.¹
As we noted in March, Waymo has a “safe exit” feature: the car alerts an exiting passenger to approaching vehicles or pedestrians. The chime the car plays (recording), however, may not always be loud enough. A Reddit commenter pointed out that the warning message sounds similar to other, innocuous notifications Waymo gives passengers.
A bicyclist who sued Waymo in June over a February dooring incident claimed that “there was no alert issued in the illegally parked car as according to the passengers,” as the San Francisco Chronicle reported. (Waymo denied the claim.)
This isn’t a Waymo-specific problem, though: Uber settled a similar lawsuit in 2023 for an undisclosed amount, likely north of one million dollars. Dooring crashes are a major hazard for cyclists generally, and other carmakers have developed similar warning systems.
In any case, none of these crashes are directly related to Waymo’s autonomous driving software.
¹ By “significant” I mean crashes which Waymo classifies as resulting in moderate injuries, serious injuries, or hospitalization.