40 Comments
rakkuroba

I don’t get why you always simp for AI, even when it literally endangers people’s lives. I can only assume you have some conflict of interest due to a financial stake in the industry, and would appreciate it if you disclosed such conflicts in your “reporting.”

Oleg Alexandrov

The whole point of the article is that a passenger would experience less injury in a Waymo car than in a human-driven car. Presenting the evidence is fair.

Willy from Philly ButNotReally

What evidence do you have that they're "simping for AI"? Are you asking the writer to prove your assumption? They're literally reporting on information that is publicly available, but doing so in a way that is easier to find than going through the NHTSA reports. Is this a hate-read for you? So weird.

Ray Sarlo

Mate, you're reading "Understanding AI"; what do you think you're going to read here?

Also, there's no simping here. It's reporting on actual incidents.

Randy

It seems to me this article was written as fairly and with as little bias as possible. The stats are very clear: AI is much safer than humans.

Kenny Easwaran

Which of the events do you think of as the AI endangering people’s lives? In every case it was a car endangering lives, and the AI was handling the car less dangerously than the existing cars on the streets. You should stop simping for humans in cars.

Oleg Alexandrov

It is great to see Waymo doing so well. And it will continue to improve, while humans as a whole won't.

The AI architecture of Waymo cars also holds very good lessons for the current AI wave and other applications. The predictions are done with neural nets, since those can handle a huge variety of inputs, but the actual control of steering and pedals is done with physics-based software, to guarantee correctness.
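A minimal sketch of that division of labor, in Python. Everything here is hypothetical for illustration (the function names, limits, and control law are my assumptions, not Waymo's actual design): a learned model proposes a target, and only a bounded, physics-style controller touches the actuators.

```python
import numpy as np

def neural_planner(sensor_features: np.ndarray) -> np.ndarray:
    """Stand-in for a trained network: maps sensor features to a
    proposed target (speed in m/s, heading in radians)."""
    return np.array([10.0, 0.05])  # placeholder output, not a real model

def physics_controller(state, target):
    """Classical feedback controller: the only component allowed to
    command pedals and steering, with hard saturation limits."""
    speed, heading = state
    target_speed, target_heading = target
    accel = np.clip(0.5 * (target_speed - speed), -5.0, 3.0)      # m/s^2
    steer = np.clip(2.0 * (target_heading - heading), -0.4, 0.4)  # rad
    return accel, steer

state = (8.0, 0.0)                      # current speed and heading
target = neural_planner(np.zeros(128))  # the net proposes...
accel, steer = physics_controller(state, target)  # ...physics disposes
```

The point of the split is that the controller's behavior can be verified analytically, no matter what the network outputs.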

Kshitij Parikh

Any reference for more information on what Waymo's approach is?

toolate

"It wasn't my knife, it was an accident"

Isaac King

> 24 crashes where the Waymo wasn’t moving at all and another 10 where the Waymo was rear-ended by another vehicle

> 7 crashes involved another vehicle rear-ending a Waymo in motion

These numbers seem to contradict each other. Was it 10 or 7?

Kai Williams

Hi Isaac, thanks for pointing this out. The right number is 7 -- I've updated the article to fix the error in the introduction.

Ace Hoffman

While I am impressed with the low rate of accidents, I wonder how many of the accidents that technically were not Waymo's fault could have easily been avoided by any alert defensive driver? For example, when the car slipped off the tow-truck, was there room for the Waymo to back up even a little bit -- and did it try to, as much as possible? Or did it "realize" the futility and NOT back up to where it would hit a car behind it? Or did it calculate that doing so might reduce the chance of injury to its own passenger (although it would increase the likelihood of someone in a car behind getting hurt, though probably not as badly)? Lastly, when it IS Waymo's fault, then whose fault is it? The AI programmer? The data checkers? The CEO? Nobody? Doesn't that make it a "no-fault" accident somehow?

Andrew

Could the liability be with the company itself?

Kai Williams

This is a really good question, and it's difficult to say for sure. (After all, for most of these, we only have Waymo's side of the story.) In the case of the tow-truck, I'm not sure how much more the Waymo could have done. The Waymo did start backing up, but there was a car behind, so it eventually had to stop.

I'm less sure about Waymo's liability when it is at fault; it probably depends on the circumstances of the crash.

Tubthumping

So what? For years we were told BS by SDC fluffers that AI-driven cars would be accident-free because the AI would be superior to human drivers.

...then that lady on a bike was killed in Arizona. The fluffers stopped opining online with the aforementioned BS. They changed their tune to "accidents will be the human's fault."

I see not much has changed.

Dan Oblinger

No one credible ever said AI would be accident-free. Show us the reference! No way. Folks have been claiming for a while now that, at least in the limited cases where they are driving, they are safer than humans. And that seems true, at least in their current limited usage. I do smell some BS here, but unless you show us your reference, I am pinning it on you.

Tubthumping

"No one credible ever said AI would be accident free. Show us the reference!"

It was everywhere 10-15 years ago especially. You go look it up; it's not my problem you are ignorant. I am not going to waste my time digging up internet archives you can look up yourself.

Sam Tobin-Hochstadt

One important thing to remember is that Elaine Herzberg was killed by an Uber test vehicle, and the entire Uber self-driving program was shut down soon after. Waymo has never had an accident that serious.

Tubthumping

Not yet. And that doesn't vindicate the collective stance of the SDC Utopian (my term for them at the time) community at that time, either.

Sam Tobin-Hochstadt

1. The people who said in 2017 that self-driving would be everywhere by 2020 were clearly wrong (regardless of claims about safety).

2. The people in 2025 saying that Waymo is much safer than human drivers are clearly right.

Tubthumping

The jury is still out on #2. All it will take is ONE Elaine Herzberg. That's it.

This is what I tried to tell the SDC Utopians before Herzberg, but they insisted that because those types of accidents happen with regular cars, my point made no sense.

They refused to believe that the rules were different. Then Herzberg happened.

SDCs and people do not mix well.

Steve Newman

This is an extremely valuable analysis! It's great to have this reality check and put the data into further perspective. My takeaways:

- Waymo is probably even safer (compared to human drivers) than the raw statistics imply: not only do Waymos experience far fewer accidents, but the accident rate would be lower still if all of the other cars were also Waymo-caliber AVs (see the back-of-the-envelope sketch after this list)

- Your observation that Waymo may be contributing to some incidents by behaving in unexpected / unhelpful ways (e.g. stopping in places where there's no room to pass; also I wonder if it might be stopping abruptly / unexpectedly and thus contributing to rear-end collisions?)

- Given all the sensors in a Waymo, there's likely low-hanging fruit for reducing issues relating to passengers exiting the vehicle
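A back-of-the-envelope version of that first point. All numbers here are illustrative assumptions of mine (the 80 percent reduction echoes a figure mentioned elsewhere in this thread; the fault split is invented):

```python
# Normalize the human-driver crash rate to 1.
human_rate = 1.0
waymo_rate = 0.2          # assumed ~80% reduction vs. human drivers

# Assumption: most of Waymo's remaining crashes are the other (human)
# driver's fault -- say 4 out of 5.
other_fault_share = 0.8

# If every other car were Waymo-caliber, crashes caused by the other
# party should shrink by roughly the same factor.
all_av_rate = (waymo_rate * (1 - other_fault_share)
               + waymo_rate * other_fault_share * 0.2)

print(f"Waymo among human drivers: {waymo_rate:.2f}x the human rate")
print(f"Waymo among other AVs:     {all_av_rate:.2f}x the human rate")
```

Under these made-up numbers the rate drops from 0.20x to about 0.07x of the human baseline, which is the direction of the first bullet's claim.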

Andrew

I’d love to see something done about “dooring.” I’m an urban biker, and getting hit by a car door is what scares me most. Better sensors and alarms could certainly help here, and not only with Waymos.

Willy from Philly ButNotReally

Completely agree. At a minimum, a louder warning, flashing lights, even preventing the door from opening should all be options on the table.

Kevin Markham

Great article! Is it likely that you’ll be able to do a similar analysis for Tesla Robotaxis in the future?

Timothy B. Lee

Depends on how soon they get rid of safety drivers and then how quickly they scale.

Sam Tobin-Hochstadt

And whether the NHTSA lets them get away with hiding their data.

Alex Quistberg

Thanks for sharing and examining Waymo safety! Do they also report on how often a human monitor took over the vehicle to avoid a crash? Analyzing those would be important too, to understand potential crashes that the autonomous system itself might have caused if no one had intervened. It would also be better, in my opinion, if they compared the Waymo Driver to the safest drivers rather than to all drivers, or even to the average driver.

Timothy B. Lee

They did not say anything about remote interventions (these are all driverless vehicles so there were no interventions by safety drivers in the vehicle). I would of course be interested in data about this but I'm not sure it matters from a safety perspective. If they are reducing crashes by 80 percent that's good regardless of how much human labor is involved on the back end.

Alex Quistberg

Thanks for the response! I agree that the safety benefit is impressive and important. If that is being achieved with minimal involvement from remote human monitors, then it is even more impressive and lends more support to the safety efficacy of the Waymo Driver vs. humans. It would also be interesting to examine citywide, neighborhood-, or segment-level reductions in crashes where they operate, using an interrupted time series or similar quasi-experimental design, which would give more causal support to the claim of a reduction in crashes.
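For what an interrupted time series would look like in practice, here is a minimal segmented-regression sketch on synthetic data (every number below is invented for illustration; this is the standard ITS setup, not anything Waymo publishes):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
months = np.arange(48)                       # 4 years of monthly data
launch = 24                                  # AVs begin operating at month 24
post = (months >= launch).astype(float)      # level-change indicator
time_since = np.maximum(months - launch, 0)  # slope-change term

# Synthetic monthly crash counts: mild pre-trend, then a drop at launch.
crashes = (100 + 0.2 * months - 12 * post - 0.3 * time_since
           + rng.normal(0, 3, size=months.size))

# Segmented regression: intercept, pre-trend, level change, trend change.
X = sm.add_constant(np.column_stack([months, post, time_since]))
fit = sm.OLS(crashes, X).fit()
print(fit.params)   # the 3rd and 4th coefficients estimate the effect
```

The level-change and trend-change coefficients are what would carry the causal claim, ideally with comparison neighborhoods as controls.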

Sam Tobin-Hochstadt

It's hard to compare to "the safest drivers" but they do compare to relatively new and expensive cars, which are empirically much safer (for multiple reasons). See https://storage.googleapis.com/waymo-uploads/files/documents/safety/Comparison%20of%20Waymo%20and%20Human-Driven%20Vehicles%20at%2025M%20miles.pdf

They also have a discussion of benchmarking relative to their driving mix in the "Human Benchmarks" section of https://waymo.com/safety/impact/ (you have to click to expand it).

Alex Quistberg

Thanks for sharing those. It is difficult to compare to driver data, particularly similar data with miles driven attached at the individual level, but it is possible with driver's license data linked to driving violations, insurance data, or naturalistic driving data. Maybe they have access to that through the partnership with Swiss Re, or it could potentially be obtained from the CA (or another state) DMV. Each of these data sources has its advantages and disadvantages. Naturalistic driving data on humans would be best; there are some existing data via SHRP2, but they are limited by how many drivers are included. Overall, I expect Waymo already does, or eventually will, outperform even the safest human drivers.

Sam Tobin-Hochstadt

On the "dooring" issue, it's in between "Waymo's fault" and a general car issue. Dooring is made much worse by double-parking and by passengers exiting unexpectedly in the middle of the street, and that happens much more with ride-hailing services. In general, the app-based driving market (Uber, DoorDash, etc.), along with the massive expansion of delivery overall, has not worked well with the existing design of cities, because it has resulted in many more people wanting to stop for very short times, such that "parking" in the traditional sense is not reasonable time-wise. It's not obvious how to fix this, and it's not something Waymo created, but Waymo is a ride-hailing company in addition to inventing AVs, so it's a situation they are contributing to.

Michael Kaptein

In the insurance industry we obsess over determining who is at fault in an accident, especially when there are no independent witnesses. Among all insurance companies it is now accepted that the driver in the rear is always at fault. It doesn't matter if the Waymo slammed on the brakes to avoid a cat; the driver in the rear could have chosen a larger gap to give themselves time to react to sudden stops.
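The arithmetic behind that rule is simple: before the trailing driver even starts braking, they travel their reaction time's worth of distance. A small sketch (the reaction time and speeds are textbook-style assumptions, not insurance-industry figures):

```python
MPH_TO_MPS = 0.44704

def reaction_gap(speed_mps: float, reaction_time_s: float = 1.5) -> float:
    """Distance covered before the trailing driver even touches the brake."""
    return speed_mps * reaction_time_s

for mph in (25, 45, 65):
    gap = reaction_gap(mph * MPH_TO_MPS)
    print(f"{mph} mph: ~{gap:.0f} m just to react")
```

Any following gap shorter than that makes a rear-end collision nearly unavoidable on a sudden stop, which is why fault defaults to the rear driver.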

Ace Hoffman

The car that slipped off the tow-truck makes a fascinating example to consider the programming questions involved (to a programmer like myself, anyway!). In real life, in this instance, what would a human do? Probably try to back up, as the Waymo apparently did. But there wasn't enough room. It's quite possible the best decision would have been NOT to back up too much -- just enough to let the car hit the front end of your car -- and hold the brakes so that nothing else goes wrong. A human might be able to figure all that out, despite its uniqueness, in the instant it was happening, without ANY prior training. But of course most humans wouldn't; some would back up furiously into the car behind, scared for their life, and others would be more thoughtful, depending on the precise circumstances (were they on an incline, for example? How close were they to begin with? When did the Waymo "realize" it was happening?). Do Waymo vehicles stay further back from cars on tow-trucks than they used to? (I've always been a bit wary of such things myself!)

Patrick A Plonski

Fascinating that Waymo braked for a cat! Makes me think of a long tail of things that aren't usually found in a roadway, but if you see them you really hope your vision system can correctly classify and act accordingly. Sometimes wild turkeys or geese will snarl traffic in my neighborhood, for example. What if you have a flock of turkeys, and also it's a construction zone, and then a ball rolls into the street. I bet they have specific simulations for unlikely but not impossible juxtapositions of special cases.

Patrick A Plonski

By "specific" I mean a test suite that procedurally generates zany predicaments.
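Something in the spirit of this, say (purely speculative on my part -- not Waymo's actual test harness; all the names and categories are made up):

```python
import itertools
import random

HAZARDS = ["cat", "flock of turkeys", "rolling ball", "fallen ladder"]
CONTEXTS = ["construction zone", "double-parked truck", "school crossing"]
WEATHER = ["clear", "fog", "heavy rain"]

def generate_scenarios(n: int, seed: int = 42):
    """Sample rare juxtapositions of hazards, road contexts, and weather
    to stress-test the perception and planning stack."""
    rng = random.Random(seed)
    combos = list(itertools.product(HAZARDS, CONTEXTS, WEATHER))
    return rng.sample(combos, k=min(n, len(combos)))

for hazard, context, weather in generate_scenarios(5):
    print(f"Scenario: {hazard} in a {context}, {weather}")
```

Each generated scenario would then get fed into a simulator and scored, so the weird cases get coverage long before they show up on a real street.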

Ace Hoffman

In the late 1970s, coming home in crawling traffic around rush hour, I saw a moose -- definitely different from, and larger than, a horse -- on the side of the Merritt Parkway in Connecticut, somewhere around Fairfield. That made no sense to me, but there it was. It has baffled me to this day.

Josh

The trolley problem come to life!

Daniel Gutierrez

HA! Very coincidentally, I took a Waymo for the very first time this morning. The car was "almost" involved in one of your categories: "car entered Waymo’s right of way." The Waymo was waiting to make a left turn and a huge double-long city bus made a tight right turn near the Waymo. The Waymo acknowledged the danger and started to back up out of the way of the bus which kept turning. The Waymo also started blowing its very loud horn. Fortunately, the bus did not make contact. Such is my Waymo initiation story!
