10 Comments
rakkuroba:

I don’t get why you always simp for AI, even when it literally endangers people’s lives. I can only assume you have some conflict of interest due to a financial stake in the industry, and would appreciate it if you disclosed such conflicts in your “reporting.”

Oleg Alexandrov:

The whole point of the article is that a passenger would experience less injury in a Waymo car than in a human-driven car. Presenting the evidence is fair.

Willy from Philly ButNotReally:

What evidence do you have that they're "simp(ing) for AI"? Are you asking the writer to prove your assumption? They're literally reporting on publicly available information, but doing so in a way that's easier to follow than going through the NHTSA reports. Is this a HATE read for you? So weird.

Ray Sarlo:

Mate, you're reading "Understanding AI." What do you think you're going to read here?

Also, there's no simping here. It's reporting on actual incidents.

Randy:

It seems to me this article was written as fairly and with as little bias as possible. The stats are very clear: AI is much safer than humans.

Kenny Easwaran:

Which of the events do you think of as the AI endangering people’s lives? In every case it was a car endangering lives, and the AI was handling the car less dangerously than the human-driven cars already on the streets. You should stop simping for humans in cars.

Oleg Alexandrov:

It is great to see Waymo doing so well. And it will continue to improve, while humans as a whole won't.

The AI architecture of Waymo's cars also holds good lessons for the current AI wave and other applications. The predictions are done with neural nets, since those can handle a huge variety of inputs, but the actual control of the steering and pedals is done with physics-based software, to guarantee correctness.
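
To make that split concrete, here is a minimal, hypothetical Python sketch: a stand-in "learned planner" emits a target trajectory, and a deterministic PID speed controller turns it into pedal commands. None of the class names or interfaces come from Waymo's actual software; this only illustrates the division of labor described above.

```python
# Illustrative sketch of the "learned prediction, physics-based control" split.
# All names and interfaces here are hypothetical stand-ins, not Waymo's stack.

from dataclasses import dataclass


@dataclass
class TrajectoryPoint:
    t: float      # seconds from now
    speed: float  # target speed in m/s


class LearnedPlanner:
    """Stand-in for the neural-net side: turns sensor features into a plan."""

    def predict(self, sensor_features: list[float]) -> list[TrajectoryPoint]:
        # A real system would run a trained model here; we fake a gentle
        # slow-down plan so the example runs end to end.
        return [TrajectoryPoint(t=0.1 * i, speed=max(0.0, 10.0 - i)) for i in range(10)]


class PhysicsBasedController:
    """Stand-in for the deterministic side: a PID loop tracking the planned speed."""

    def __init__(self, kp: float = 0.5, ki: float = 0.05, kd: float = 0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self._integral = 0.0
        self._prev_error = 0.0

    def step(self, target_speed: float, current_speed: float, dt: float) -> float:
        """Return a throttle/brake command in [-1, 1] from the speed error."""
        error = target_speed - current_speed
        self._integral += error * dt
        derivative = (error - self._prev_error) / dt
        self._prev_error = error
        command = self.kp * error + self.ki * self._integral + self.kd * derivative
        return max(-1.0, min(1.0, command))


if __name__ == "__main__":
    planner = LearnedPlanner()
    controller = PhysicsBasedController()
    plan = planner.predict(sensor_features=[0.0] * 16)
    current_speed = 12.0
    for point in plan:
        cmd = controller.step(point.speed, current_speed, dt=0.1)
        current_speed += cmd * 0.3  # crude vehicle model: command nudges the speed
        print(f"t={point.t:.1f}s target={point.speed:.1f} m/s "
              f"actual={current_speed:.1f} m/s cmd={cmd:+.2f}")
```

The point of the split is that the learned part can be arbitrarily expressive, while the part that actually commands the actuators stays simple, bounded, and verifiable.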

toolate:

"It wasn't my knife, it was an accident"

Isaac King:

> 24 crashes where the Waymo wasn’t moving at all and another 10 where the Waymo was rear-ended by another vehicle

> 7 crashes involved another vehicle rear-ending a Waymo in motion

These numbers seem to contradict each other. Was it 10 or 7?

Ace Hoffman:

While I am impressed with the low rate of accidents, I wonder how many of the accidents that technically were not Waymo's fault could have easily been avoided by an alert defensive driver. For example, when the car slipped off the tow truck, was there room for the Waymo to back up even a little bit -- and did it try to, as much as possible? Or did it "realize" the futility and NOT back up to where it would hit a car behind it? Or did it calculate that doing so might reduce the chance of injury to its own passenger (although it would increase the likelihood of someone in a car behind getting hurt, though probably not as badly)?

Lastly, when it IS Waymo's fault, then whose fault is it? The AI programmer? The data checkers? The CEO? Nobody? Doesn't that make it a "no fault" accident somehow?
