27 Comments
In Theory:

The question isn't whether they could get one car to drive in a straight line for a day... It's whether that launch would justify Tesla's stratospheric valuation and its primary shareholder's repeated claims that there would be thousands of Tesla robotaxis bringing in a whole new growth stream. I live in Austin and Waymo is cruising around dominating. Tesla is so sadly far behind it's indescribable. The whole brand is utter bullshit and the hater posts you reference are the result of the hyperbole. Tesla does not deserve to catch a break. Musk has been overpromising and underdelivering for a decade now, and your rationalizing of the one flagship ride just suborns that.

Timothy B. Lee:

What are you seeing in Austin that makes you think Tesla is way behind?

Oleg Alexandrov:

Tesla's engineering is just not serious. Musk stubbornly refuses to accept that lidar is a necessity. Tesla does not have the fine-grained, multi-domain modeling Waymo has, and believes enough video and neural nets will do the trick.

Michael Sullivan:

You compromise your credibility when you try to dismiss Tesla's robotaxis as "one car driving in a straight line for a day." Like, I'm not at all sure that Tesla is doing anything other than "What Waymo did 5 years ago" here (albeit without LIDAR). But it's not "one car driving in a straight line for a day."

In Theory:

That's exactly what it was. To suggest it was anything more is just more Kool-Aid drinking to prop up the share price. Tesla is badly lagging in this area. Maybe they can get the fictitious humanoid robots to drive them instead.

Chief Justice of Nuremberg 2.0:

The question is whether you can debunk Nikola Tesla. The PLAID A/C Induction Motor was patented in 1888. Remote Control in 1896. The Salts of Lithium batteries in 1899. So, it's not that Tesla is behind, it's that you are behind, by 137 years.

Gregory Kusnick:

One possible strategy Tesla might pursue is to offer vehicles at steep discounts to driver training schools. Training data harvested from these vehicles would presumably show a lot of newbie errors being corrected. This might go some distance toward closing the gap between imitation learning and reinforcement learning.

Oleg Alexandrov:

I think the road ahead is harder than what Timothy says.

The error rate for self-driving cars has to be incredibly low.

Musk has consistently been unserious about how hard AI would be to pull off, and how important extra sensors are.

The current self-driving cars, even Waymo's, run on a lot of engineering and a little bit of prayer. Waymo at least gets that and tries to compensate for it, but Musk doesn't.

It is only a matter of time before Tesla has one serious blunder, like Cruise did.

Jim:

I agree with your reservations, but I think Tim is explicit about how wide the error bars are. The Teslas drive pretty well (upper error bar), but have driven fairly few miles, and driving millions of miles accurately is the hard part, plus it really might be a mechanical Turk (lower error bar).

Markos:

Very good overview of the videos posted so far and great comparisons with Waymo. I would love to hear your thoughts on Tesla self-delivering a car (https://www.youtube.com/watch?v=GU16hXSSGKs). The trip included a highway portion (as opposed to the robotaxi rides).

Markos:

"With about a dozen vehicles in its robotaxi fleet, Tesla has probably logged fewer than 10,000 driverless miles so far" - Tesla has been running a robotaxi service for employees only in Texas and California for probably a year now. They do have plenty of data.

Also, it is worth noting that FSD customers in the beta test ("Early Access") program have a UI to send all data on a specific incident back to Tesla for further examination. This allows Tesla to focus on edge cases more easily.

AnthonyCV:

>Some predicted that Tesla’s superior manufacturing capacity would allow it to quickly eclipse Waymo, the current industry leader.

More than Waymo, plausibly. But eclipse? For that, even if they were technologically ready, manufacturing capacity isn't the bottleneck. Federal regulations that allow a manufacturer to put only 2,500 self-driving cars a year on the road are. Too bad no one from Tesla has ever been in a position to try to remove this kind of regulation holding back its business and technology... huh.

TC:

Great article. I support the pursuit of fully autonomous vehicles, but I still think it's wrong to allow these experiments to be conducted on public roads. None of us signed up to be a part of the testing, yet they're all around us. No informed consent.

No institutional review board. No bueno.

Derek Tank:

Autonomous vehicles have been granted legal authorization to be on the road by duly elected state and federal governments. No different from all of the "dangerous" human drivers that are legally on the road. If you have a problem with it, write your legislator.

Rory:

Every year we don’t have reliable driverless car technology continues the status quo, where human drivers are causing many preventable deaths every day.

Without testing on public roads, we would basically never have the data to build the technology to enable this transition. The opportunity cost would be many human lives.

Harry Campbell:

Great write-up and title - got me to click instantly lol!

I watched all of the Tesla 'misstep videos' last week but didn't go the extra mile and watch the positive ones like you did - that was a good idea to include those also. I'm with you, cautiously optimistic. There weren't any major issues, but that should be table stakes for a small deployment of 11 (or so) vehicles. For reference, a full-time Uber driver will do around 1,000 miles per week, so it's safe to say that Tesla has done around 11,000 miles so far. And as you note, humans go hundreds of thousands of miles without a serious accident, so we shouldn't expect to see any serious accidents anytime soon for Tesla.

One thing I learned from my interview with Phil Koopman (30-year AV safety expert) was that humans also go about 100 million miles of driving per fatality! So even though Waymo has no fatalities yet, they've also only done 20 million miles of paid rides so far, so they're in a similar spot.

Malcolm Sharpe:

Thanks for collecting those clips. The highlights (both good and bad) are interesting to see.

Duhrew:

Any positive videos from Tesla influencers of their rides while it was raining?

Phil Levin:

Really appreciate your coverage of self-driving taxis. Your reporting has been consistently sharp and insightful.

Just a quick note on the last two videos in the section about not seeing serious mistakes - they appear to be from New York City, not Austin. The video descriptions mention NYC, and you can spot yellow cabs and other NYC landmarks in the footage. It looks like the videos show people operating Teslas autonomously without drivers (illegally) rather than true robotaxis in Austin. Thought it was worth flagging if helpful.

Timothy B. Lee:

Ugh, you are right. Frustrating. I was fooled by the mention of robotaxi in the title and the fact that no one was in the driver's seat.

Adam Hartung:

Thanks. Very well researched and written. Great thinking about both conclusions and limits to the data available to you. Very well done. Thank you

werdnagreb:

> In other words, if you rely too much on imitation learning, you can end up with a model that drives like an expert human most of the time but occasionally makes catastrophic mistakes.

I think this is key. We see this with LLMs as well. Most LLMs behave wonderfully in short spurts but behave more erratically as the context window gets larger.

A mixture of different kinds of models and learning is the only way self-driving cars are ever really going to become expert.

Daniel Reeves:

This is so well done. And so much calmer and more reasonable sounding than my version -- https://agifriday.substack.com/p/turkla -- even though I think we're ultimately reaching the same (tentative) conclusions.

Michael Spencer:

From a deep learning perspective, even Wayve's approach makes more sense than Tesla's. Very surprised by your position on this. Customers spent serious money for FSD capabilities many years ago that never manifested.

I won't even go into the details of this limited supervised Austin test pilot and why I disagree with you.

Timothy B. Lee:

Why not go into the details? I'm interested.

Chief Justice of Nuremberg 2.0:

Tesla 2024 Impact Report ft Fossil Fuel Disinfo w/ Daily News July 2025 https://teslaleaks.com/f/tesla-2024-impact-report-ft-fossil-fuel-disinfo-w-news-july-2025
