Mind Matters Natural and Artificial Intelligence News and Analysis
A woman passenger sits in the back seat and selects a route while her self-driving car rides on the highway.

Tesla Continues to Walk Back Full Self-Driving Claims

In 2016, Tesla (TSLA) couldn’t tell enough people that its cars would soon drive themselves

In 2019, Tesla raised billions of dollars on the prospect of a fleet of a million robotaxis by the next year. However, starting on the Q3 2019 earnings call, CEO Elon Musk started walking back some of those claims.

To begin with, in that earnings call, Musk started saying that “feature complete” really just meant that the “City Streets” version would be operable, not that it could actually drive without assistance. A year later, in regulatory filings with the California DMV, Tesla said, “As such, a final release of City Streets will continue to be an SAE Level 2, advanced driver-assistance feature.” In the accepted terminology around levels of self-driving, truly self-driving vehicles are classed as SAE Level 5. Level 2 means that the company is merely offering a driver-assistance feature and the driver is required to be engaged at all times.

Yesterday, Tesla filed a quarterly report (10-Q) with the SEC. This report included some interesting language about self-driving that did not appear in any previous filing. Under “risk factors”, Tesla said,

For example, we are developing self-driving and driver assist technologies to rely on vision-based sensors, unlike alternative technologies in development that additionally require other redundant sensors. There is no guarantee that any incremental changes in the specific equipment we deploy in our vehicles over time will not result in initial functional disparities from prior iterations or will perform as expected in the timeframe we anticipate, or at all. [emphasis added and hat tip to Gordon Johnson for the lead]

In other words, Tesla thinks that the question of whether it can deliver an actual self-driving vehicle is more problematic now than before.

To be fair, many risks cited in documents such as these are unlikely to occur. However, this risk factor is more significant than the usual ones we encounter, for several reasons:

First, this qualification doesn’t concern external factors. Most cited risk factors are external — regulatory changes, government actions, parts availability, and so on. This risk concerns Tesla’s own core technology, which it has been selling to customers for five years now. The risk concerns whether it can deliver the features that it has already sold.

For instance, in Microsoft’s (MSFT) risk factors section, the factors cited all relate to outside competition. Microsoft cites no risk factors as to whether it can deliver on software that it has promised, announced, or sold. The closest qualification on performance offered is a section on the risk of selling products with defects, not on the inability to deliver a product.

It is also interesting that Tesla added this language only recently. That may simply reflect a change in the mindset of its legal team. Alternatively, it may reflect the fact that, as the software has developed, the number of edge cases has been increasing. We reported on this difficulty several years ago, pointing out that the edge cases likely increase with the square of the number of driving situations. Tesla’s pursuit of fully self-driving Level 5 cars comes with that cost. A Level 4 system, where the autonomous vehicle operates in designated areas under specific conditions, can be highly controlled. As Tesla expands the range of use cases for self-driving, drivers will actually need to pay more attention to the car, as the chance of running into edge cases will go up.
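The square-law claim above can be illustrated with a toy model (my own simplified sketch, not Tesla’s analysis): if we assume each novel edge case arises from the interaction of a pair of distinct driving situations, then the count of potential edge cases grows quadratically with the number of situations the system must handle.

```python
from itertools import combinations

def potential_edge_cases(n_situations: int) -> int:
    """Toy model: assume each unordered pair of distinct driving
    situations (e.g. 'unprotected left turn' combined with
    'hand-signaling cyclist') can produce a novel edge case.
    The count is n choose 2, which grows roughly as n squared."""
    return len(list(combinations(range(n_situations), 2)))

# Expanding from 10 to 1,000 handled situations multiplies the
# situations by 100 but the potential edge cases by about 10,000.
for n in (10, 100, 1000):
    print(n, potential_edge_cases(n))  # prints 45, 4950, 499500
```

Under this (admittedly crude) assumption, each new use case a Level 5 system takes on interacts with every use case already supported, which is why broadening the operating domain makes the testing burden grow far faster than the feature list.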

It may be that early enthusiasm for Level 5 self-driving was motivated primarily by a misunderstanding of the relationship between humans and machines. Now that the project has matured, the inescapable engineering problems have arisen and confronted Tesla in more tangible ways. Perhaps the engineering team started out with visions of what they could do. But the reality of ever-increasing edge cases, combined with the fact that today’s road system is based as much on social cues as on logic, is causing the engineering team to hedge their bets.

In any case, the direction has been clear. From 2016 to 2019, full autonomy was just around the corner. You would soon be able to push a button and have your car drive itself from New York to Los Angeles to pick you up, finding charging stations along the way. Starting in late 2019, Tesla has been increasingly walking back this claim. Now, in its regulatory filing with the SEC, it has admitted that there is a chance that it might not be able to pull it off at all.


You may also wish to read: Elon Musk walks back full self-driving claims. His Q3 earnings call with investors was a stark contrast to earlier claims about a robotaxi fleet.


Jonathan Bartlett

Senior Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Jonathan Bartlett is a senior software R&D engineer at Specialized Bicycle Components, where he focuses on solving problems that span multiple software teams. Previously he was a senior developer at ITX, where he developed applications for companies across the US. He also offers his time as the Director of The Blyth Institute, focusing on the interplay between mathematics, philosophy, engineering, and science. Jonathan is the author of several textbooks and edited volumes which have been used by universities as diverse as Princeton and DeVry.
