Mind Matters Natural and Artificial Intelligence News and Analysis

Self-Driving Cars: Waymo Beats Tesla By Picking the Right Target

Trying to get the human out of the loop, as Musk proposes, becomes increasingly costly as the complexity increases

Full self-driving has been a contentious topic in the last few years. In 2016, Elon Musk started claiming that his cars had all the hardware needed to do full self-driving, and that the software would be there by 2019. You would be able to summon a car from across the United States and it would drive across the country, recharging as needed, to pick you up, no driver needed.

He has specifically indicated that he means Level 5 autonomy, which means that no driver is needed at all. The driver can sleep, watch a movie, or just hang out in the back seat. In fact, in 2016, he indicated that drivers were merely there for regulatory purposes.

Musk’s claims about full self-driving have continued apace. He has led people to believe that they could eventually put their cars on a “robotaxi network,” which would earn money for them while they sleep. In fact, he raised a $2 billion investment round on this premise.

As 2019 wound to a close and it became apparent that these were fairy tales, he started walking back what he meant by “full self-driving.” Instead of “full self-driving” meaning that “a car can drive itself,” Musk now means that it is, in theory, potentially, possibly, maybe possible that the car might be able to get from one point to another without interruption. Maybe.

In fact, the disclaimer warning on the full self-driving release notes tells drivers that it “must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. Do not become complacent.” What’s interesting about all those warnings is that driver assistance packages are typically meant to make drivers’ jobs easier. They are meant to decrease the driver’s anxiety by taking care of certain operations automatically.

But now, with Tesla’s new “full self-driving,” Tesla is actually telling drivers that they have to be more cautious than usual. In fact, as you can see from the language, “full self-driving” actually demands more attention from the driver and makes the driving task more dangerous. Many of us are left wondering what automation is supposed to accomplish here. An increase in the number of nervous drivers?

In fact, according to the SAE autonomy levels, where Level 0 means “not even cruise control” and Level 5 means “fully autonomous,” this software does not even qualify as a Level 3. Level 3 systems do not require constant supervision, while Tesla’s “full self-driving” explicitly requires more supervision than is ordinarily required for normal driving.
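The SAE taxonomy mentioned above can be sketched in code. This is a minimal, illustrative model, assuming a simplified reading of the SAE J3016 levels; the class and helper function here are hypothetical examples, not any real automaker's API.

```python
from enum import IntEnum

# Illustrative sketch of the SAE driving-automation levels discussed above.
# The level names and supervision flags follow the standard's broad strokes;
# this is a hypothetical helper, not part of any real library.
class SAELevel(IntEnum):
    L0_NO_AUTOMATION = 0      # not even cruise control
    L1_DRIVER_ASSISTANCE = 1  # e.g., adaptive cruise control alone
    L2_PARTIAL = 2            # steering + speed, constant supervision required
    L3_CONDITIONAL = 3        # no constant supervision; driver on standby
    L4_HIGH = 4               # truly self-driving, but only in a limited domain
    L5_FULL = 5               # self-driving anywhere, no driver needed

def requires_constant_supervision(level: SAELevel) -> bool:
    """Levels 0-2 keep the human fully in the loop at all times."""
    return level <= SAELevel.L2_PARTIAL

# Tesla's "full self-driving" demands constant hands-on attention,
# which places it at Level 2 or below on this scale:
print(requires_constant_supervision(SAELevel.L2_PARTIAL))  # True
# Waymo's geofenced service fits Level 4: no supervision, limited domain.
print(requires_constant_supervision(SAELevel.L4_HIGH))     # False
```

The dividing line the article draws is exactly the one the helper encodes: below Level 3, the human remains responsible at every moment, no matter what the marketing name says.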

The contrast to this is Waymo. Waymo has been specializing in Level 4 autonomy. Level 4 means that the car really does self-drive, but only in specific geographies or circumstances. For instance, an automaker carefully maps out specific streets, verifies that the software is capable of navigating those streets, and/or implements extra infrastructure to help the software properly navigate the area. Waymo’s success at this approach can be found imprinted on their steering wheel. While Tesla says, “you must always keep your hands on the wheel and pay extra attention to the road,” Waymo’s warning reads, “Please keep your hands off the wheel. The Waymo Driver is in control at all times” (The Waymo Driver is the name of their software).

Autonomous vehicles that aim for this type of Level 4 autonomy have been consistently winning. As reported here, the first real-world Level 4 rollout occurred in mid-2019, with the release of Daimler’s automated parking system. This system combined an intelligent car with intelligent infrastructure to deliver a robust system that reduces the headaches of parking in a congested underground garage.

The problems with Level 5, and with the companies aiming at it, are numerous. But some specific features stand out. The biggest one is that the roads are built for humans, not machines. It’s possible to build infrastructure that machines can navigate, but in most places it hasn’t been done yet. Roads and rules built for human decision-making are completely different from roads made for machines. The core problem comes in viewing machines as if they could potentially act like humans. They can’t. It’s a fundamental category mistake.

Prediction: Those who are directly aiming for Level 5 are going to lose. Those who are aiming for an ever-expanding Level 4 are going to win.


Jonathan Bartlett

Senior Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Jonathan Bartlett is a senior software R&D engineer at Specialized Bicycle Components, where he focuses on solving problems that span multiple software teams. Previously he was a senior developer at ITX, where he developed applications for companies across the US. He also offers his time as the Director of The Blyth Institute, focusing on the interplay between mathematics, philosophy, engineering, and science. Jonathan is the author of several textbooks and edited volumes which have been used by universities as diverse as Princeton and DeVry.
