Mind Matters Natural and Artificial Intelligence News and Analysis
Photo by Devon Janse van Rensburg on Unsplash

Guess what? You already own a self-driving car

Tech hype hits the stratosphere
vintage Chrysler Imperial 1958

Yes, the car you own today is probably a “self-driving” car and you may not know it. But that is because of the creative ways the term can be defined.

So much technology is hidden that we can’t easily distinguish between awesome technology and cool but insubstantial demos. Did a firm really invent a business-card-reading system, or did it just hire a bunch of typists to act as the “artificial” artificial intelligence? Is the awe-inspiring demo really operational, or is it just a fancy PowerPoint?

Companies and industries often redefine terms to inflate their mediocre accomplishments into awesome sci-fi. Take, for instance, the “self-driving car.” What does the term mean to you? To most people, it means that all the people are in the back seat. They say, “Car, take us here” and the car does it. But that isn’t actually what the car industry means by the term.

The car industry has defined six levels of driving automation, Level 0 through Level 5. A Level-0 car does not have any self-driving features. But what about the other levels? At Forbes, tech entrepreneur Robert J. Szczerba offers thoughts on the classifications, some of which I have summarized, with comments.

In a Level-1 self-driving car, some aspect of the driving system is under computer control some of the time, but the driver must be ready to take control at any time. Cruise control, invented in 1948, is an example of Level-1 self-driving. Chrysler’s Imperial was the first car to offer cruise control, in 1958, under the name “Auto-pilot.” Thus, pretty much any manufacturer whose cars include cruise control could claim that its cars are, to some extent, “self-driving.”

The Society of Automotive Engineers (SAE) and many websites cite adaptive cruise control as an example of Level-1 capability. But I found nothing in the SAE definition of Level 1 that would preclude ordinary cruise control from falling within it. Most manufacturers don’t make such a claim for cruise control, but that is probably so that they can build and maintain a mystique and thus charge more for other “self-driving” features.
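Classic cruise control illustrates just how modest Level-1 automation really is: it is a feedback loop on a single variable, speed, while the human handles everything else. Here is a minimal sketch of that idea in Python; the function name and gain value are purely illustrative assumptions, not any manufacturer’s actual algorithm:

```python
# A minimal sketch of cruise control as a proportional feedback loop
# (illustrative only; real systems are more sophisticated).

def cruise_control_step(target_speed, current_speed, gain=0.5):
    """Return a throttle adjustment proportional to the speed error."""
    error = target_speed - current_speed
    return gain * error

# The computer nudges the throttle toward the set speed; the driver
# still steers, brakes, and must be ready to take over at any time.
adjustment = cruise_control_step(target_speed=65.0, current_speed=60.0)
print(adjustment)  # 2.5
```

One loop, one variable: that is all the SAE Level-1 definition requires, which is why a 1958-era feature can plausibly be shoehorned into the “self-driving” category.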

NuTonomy self-driving electric Renault Zoe pilot tested in 2017

When most people hear the term “self-driving,” they have in mind a Level-5 car: “The car controls itself under all circumstances with no expectation of human intervention.” (Szczerba) Some companies are currently attempting Level-5 automation but none has achieved it. The closest approximation is self-driving taxis serving predefined routes, such as those pioneered in Singapore in 2016: “Only six autonomous vehicles — Renault Zoes and Mitsubishi i-MiEVs modified by MIT-spinoff NuTonomy — will be offering rides, and only then from predetermined pick-up and drop-off points within a 2.5-square-mile radius.” In short, they are like small buses on a closed route. They are not really Level 5 because the geographic area in which their software operates successfully is very small.

Other operating services still include drivers or engineers in the driver’s seat, for example, Waymo’s multi-year test program in Arizona:

Initially, passengers will be engineers and other company representatives, continuing an eight-year testing program with 3.5 million miles under its belt, the company said. But the next step, carrying members of the public, is coming fast – “in the next few months,” Krafcik said.

It will open up first to families in Waymo’s Early Rider program, who are already using its autonomous cars (with a just-in-case driver) as part of their routines, giving the company insights into everyday use. Michael Laris, “Waymo is taking the next step in the driverless car evolution” at Washington Post (November 7, 2017)

Just-released Tesla Model 3, dubbed “the car of the future”

Whatever else this is, it is not just around the corner. In any case, I hope you are starting to see what companies tend to do when they find out they can’t deliver on their promises—they simply redefine the terms to mean whatever they can produce in time for a media cycle, often geared to annual conventions and the like.

I have my doubts that anyone will be able to make fully self-driving vehicles work. It’s more plausible that future roads will be built specifically for self-driving vehicles. That is, the roads will be equipped with sensors, beacons, and protocols that allow self-driving vehicles to operate seamlessly. Cars won’t need to navigate our current roads; instead, we will create virtual railways for the new type of cars. Bill Dembski explains:

AI engineers tasked with developing automated driving but finding it intractable on the roads currently driven by humans might then resolve their dilemma as follows: just reconfigure the driving environment so that dicey situations in which human drivers are needed never arise! Indeed, just set up roads with uniformly spaced lanes, perfectly predictable access, and electronic sensors that give vehicles feedback and monitor for mishaps.

My colleague Robert Marks dubs such a reconfiguration of the environment a “virtual railroad.” His metaphor is spot on. Without such a virtual railroad, fully automated vehicles simply face too many unpredictable dangers and are apt to “go off the rails.” Marks, who hails from West Virginia, especially appreciates the dangers. Indeed, the West Virginia back roads are particularly treacherous and give no indication of ever submitting to automated driving. William Dembski, “AI’s Temptation to Theft Over Honest Toil” at Mind Matters

Significantly, some companies have been pulling back on their promises of full (Level-5) self-driving vehicles. Tesla, for example, has been promising full self-driving capability for, quite literally, years: “A Model X drives on side roads and freeways with no human help, then parks itself.” (Engadget, October 2016) For several thousand dollars, Tesla promised buyers an upgrade to full self-driving capabilities when such features came out. In other words, it was selling people on the idea that the capability is just around the corner. And by “selling,” I mean literally taking their money for a product that the company hasn’t gotten to work in the real world:

Elon Musk

On Oct 19, 2016, Tesla/SpaceX CEO Elon Musk told media:

Basic news is that all cars exiting the factory have hardware necessary for Level 5 autonomy, so that’s in terms of cameras, compute power; it’s in every car we make. On the order of 2,000 cars a week are shipping now with Level 5, literally meaning hardware capable of full self-driving, for driverless capability. Iqtidar Ali, “Transcript: Elon Musk’s Autopilot 2.0 Conference Call” at xAutoWorld

In recent days, Tesla has stopped taking orders for this upgrade on all of its cars. If you buy a Tesla today, you now own a Tesla, full stop. Not necessarily a piece of the future.

Selling hype and delivering mediocrity is something of a staple in the technology world. I have been on too many projects where the promises wound up significantly exceeding the delivery. I’ve heard software providers say, “Yes, this is a turn-key solution that requires very little custom development” when they actually mean “You’ll need a team of twelve developers working around the clock for three months to even have a chance of implementing this right.”

At the end of the day, it’s easy for CEOs to grow famous saying impressive things about the future. It’s even easy to build a fancy demo model that works in controlled situations. The public reasons, “I would never make such bold claims if I weren’t certain of it,” and accepts statements about the future by famous people as the truth. Why would they lie?

As it turns out, something akin to lying has unfortunately come to be the norm in technology. When technology marketers speak, they speak about their fanciful dreams of the future. It is up to the individual to check whether that dream makes sense in the real world.

Jonathan Bartlett is the Research and Education Director of the Blyth Institute.

Also by Jonathan Bartlett: Who built AI? You did, actually. Along with millions of others, you are providing free training data

“Artificial” artificial intelligence: What happens when AI needs a human I?

See also: Self-driving cars hit an unnoticed pothole. Something in nature likes to approach zero without really arriving…
