Featured image: Autopilot/metamorworks, Adobe Stock

Should Tesla’s Autopilot Feature Be Illegal?

A recent study from the United Kingdom suggests that maybe it should

An open-access study of UK drivers found that drivers who used “‘conditional automation’ cars capable of self-driving on motorways and in traffic jams” were less competent when they took the wheel:

Researchers Professor Gary Burnett, Dr David Large and Dr Davide Salanitri found that the driving after the participants took back control of the car was poor, swerving across lanes and varying their speed during the 10 seconds following the handover.

On the first day of the study, drivers went off course by an average of two metres…

Even at the end of the week, nearly half of drivers had to look at the floor to make sure their feet were on the right pedals when asked to take control of the car.

Press Association, “Using driverless car may make you worse driver, study suggests” at Breaking News

Let’s step away from cars and Tesla’s Autopilot for a moment to review the impact cell phones have had on driving safety.

It is difficult to get an accurate count of collisions due to distracted driving—which usually means driving while sending or reading text messages. That’s because, according to the National Safety Council, roughly half of all states do not record whether texting impacted the driver(s) involved in a crash and two-thirds of all states do not record whether the driver was using a hands-free cell phone. And no state records whether driver-assist technologies—such as Tesla’s Autopilot—were a factor. As a result, the NSC believes that the number of traffic mishaps attributable to distracted driving is undercounted.

But even this partial data has been compelling enough for nearly all states to ban texting while driving. Many states also ban all handheld cell phone use behind the wheel.

Safe driving demands attention; changing conditions can require responses within a fraction of a second to prevent collisions and injuries. Distractions, even those that redirect driver attention for just a few seconds, make mishaps much more likely.

This need for constant attention amid suddenly changing conditions led the University of Nottingham researchers to ask how "conditional automation" systems, such as Tesla's Autopilot, might affect driving. Their findings led them to conclude that

… drivers are likely to have become ‘out of the loop’, i.e. they have not been required to actively monitor, make decisions about or provide physical inputs to the driving task. ‘This reduces their perception and comprehension of elements and events in their environment, and their ability to project the future status of these things – their so-called situational awareness’

Press Association, “Using driverless car may make you worse driver, study suggests” at Breaking News

Semi-automated driving features degrade the quality of driving. Last year, Consumer Reports gave Autopilot high marks for its “capabilities and ease of use,” but gave it the lowest rating for keeping drivers engaged. Nearly half of the drivers surveyed in another study believed it was safe to let go of the steering wheel when using Autopilot.

Tesla may tell drivers that Autopilot is not the same as a self-driving car. But just as cell phone manufacturers’ warnings not to “text and drive” too often go unheeded, so too may Tesla’s warnings.

Perhaps it’s time to make Tesla’s Autopilot illegal. We outlaw other distracted driving technologies; why not Autopilot? The current data suggests that we should.

More by Brendan Dixon on inflated self-driving car claims:

Even Uber didn’t believe in Uber’s self-driving taxis We found that out after Google’s Waymo sued the company

True believer loses faith in fully self-driving cars Levandowski sees the future—and it is tech aids for safer driving

The real future of self-driving cars is — better human drivers! Manufacturers are improving safety by incorporating warning systems developed for self-driving cars into conventional models


Autopilot is not just another word for “asleep at the wheel” As a recent fatal accident in Florida shows, even sober, attentive drivers often put too much trust into Tesla’s Autopilot system, with disastrous results


Brendan Dixon

Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Brendan Dixon is a Software Architect with experience designing, creating, and managing projects of all sizes. His first foray into Artificial Intelligence was in the 1980s, when he built an Expert System to assist in the diagnosis of software problems at IBM. Since then, he has worked as a Principal Engineer and Development Manager for industry leaders, such as Microsoft and Amazon, as well as numerous start-ups. While he spent most of that time working on other types of software, he has remained engaged and interested in Artificial Intelligence.