
Tesla Recall Due to the Short Attention Span of Drivers

Tesla did nothing wrong, but some claim it didn’t do enough right.

Tesla is conducting a recall of about two million vehicles sold in the United States.

Why?

The recall is mostly due to easily distracted drivers with short attention spans.

Tesla did nothing wrong, but some claim it didn’t do enough right. The company followed standard design ethics in developing its cars. On the other hand, Tesla’s marketing was misleading.

Earlier this year, Tesla won a lawsuit over its Autopilot self-driving software. The jurors in the case found that the software wasn’t at fault; contrary to instructions from Tesla, the driver was inattentive. In 2018, an Uber self-driving car killed a pedestrian while its safety driver was distracted streaming the NBC show The Voice.

Any company that delivers engineered products should adhere to design ethics. Design ethics requires that a successfully engineered product do what it was designed to do and nothing else. But the artificial intelligence in self-driving cars is complex, and there is no way to anticipate every contingency. Early self-driving cars mistook windblown plastic bags for rocks. Self-driving taxis in San Francisco were recently disabled by simply placing a traffic cone on the hood. Once an unanticipated problem is identified, it can usually be fixed by the designer. But as the complexity of a system increases linearly, the possible contingencies increase exponentially. There is no way to anticipate everything that can happen.
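A back-of-the-envelope sketch (mine, not Tesla’s) makes the point. Model each element of a driving scene as an independent yes/no factor: rain or not, glare or not, cyclist or not, cone on the hood or not. Every factor added doubles the number of distinct scenarios a designer would have to anticipate:

```python
# Toy illustration of the combinatorial explosion: n independent
# yes/no factors yield 2**n distinct scenarios to anticipate.
for n_factors in [10, 20, 30, 40]:
    scenarios = 2 ** n_factors
    print(f"{n_factors} binary factors -> {scenarios:,} possible scenarios")

# 10 binary factors -> 1,024 possible scenarios
# 20 binary factors -> 1,048,576 possible scenarios
# 30 binary factors -> 1,073,741,824 possible scenarios
# 40 binary factors -> 1,099,511,627,776 possible scenarios
```

A test budget that grows linearly can never keep pace with a scenario space that doubles at every step.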

How then can safety be gauged? One approach borrows the standard of guilt used in criminal trials: the engineered product should adhere to the principles of design ethics “beyond a reasonable doubt.”

Tesla knows its Autopilot software cannot respond to every occurrence at all times, so Tesla instructs its drivers to always pay attention while the car is moving in case something weird happens. By doing so, Tesla helps meet the design ethics threshold.

The decision to recall Tesla vehicles comes after a two-year investigation by the National Highway Traffic Safety Administration (NHTSA) into a series of accidents involving Tesla vehicles. But human drivers also are involved in accidents. Who is safer? What do the numbers say?

Business Insider reports: “In the fourth quarter of 2022, Tesla said its cars using Autopilot … were involved in …  0.2 crashes per million miles. That’s compared with around 0.7 crashes per million miles for Teslas not using Autopilot, and a US average of 1.5 crashes per million miles.” It seems that, on average, Tesla’s software is safer.
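Taking the quoted figures at face value, the arithmetic is simple. (These are raw rates; they are not adjusted for where the miles were driven, so treat the comparison as suggestive rather than conclusive.)

```python
# Crash rates quoted above (crashes per million miles, Q4 2022 figures
# as reported by Business Insider). Raw rates, unadjusted for road type.
autopilot = 0.2
tesla_manual = 0.7
us_average = 1.5

print(f"Autopilot vs. Teslas driven manually: {tesla_manual / autopilot:.1f}x fewer crashes per mile")
print(f"Autopilot vs. the US average:         {us_average / autopilot:.1f}x fewer crashes per mile")

# Autopilot vs. Teslas driven manually: 3.5x fewer crashes per mile
# Autopilot vs. the US average:         7.5x fewer crashes per mile
```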

On the other hand, Tesla has exaggerated the performance of its Autopilot with the kind of seductive semantics often found in advertising. The name “Autopilot” gives pause. On an airplane, a pilot who engages the autopilot can take a quick nap. Not so under the rules imposed on the Tesla driver.

Even more seductive is the name given to Tesla’s advanced Full Self-Driving Capability software. The name gives the impression of performance at the highest level. Yes, there are different levels of self-driving car performance. The highest, Level 5, would allow a driver to key in a destination and then read a book in the back seat. There are, as yet, no universal Level 5 self-driving cars. The name Full Self-Driving Capability suggests otherwise.

At Level 5, self-driving cars would be able to traverse the winding single-lane country roads in rural West Virginia sung about by John Denver. These country roads are notched out of the sides of mountains: on one side is a rock wall, on the other a steep drop-off. On these one-lane roads, I have met a humongous, fully laden logging truck coming toward me. We both slow to a crawl. I edge my car toward the precipice on one side while the logging truck hugs the mountain on the other. Our rear-view mirrors almost touching, we slowly make our way past each other. I cannot imagine a self-driving car handling these one-lane roads. Maybe someday, but not in the immediate future.

On the other hand, self-driving cars operate at near Level 5 in well-defined environments like interstate highways. The white lines on either side of the road act as guides, much like the rails that guide a train. The parallel dashed lines on the road define lanes. The sensors on self-driving cars track nearby vehicles to keep safe distances. And GPS aids in assessing location. The environment is well defined, and self-driving cars are able to drive fairly safely.
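To see why the well-marked case is tractable, consider a textbook sketch (my illustration, not Tesla’s algorithm): once a camera can measure how far the car sits from each painted line, staying centered reduces to a simple proportional correction. The pixel distances and gain below are invented for illustration.

```python
# Textbook proportional lane centering (illustrative only, not Tesla's code).
# left_px / right_px: camera-measured distances from the car's centerline
# to the left and right lane lines, in pixels.
def steering_correction(left_px: float, right_px: float, gain: float = 0.005) -> float:
    """Steering command in radians; positive steers right."""
    offset = (right_px - left_px) / 2.0  # > 0 when the car has drifted left
    return gain * offset

print(steering_correction(300, 300))  # centered in lane -> 0.0
print(steering_correction(250, 350))  # drifted left -> 0.25 (steer right)
```

No such trick exists for a one-lane mountain road with no lines to follow.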

There is a wide spectrum of environments between the extremes of the country roads of West Virginia and an interstate highway. One of the roles of AI in self-driving cars is to assess the point where the software should be turned off and control turned over to the driver. In the extreme, self-driving software could be limited to operating in only the safest environments.
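A minimal sketch of that handover decision might look like the following. Every signal name and threshold here is hypothetical, invented for illustration; Tesla’s actual criteria are not public in this form.

```python
# Hypothetical sketch of a handover decision: score how well-defined the
# current environment is and disengage below a threshold. The inputs and
# threshold are invented for illustration, not Tesla's actual criteria.
def should_hand_over(lane_lines_visible: bool,
                     gps_fix_ok: bool,
                     scene_confidence: float,
                     threshold: float = 0.9) -> bool:
    """Return True if control should be returned to the human driver."""
    if not (lane_lines_visible and gps_fix_ok):
        return True                      # environment no longer well defined
    return scene_confidence < threshold  # perception is unsure of the scene

# Interstate with clear markings: keep driving.
print(should_hand_over(True, True, 0.97))   # False
# Winding country road, faded lines: hand control back.
print(should_hand_over(False, True, 0.97))  # True
```

Raising the threshold shrinks the set of environments where the software stays engaged, which is exactly the restriction a recall can impose.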

Besides further restricting the environments in which it operates, the new Tesla software will also give additional warnings to drivers who are easily distracted from monitoring their vehicles.
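An escalating scheme of this general shape, sketched below with invented stages and timings, shows the idea; it is not Tesla’s implementation.

```python
# Illustrative escalating-warning scheme (stages and timings invented,
# not Tesla's): the longer the driver ignores the road, the louder the nag.
def warning_stage(seconds_hands_off: float) -> str:
    if seconds_hands_off < 5:
        return "none"
    if seconds_hands_off < 15:
        return "visual alert"        # message on the dashboard
    if seconds_hands_off < 30:
        return "audible chime"       # nag the driver
    return "disengage Autopilot"     # driver must retake control

for t in [2, 10, 20, 45]:
    print(t, "s hands-off ->", warning_stage(t))
```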

Inattentive drivers deserve to be nagged.

