Mind Matters Natural and Artificial Intelligence News and Analysis

Death Spurs Demand for More Oversight of Self-Driving Cars

The National Transportation Safety Board seeks uniform standards for the previously voluntary information provided by carmakers

Despite the hype and a few bad actors, here at the Walter Bradley Institute, we believe in AI. Some of our Fellows have made major contributions to its development. But, while we are not Luddites, neither are we doe-eyed believers in “all things AI.”

That’s why we pay so much attention to oversight. The deployment of any technology—dams, bridges, buildings—requires care and, at times, oversight. Not to slow down progress but to protect us from ourselves. As the great scientist Richard Feynman put it, the easiest person to fool is oneself. We certify and license engineers, doctors, and even beauticians (though I cannot figure how a bad haircut endangers my life).

So, pause for a moment…

Would you drive on freeways filled with unlicensed teenagers?

Would you drive on that same freeway filled with unvalidated self-driving cars?

I would avoid both. Unfortunately, we’re close to the latter, as a recent Wired report on self-driving cars makes clear:

Exactly how many vehicles are testing, where they’re doing it, and how those cars are performing is mostly anyone’s guess. In many states, companies experimenting with autonomous vehicles don’t have to specify, and the federal government doesn’t keep track either.

Aarian Marshall, “Who’s Regulating Self-Driving Cars? Often, No One” at Wired

Part of the problem arises from how federal and state laws intersect. “Technically, regulating the testing of self-driving vehicles falls to the states. The federal government deals with stuff related to vehicle design” (Marshall, Wired). Tech firms exploit this gap. Arizona lies at the wild frontier, which explains why so much testing occurs there; other states are more stringent.

But no state demands even minimal evidence beyond what the companies voluntarily supply, nor does the National Highway Traffic Safety Administration (NHTSA). The NHTSA’s guidance reads like Waymo promotional material:

Automation technologies are new and rapidly evolving. The right approach to achieving safety improvements begins with a focus on removing unnecessary barriers and issuing voluntary guidance, rather than regulations that could stifle innovation.

NHTSA, “Preparing for the Future of Transportation”

The National Transportation Safety Board (NTSB) recently met to release its findings on the Uber self-driving car accident that killed Elaine Herzberg in March 2018.

One NTSB official summarized the current position:

“In my opinion, they’ve put technology advancement here before saving lives,” said NTSB member Jennifer Homendy, referring to the National Highway Traffic Safety Administration, which regulates motor vehicle safety. “There’s no requirement. There’s no evaluation. There’s no real standards issued.”

Aarian Marshall, “Who’s Regulating Self-Driving Cars? Often, No One” at Wired

Senator Tom Udall (D-New Mexico) observed at the subsequent meeting of the Senate Commerce Committee: “While I appreciate the potential benefits of autonomous vehicles, I remain concerned that humans will be used as test dummies.”

Self-driving car companies have consistently pushed against any mandatory oversight. But, to use a well-worn adage, that’s letting the fox guard the henhouse.

The NTSB agrees. They recommended that the NHTSA “force” companies to provide “their approach to safety, and to develop standards for evaluating that information.”

This is reasonable; it is not arduous. And it may very well save lives.

Unfortunately, it’s too late for Elaine Herzberg.

If you enjoyed this piece, you may want to look at some of Brendan Dixon’s other recent thoughts on self-driving cars:


Would selling self-driving cars sooner save lives? Not if we look more closely at the statistics. It’s enough to make you want to run out and buy a smart car today. But just a minute. There are other statistics out there. Let’s look at some of them.

Will industry pressure loosen self-driving car tests? Right now, the regulatory agency is under pressure to accept the industry’s “softball” testing suggestions.

and

Are self-driving cars really safer? A former Uber executive says no. Before we throw away the Driver’s Handbook… Current claims that self-driving cars are safer are hype, not measurement. Meanwhile, Congress is expected to push for legislation next month to pave the way for widespread use of self-driving vehicles without a consensus on safety standards.


Brendan Dixon

Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Brendan Dixon is a Software Architect with experience designing, creating, and managing projects of all sizes. His first foray into Artificial Intelligence was in the 1980s, when he built an Expert System to assist in the diagnosis of software problems at IBM. Since then, he’s worked both as a Principal Engineer and Development Manager for industry leaders, such as Microsoft and Amazon, and numerous start-ups. While he spent most of that time on other types of software, he’s remained engaged with and interested in Artificial Intelligence.
