Elon Musk: You Are Liable for My Malfunctioning Code!
He hopes to put the blame for self-driving mishaps in parking lots on customers
Elon Musk has always been at the center of wild claims about self-driving cars. He has claimed that next year (2020) his company will have robotaxis on the road. Remember that, for Elon Musk, a robotaxi will have 100% autonomy; it can pick people up and drive them where they need to go. In fact, Musk has said that Tesla owners should consider their vehicles to be assets that appreciate in value: In the future, Tesla owners can add their cars to his taxi service and earn money while they sleep.
At Mind Matters News, we have criticized his approach on numerous grounds. One problem that keeps getting left out is this: who assumes moral responsibility and legal liability for self-driving vehicles? As the autonomy Musk implements in his vehicles expands, he apparently is finally starting to grasp the moral and liability implications of what he is doing. While better late than never, his response to recognizing his liability problem is quite disheartening: he is going to put the blame for his mistakes on his customers.
“Enhanced Summon” is Tesla’s name for a technology by which the cars unpark themselves and find their owner in a parking lot. The current implementation of this feature is ridiculous (it would be faster and easier to walk to your car in every situation). However, Musk says that new versions will be able to navigate tight, difficult parking lots.
There’s another problem with it, though. Despite the fact that it will be Musk’s software—and not the Tesla owner—doing the driving, Musk has said that the owner will be responsible for any damage caused by using his software:
Yes, but owner will have to accept tiny risk of damage. Those are very hard even for careful human drivers.— Elon Musk (@elonmusk) July 13, 2019
It seems that Musk has finally taken to heart what the Evangelical Statement of Principles of Artificial Intelligence says: “We deny that humans can or should cede our moral accountability or responsibilities to any form of AI that will ever be created.” Recognizing that he can’t cede his moral responsibility to his machines, he has apparently decided to redirect the moral responsibility for his own actions to his customers instead.
In short, Musk is promising to turn his owners’ overpriced cars into robotaxis next year, but when his software hits a challenge, he claims that Tesla owners will assume all liability for damage caused by its failures. Are we ready for the physical, legal, and moral nightmares that such thinking will lead to? Is he?
Also: by Jonathan Bartlett: Need cash fast? Just pretend that you wrote software
Who assumes moral responsibility for self-driving cars? Can we discuss this before something happens and everyone is outsourcing the blame?
Further reading: Are Tesla’s robot taxis a phantom fleet? What’s behind Elon Musk’s sudden wild taxi adventure?
News from the real world of self-driving taxis: Not yet. Waymo includes a human in all their “robotaxis,” just in case, because the vehicles (at last report) were still confounded by common conditions. (Brendan Dixon)
The real future of self-driving cars is better human drivers. Manufacturers are improving safety by incorporating warning systems developed for self-driving cars into conventional models (Brendan Dixon)
Featured image: Self-driving car inside view/scharfsinn86, Adobe Stock