Mind Matters Reporting on Natural and Artificial Intelligence

Search Results: trolley problem


The “Moral Machine” Is Bad News for AI Ethics

Despite the recent claims of its defenders, there is no way we can outsource moral decision-making to an automated intelligence

Here’s the dilemma: The Moral Machine (the Trolley Problem, updated) feels necessary because the rules by which we order our lives are useless with automated vehicles. Laws embody principles that we apply. Machines have no mind with which to apply the rules. Instead, researchers must train them with millions of examples and hope the machine extracts the correct message… 


All AI’s Are Psychopaths

We can use them but we can’t trust them with moral decisions. They don’t care why

Building an AI entails moving parts of our intelligence into a machine. We can do that with rules, with (simplified) virtual worlds, with statistical learning… We’ll likely create other means as well. But as long as “no one is home”, that is, as long as the machines lack minds, gaps will remain, and those gaps, without human oversight, can put us at risk.


Will Self-Driving Cars Change Moral Decision-Making?

It’s time to separate science fact from science fiction about self-driving cars

Irish playwright John Waters warns of a time when we might have to grant moral discretion to computer algorithms, just as Christians now grant it to the all-knowing but often inscrutable decrees of God. Not likely.


There Is No Universal Moral Machine

The “Moral Machine” project aimed at righteous self-driving cars revealed stark differences in global values

Whatever the causes of cultural differences, Brendan Dixon thinks that the Moral Machine presents mere caricatures of moral problems anyway. “The program reduces everything to a question of who gets hurt. There are no shades of gray or degrees of hurt. It is, as is so often the case with computers, simply black or white, on or off. None of the details that make true moral decisions hard and interesting remain.”


How Can AI Help Us With What We Care About?

Instead of making us part of things we don’t care about?

Despite the misguided hype, AI is just another tool. So it is encouraging to read about the ways that Japanese firm Hitachi is using AI as a tool to provide services that would otherwise be difficult or unavailable.
