Mind Matters Natural and Artificial Intelligence News and Analysis

Tag: Trolley problem


Artificial Ethics May Make Poor Choices

Whether or not AI can become powerful enough to follow its own rules is still an open question

We’ve all heard about computers that make poor ethical choices. One of the most memorable is HAL 9000 in the 1968 classic 2001: A Space Odyssey. In the film, HAL kills four humans and attempts to kill a fifth. The concurrently written novel elaborates on HAL’s murderous plans, explaining that they stemmed from HAL’s inability to resolve an ethical dilemma: lie to the humans or kill them (and thus no longer be forced to lie to them). Poor HAL 9000! If only people had developed a new field of academic inquiry in time to help him (or should we say, “it”?) make better fictional ethical choices! Putting aside Hollywood’s imaginary universes, the real need for the new…


The “Moral Machine” Is Bad News for AI Ethics

Despite the recent claims of its defenders, there is no way we can outsource moral decision-making to an automated intelligence

Here’s the dilemma: The Moral Machine (the Trolley Problem, updated) feels necessary because the rules by which we order our lives are useless with automated vehicles. Laws embody principles that we apply. Machines have no mind with which to apply the rules. Instead, researchers must train them with millions of examples and hope the machine extracts the correct message…

Read More ›

Will Self-Driving Cars Change Moral Decision-Making?

It’s time to separate science fact from science fiction about self-driving cars

Irish playwright John Waters warns of a time when we might have to grant moral discretion to computer algorithms, just as Christians now grant it to the all-knowing but often inscrutable decrees of God. Not likely.

Read More ›