Mind Matters Natural and Artificial Intelligence News and Analysis

Tag: Gödel's Second Incompleteness Theorem

Hands shaping brain model

Chalmers and Penrose Clash Over “Conscious Computers”

Philosopher Chalmers thinks computers could be conscious but physicist Penrose says no

Two authors I’ve been reading recently are Roger Penrose and David Chalmers. Penrose is a Nobel laureate in physics who stoked controversy by claiming in The Emperor’s New Mind: Concerning Computers, Minds and The Laws of Physics (1989) that the mind can do things beyond the ability of computers. Chalmers is a philosopher of mind who argues in The Conscious Mind: In Search of a Fundamental Theory (1996) that consciousness cannot be reduced to physical processes. Both thinkers are well respected in their fields, even though each articulates a position implying that the mind’s operation lies beyond current science. At the same time, they believe that there is a way to see the mind as part of nature (that is,…


#3 AI, We Are Now Told, Knows When It Shouldn’t Be Trusted!

Gödel's Second Incompleteness Theorem says that no consistent formal system powerful enough to do basic arithmetic can prove its own consistency; in other words, a system that can reliably tell you that things are true or false cannot tell you that it itself is reliable.
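For readers who want the precise version behind that informal gloss, the theorem is usually stated along these lines (a sketch of the standard formulation, not a quotation from the podcast):

```latex
% Gödel's Second Incompleteness Theorem (standard statement).
% Let T be a consistent, recursively axiomatizable theory that
% interprets elementary arithmetic (e.g., Peano Arithmetic), and
% let Con(T) be the arithmetic sentence encoding "T is consistent."
\text{If } T \text{ is consistent, then } T \nvdash \mathrm{Con}(T).
```

That is, the consistency statement Con(T) can be written in T's own language, yet T cannot prove it; only a stronger system (whose own reliability is again unprovable from within) can do so.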

Okay, so, in #4, we learned that Elon Musk’s fully self-driving car won’t be on the road any time soon. What about the AI that knows when it shouldn’t be trusted? (As if anyone does!) Our nerds here at the Walter Bradley Center have been discussing the top twelve AI hypes of the year. Our director Robert J. Marks, Eric Holloway, and Jonathan Bartlett talk about overhyped AI ideas (from a year that saw major advances, along with the inevitable hype). From the AI Dirty Dozen 2020 Part III, here’s #3: AI that knows when it shouldn’t be trusted: https://episodes.castos.com/mindmatters/Mind-Matters-115-Jonathan-Bartlett-Eric-Holloway.mp3 Our story begins at 12:57. Here’s a partial transcript. Show Notes and Additional Resources follow, along with a link…