
Marks: Computers Only Compute and Thinking Needs More Than That

Robert J. Marks talks about his new book, Non-Computable You, with Oregon-based talk show host Bill Meyer

Recently, Bill Meyer interviewed Walter Bradley Center director Robert J. Marks on his Oregon-based talk show about “Why computers will never understand what they are doing,” in connection with Marks’s new book, Non-Computable You: What You Do That Artificial Intelligence Never Will (Discovery Institute Press, 2022). We are rebroadcasting it here with permission as Episode 194. Meyer began by saying, “I started reading a book over the weekend that I am going to continue to eagerly devour because it cut against some of my preconceived notions”:

A partial transcript, notes, and Additional Resources follow.

Meyer and Marks began by discussing the recent flap at Google where software engineer Blake Lemoine claimed that the AI he was working with was sentient, like a human being. Google dismissed the claim out of hand and put him on leave. Who’s right?

Robert J. Marks: Oh, my goodness. There are so many ways to push back on that claim and it’s hard to choose which one to go down. We can explore one of them if you’d like to, why that software is not sentient, why it doesn’t understand what it’s doing, for example.

Computers can add numbers, like 12 and 13. But they don’t understand what the numbers 12 and 13 are… I think in order to be sentient, you need to understand what you’re talking about. The argument goes back to a philosopher named John Searle, who didn’t know Chinese.
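
Note: The “adding without understanding” point can be made concrete in a few lines of Python (our illustration, not from the interview). The machine turns 12 and 13 into 25 by mechanically rewriting bit patterns according to fixed rules; nothing in the procedure involves knowing what the quantities mean.

```python
# A sketch of machine addition as pure rule-following: bits are shuffled
# according to fixed rules, with no notion of "twelve" or "thirteen."

def add(a: int, b: int) -> int:
    """Add two non-negative integers using only bitwise operations."""
    while b != 0:
        carry = (a & b) << 1  # positions where both bits are 1 generate a carry
        a = a ^ b             # sum of the bits, ignoring carries
        b = carry
    return a

print(add(12, 13))  # prints 25
```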

The Chinese Room experiment:

Robert J. Marks: That’s exactly the same thing that’s happening with the Google robot. The software has looked at millions and millions of files, including, I would suppose, all of Wikipedia, plus some. They have done correlations, word relationships, and things of that sort. And so in the background, there’s a bunch of number crunching and that number crunching is going to spit out an answer. That answer is going to look like it means something … That computer has absolutely no idea why it responded. It has no understanding of what it did or what it’s saying.
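
Note: What “correlations and number crunching” means can be sketched, at its very simplest, in a few lines of Python (the toy corpus and function names below are invented for illustration and say nothing about the scale or design of Google’s actual system). The program “answers” by counting which word most often follows which; at no point does it grasp what any word means.

```python
# A toy next-word predictor built purely from word co-occurrence counts.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat ate the fish".split()

# Tally which word follows which, and how often.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the word most often observed after `word` in the toy corpus."""
    if word not in following:
        return "<unknown>"
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" -- chosen by frequency alone, not by meaning
```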

Bill Meyer: Everything is algorithmic because everything is computational within the computer. Is that the short way of putting it?

Robert J. Marks: Yes. In fact, we have known since way back in the 1930s that there are things that are non-computable. In the movie The Imitation Game, Alan Turing, the founder of computer science, was played by Benedict Cumberbatch. Turing and his team were the ones who cracked the Enigma code and helped win World War II.

But Alan Turing was also a mathematical genius. He was able to show back in the 1930s that there were things which are definitely not computable. Now this was not something which was conjecture. This was a mathematical fact.

Robert J. Marks: One of his first papers was on numbers which were non-computable. Then he went on to show some other things, such as the Turing halting problem. Since then, a number of other things have been shown to be non-computable.
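
Note: Turing’s halting-problem argument can be sketched in a few lines of Python (an illustration of the reasoning, not a formal proof). Assume a universal halt-checker exists; the program below is constructed to contradict whatever that checker predicts about it, so no such checker can exist.

```python
# Sketch of the halting-problem contradiction.

def halts(program, data) -> bool:
    """Hypothetical universal halt-checker; Turing showed none can exist."""
    raise NotImplementedError

def contrarian(program):
    """Do the opposite of whatever halts() predicts about program(program)."""
    if halts(program, program):
        while True:   # halts() said "it stops," so loop forever
            pass
    return            # halts() said "it loops forever," so stop at once

# Feeding contrarian to itself makes halts() wrong either way:
# contrarian(contrarian) halts exactly when halts() says it does not.
```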

Now, if a computer can’t compute something, you have to ask the question: are there things that humans do that are also not computable? And the answer, which is discussed more deeply in the book, is yes.

(Non-Computable You (Discovery Institute Press, 2022) by Robert J. Marks is available here.)

Now there are the obvious ones, such as love, empathy, compassion, and anger. I don’t think that those will ever be duplicated in a computer. But even more important is the idea that we just talked about. Computers will never understand. They will never experience sentience, and they will never be creative. These are things which are brick walls that artificial intelligence will never go through. Now, artificial intelligence is doing incredible things. We certainly don’t want to diminish the accomplishment.

Bill Meyer: Certainly. I was hoping maybe you could touch on here briefly, if you could, Dr. Marks, the difference between artificial intelligence and artificial general intelligence? Because you do talk about this, AI and AGI…

Robert J. Marks: I think in terms of the media, artificial intelligence is anything that a computer can do which you look at and say, “Oh, gee whiz. That’s amazing.” That’s a good way to talk about it since the common denominator there is that everything is being done by a computer.

Robert J. Marks: Artificial general intelligence is the belief — and it’s actually a faith, there’s actually an AI church around this, believe it or not — that artificial intelligence will some way and someday duplicate everything that humans can do.

Now, if the premise that there are non-computable things that humans do [is correct], then this will never be achieved. I like to say that artificial intelligence is written in computer code like Python and C++ and all these other esoteric languages; AGI, or artificial general intelligence, is mostly written in PowerPoint slides and news releases. We don’t see any indication that artificial general intelligence will ever happen. It’ll never understand. It’ll never be sentient. It will never be creative…


Bill Meyer: What do you believe, Dr. Marks, is the source of that non-computable side of humanity?

Robert J. Marks: Well, we’re getting above and beyond computer science and more into the area of philosophy… That’s the mind–brain problem. In terms of humans, the question is: is the mind the same as the brain? This debate has been going on for years now. If one is a materialist and believes that everything can be described by natural laws and equations and things of this sort, you have no other place to go than artificial general intelligence. In other words, we’re all a bunch of meat computers. Yeah. Everything could be done algorithmically.

Bill Meyer: Are these the same people that think that you can literally take the human brain and upload everything about it into a computer?

Robert J. Marks: Yes. That is really curious. Since part of you is non-computable, the non-computable part of you will never be uploaded to a computer. So only the computable part of you can be uploaded to a computer. I tell you, just the computable you is pretty boring.

Bill Meyer: Just the computable you. I really like that. This is a fascinating book and it really got me thinking and also learning some words that I had never heard of before. I was hoping you could define one of them… Qualia. You say this is something that artificial intelligence just is not capable of.

Robert J. Marks: Well, qualia deals with the perceptions that you have from your senses. When you bite into a lemon, you have a certain taste. When you see the color red, you see a certain color. When you feel pain, there’s a certain experience that you have.

Books have been written about “red,” a classic example of qualia.

Let’s go through a thought experiment. If you look around your room, you can probably see something that’s red. And if you look at that redness for a second, you are experiencing something. You are experiencing red. Now, Bill, you and I can talk about red because we’ve both experienced red. But imagine explaining red, your experience, to a person that’s been blind since birth.

Bill Meyer: That’d be next to impossible to explain.

Robert J. Marks: You could explain the wavelength. You could say blood is red. You could give all sorts of examples, but duplicating that experience in the blind man through just talking to him is never going to happen. So how are you going to write an algorithm, a computer program to have a computer experience the qualia of red? You’re never going to be able to duplicate that in a computer.

Bill Meyer: Or the taste of a lemon…

Robert J. Marks: Yes. And a computer will only do what it’s programmed to do.

Bill Meyer: Okay. What do you think happens, then, as artificial intelligence increases in complexity to the point where it begins to program itself, which is already happening? I’m wondering if that is not a form of consciousness, ultimately.

Robert J. Marks: Well, there’s an assumption that artificial intelligence will be creative because it has to be creative to write something that wasn’t intended by its original programmer. Here, you have to go back to the definition. What does creative mean?

Creativity, as defined by a guy named Selmer Bringsjord at Rensselaer, follows something called the Lovelace test. Does the computer program do something which is beyond the expectations or beyond the intent of the programmer?


Now, this doesn’t mean you can’t be surprised. Computer programs surprise us all the time. You might get unexpected results, but it can all be traced back to the input and the creativity of the computer programmer. If indeed the computer program is limited to the creativity of the programmer, it is never going to create artificial intelligence which is better than it is. To date, there has been no computer software that has passed the so-called Lovelace test of creativity. So AI writing better AI is never going to happen, in accordance with the Lovelace test definition of creativity.

Bill Meyer: Is there a possibility though that, as computing power increases — in spite of the fact that it may not know what it’s doing as far as we’re concerned — we don’t know the difference? We can’t detect it.

Robert J. Marks: Well, I think the computers can simulate a lot. I don’t know if you’ve seen the movie A.I., but this little boy robot was just incredible, a humanoid form. He was standing there and there was this love button that you pushed. The mother pushed the love button because she wanted a little boy. The little boy was played by Haley Joel Osment … just an incredible child actor. All of a sudden he went from totally emotionless to an expression of love, snuggling and hugging her. It was just amazing to watch that transition. But the fact that he did that, does that mean that he was experiencing love, or was it all computing being done under the hood?

Bill Meyer: Yes. When humans fall in love, is it a mathematical computation that we’re engaging in?

Robert J. Marks: And I maintain that’s non-computable. You can program a computer to say, “I love you.” Or you can write a computer program to show empathy, for example, but it doesn’t mean that it’s showing love. It doesn’t mean that it’s experiencing empathy.

Bill Meyer: Elon Musk and, I think, others like Henry Kissinger and Stephen Hawking are big fans of artificial intelligence. But why do you think they’re wrong about this?

Note: Prominent twentieth-century political scientist and diplomat Henry Kissinger thinks that humans must change to adapt to AI. Cosmologist Stephen Hawking (1942–2018), Astronomer Royal Martin Rees, and self-driving car entrepreneur Elon Musk have all predicted artificial intelligences that can outdo humans.

Robert J. Marks: Well, the interesting thing is that a colleague of Stephen Hawking, Roger Penrose, who won the Nobel Prize in physics last year — just a brilliant, brilliant man — agrees with me. He agrees that there are things within the human that are non-computable.

In fact, he wrote this wonderful book, which influenced me a lot. It was called The Emperor’s New Mind, which outlines some of the thoughts that I’m talking about here.

But a lot of these people, including Elon Musk and Stephen Hawking, come to this problem from a totally materialistic point of view, which is that everything that exists can be explained by science. I think a sub-paragraph of that is that, if that’s the case, then we are computers made out of meat and everything we do in our mind is computable. And I challenge that. I believe that Roger Penrose challenges that. The CEO of Microsoft, Satya Nadella, challenges that in his biography. There are a number of people who challenge the idea that we are 100% computable.

So it comes from one’s ideology. If you are a firm materialist and you believe everything has to be described by mathematics and physics, well, then you are inescapably committed to the idea of artificial general intelligence occurring eventually.

Bill Meyer: All right, Dr. Marks, aren’t we just dancing around the subject of the human soul or the spirit? Isn’t that really what we’re dancing around when we talk about what’s non-computable?

Robert J. Marks: Here we’re getting into theological topics, which I guess is fine. I am a Christian and I do believe that there is something external to the brain. I think that we have evidence of this. We’re starting to get evidence from neuroscience… out-of-body experiences and such, which are now being documented more than ever. There is something there beyond the brain. Indeed, this is something which goes back to René Descartes. So this is not a new thing, but it’s something which has been around for a long time. We’re starting to get evidence that indeed the mind is greater than the brain.


You may also wish to read:

Marks: Forget the hype, “thinking machines” can’t replace humans. It’s easy to picture, especially if we don’t know much about computers. And fears are easily exploited. But what are the facts? Computer engineer Robert J. Marks, author of Non-Computable You (2022), discusses the limitations of computing in a just-released video.

Additional Resources

Podcast Transcript Download

