3. Does Mt Rushmore contain no more information than Mt Fuji?
That is, does intelligent intervention increase information? Is that intervention detectable by scientific methods? In Define information before you talk about it, neurosurgeon Michael Egnor interviewed engineering prof Robert J. Marks on the way information, not matter, shapes our world (October 28, 2021). In the first portion, Egnor and Marks discussed questions like: Why do two identical snowflakes seem more meaningful than one snowflake? Then they turned to the relationship between information and creativity. Is creativity a function of more information? Or is there more to it? Now, they ask, does human intervention make any difference? Does Mount Rushmore have no more information than Mount Fuji?
This portion begins at 24:22 min. A partial transcript and notes, Show Notes, and Additional Resources follow.
Michael Egnor: Dr. Jeffrey Shallit, a mathematician at the University of Waterloo near Toronto, claims that Mount Rushmore doesn’t have any more information than Mount Fuji. I’d like to ask my guest today, Dr. Robert Marks, to answer that question.
Robert J. Marks: In terms of meaningful information, I think it’s obvious. Michael, they used to say that it doesn’t take a brain surgeon to answer this, or it doesn’t take a rocket scientist. Well, it turns out you’re a brain surgeon. And I’ve done work for NASA. And I got a NASA tech brief award. I guess that makes me a rocket scientist. So I think for both of us, the answer is obvious. Mount Rushmore contains more information than does Mount Fuji.
There’s more meaningful information on Mount Rushmore. There’s Lincoln and Roosevelt and Washington. And yeah, what do we get with Mount Fuji? We just get a big chocolate gumdrop.
Michael Egnor: Can we say what type of information the additional information on Mount Rushmore is?
Robert J. Marks: Yeah, this is an interesting question. I’m going to give an explanation, then dovetail into the answer. Consider two DVDs, both of which have the same storage capacity. One has the movie Braveheart. One has just random noise. And both of them take up the same number of bytes.
Robert J. Marks: Can we say that the DVD of Mel Gibson’s Braveheart has more information than the noise? Yes, absolutely, if you talk about meaningful information. As we talked about before, it depends on your definition of information. Certainly in the case of Shannon information, or possibly Kolmogorov information, yeah, they’re the same. But neither one of those measures meaning. And so one has to go to the mathematics of specified complexity, specifically algorithmic specified complexity.
And I’ll give a little pitch here, in case people want to read more about it. It’s in Chapter Seven of the book that I co-authored with design theorist William Dembski and Winston Ewert, called Introduction to Evolutionary Informatics.
And the cool part about the book is that it references a lot more nerdy papers that have been published in prestigious archival journals and conferences. So you can read it there at kind of a layperson’s level, or you can dig deeper and go into the papers.
Robert J. Marks: So I believe that Dr. Shallit was thinking about Shannon information in the sense that a DVD of Braveheart would contain the same information as a DVD of random noise. So if you took a picture of Mount Rushmore, and you took a picture of Mount Fuji, and you stored them on your camera, both of them might have the same file size, if you will. And in that sense, they are identical.
One of the problems that we talked about before is that people throw around the idea of information without really defining it. So I hope that by defining it that we’ve made this clear. And I think clearly in the context of the statement about Mount Rushmore containing more information than Mount Fuji, that we’re referring to meaningful information.
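The same-file-size point can be made concrete. Here is a minimal sketch (my illustration, not from the interview) using Python, with zlib compression as a rough, computable stand-in for description length: two byte strings of identical size, one highly patterned and one random, are indistinguishable by raw byte count but easily distinguished by how well they compress.

```python
import os
import zlib

# Two "DVDs" with identical storage capacity: 1 MB each.
SIZE = 1_000_000

# Highly patterned data, standing in for the structured frames of a film.
structured = b"FREEDOM! " * (SIZE // 9)
structured += b"x" * (SIZE - len(structured))  # pad to exactly SIZE bytes

# Pure random noise of the same size.
noise = os.urandom(SIZE)

# By raw byte count (the storage-capacity measure), the two are identical.
assert len(structured) == len(noise) == SIZE

# A compressor, a crude computable proxy for description length,
# tells them apart immediately.
print(len(zlib.compress(structured, 9)))  # a few kilobytes
print(len(zlib.compress(noise, 9)))       # close to 1 MB -- noise barely compresses
```

The point of the sketch: "same number of bytes on disk" says nothing about structure, which is why a storage-style comparison of Mount Rushmore and Mount Fuji photos misses the distinction the discussion is after.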
Michael Egnor: It’s kind of interesting that Dr. Shallit was saying that it wasn’t clear that Mount Rushmore had more information than Mount Fuji. And he said it using his blog, where he types letters and words that other people read! And there’s no question that his blog contains more information than either a blank screen or just a screen with random typing.
So even the very effort that he makes to deny that Mount Rushmore has more information than Mount Fuji is itself an example of something that has more information than something analogous to Mount Fuji.
Robert J. Marks: So him making a statement is actually a self-refuting argument…
Michael Egnor: And if these guys didn’t have self-refuting arguments, they wouldn’t have any arguments at all. Because everything they say is self-refuting.
You’ve referred to specified complexity. And what is that?
Robert J. Marks: Well, it’s built on Kolmogorov complexity. I’m going to get a little bit into the weeds here. Kolmogorov complexity is based on the shortest description length you can have of an object. The reason I really like Kolmogorov information theory is that it is the link to the physical idea of information. We know what mass is, we know what energy is, but what is information? What’s a physical link to information? I think that description length is a good example.
To illustrate, imagine that we have a three-dimensional printer and we want to write a program. All three-dimensional printers need programs in order to operate. We’re going to write one program that prints a bowling ball in three dimensions, then we’re going to write another program which generates a detailed bust of Abraham Lincoln, down to the detail of the wart on his right cheek and his shaved upper lip.
And the question is, given the two programs, one for the bowling ball and one for Abraham Lincoln, which program is going to be the longer one? It’s obviously the one for Abraham Lincoln, because, with a bowling ball, you just say: print a sphere and put three holes in it.
But with Lincoln, you would have to specify his lips, and the beard and the mole and his eyebrows and everything else, and it would be a much longer program. So therefore, the bust of Lincoln has more complexity than the bowling ball. And this is what Kolmogorov complexity measures in terms of information…
Now, the interesting part is that if you wrote a program to do a three-dimensional bust of Lincoln and I wrote a program to do a three-dimensional bust of Lincoln, one of our programs would be longer than the other one. So which one is the proper description length? Well, there must be a shortest program somewhere that generates the bust of Lincoln. The length of that shortest program is the Kolmogorov complexity of [the bust of] Lincoln. So this is Kolmogorov complexity, which is a component of specified complexity.
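The shortest-program idea can be shown in miniature (again, my toy sketch, not from the book). Below, two Python programs, written out as strings, print the same 2,000-character string. The Kolmogorov complexity of that string is at most the length of the shorter program, and a repetitive pattern permits a much shorter description, just as a sphere needs a shorter program than a bust of Lincoln.

```python
# The string both programs produce: "abab...ab", 2,000 characters.
target = "ab" * 1000

# Program 1: spell the string out literally (a long description).
program_1 = 'print("' + target + '")'

# Program 2: exploit the repetition (a short description).
program_2 = 'print("ab" * 1000)'

# Both programs print the same string...
# (strip the 'print(' prefix and ')' suffix, evaluate the expression inside)
assert eval(program_1[6:-1]) == eval(program_2[6:-1]) == target

# ...but the descriptions differ enormously in length.
print(len(program_1), len(program_2))  # 2009 18
```

One caveat worth keeping in mind: the true shortest program is uncomputable in general, so in practice one can only establish upper bounds on Kolmogorov complexity, for example via compressors.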
Next: Lincoln himself contained more information than a bust of Lincoln. But in information theory terms, how do we know?
Here are all the episodes in the series. Browse and enjoy:
- How information becomes everything, including life. Without the information that holds us together, we would just be dust floating around the room. As computer engineer Robert J. Marks explains, our DNA is fundamentally digital, not analog, in how it keeps us being what we are.
- Does creativity just mean Bigger Data? Or something else? Michael Egnor and Robert J. Marks look at claims that artificial intelligence can somehow be taught to be creative. The problem with getting AI to understand causation, as opposed to correlation, has led to many spurious correlations in data-driven papers.
- Does Mt Rushmore contain no more information than Mt Fuji? That is, does intelligent intervention increase information? Is that intervention detectable by science methods? With 2 DVDs of the same storage capacity — one random noise and the other a film (Braveheart, for example), how do we detect a difference?
- How do we know Lincoln contained more information than his bust? Life forms strive to be more of what they are. Grains of sand don’t. You need more information to strive than to just exist. Even bacteria, not intelligent in the sense we usually think of, strive. Grains of sand, the same size as bacteria, don’t. Life entails much more information.
- Why AI can’t really filter out “hate news.” As Robert J. Marks explains, the No Free Lunch theorem establishes that computer programs without bias are like ice cubes without cold. Marks and Egnor review worrying developments from large data harvesting algorithms — unexplainable, unknowable, and unaccountable — with underestimated risks.
- Can wholly random processes produce information? Can information result, without intention, from a series of accidents? Some have tried it with computers… Dr. Marks: We could measure in bits the amount of information that the programmer put into a computer program to get a (random) search process to succeed.
- How even random numbers show evidence of design Random number generators are actually pseudo-random number generators because they depend on designed algorithms. The only true randomness, Robert J. Marks explains, is quantum collapse. Claims for randomness in, say, evolution don’t withstand information theory scrutiny.
You may also wish to read:
Jeffrey Shallit, a computer scientist, doesn’t know how computers work. Patterns in computers only have meaning when they are caused by humans programming and using them. (Michael Egnor)
Show Notes
- 00:00:09 | Introducing Dr. Robert J. Marks
- 00:01:02 | What is information?
- 00:06:42 | Exact representations of data
- 00:08:22 | A system with minimal information
- 00:09:31 | Information in nature
- 00:10:46 | Comparing biological information and information in non-living things
- 00:11:32 | Creation of information
- 00:12:53 | Will artificial intelligence ever be creative?
- 00:17:40 | Correlation vs. causation
- 00:24:22 | Mount Rushmore vs. Mount Fuji
- 00:26:32 | Specified complexity
- 00:29:49 | How does a statue of Abraham Lincoln differ from Abraham Lincoln himself?
- 00:37:21 | Achieving goals
- 00:38:26 | Robots improving themselves
- 00:43:13 | Bias and concealment in artificial intelligence
- 00:44:42 | Mimetic contagion
- 00:50:14 | Dangers of artificial intelligence
- 00:54:01 | The role of information in AI evolutionary computing
- 01:00:15 | The Dead Man Syndrome
- 01:02:46 | Randomness requires information and intelligence
- 01:08:58 | Scientific critics of Intelligent Design
- 01:09:40 | The controversy between Darwinian theory and ID theory
- 01:15:07 | The Anthropic Principle
Additional Resources
- Robert J. Marks at Discovery.org
- Michael Egnor at Discovery.org
- Claude Shannon at Encyclopædia Britannica
- Andrey Kolmogorov at Wikipedia
- Spurious Correlations website
- Chapter 7 of: R.J. Marks II, W.A. Dembski, W. Ewert, Introduction to Evolutionary Informatics, (World Scientific, Singapore, 2017).
- Winston Ewert, William A. Dembski and Robert J. Marks II “Algorithmic Specified Complexity in the Game of Life,” IEEE Transactions on Systems, Man and Cybernetics: Systems, Volume 45, Issue 4, April 2015, pp. 584-594.
- Winston Ewert, William A. Dembski and Robert J. Marks II “On the Improbability of Algorithmically Specified Complexity,” Proceedings of the 2013 IEEE 45th Southeastern Symposium on Systems Theory (SSST), March 11, 2013, pp. 68-70.
- Winston Ewert, William A. Dembski and Robert J. Marks II “Measuring meaningful information in images: algorithmic specified complexity,” IET Computer Vision, 2015, Vol. 9, #6, pp. 884-894.