How Information Becomes Everything, Including Life
Without the information that holds us together, we would just be dust floating around the room. In “Define information before you talk about it,” neurosurgeon Michael Egnor interviewed engineering professor Robert J. Marks on the way information, not matter, shapes our world (October 28, 2021). In the first portion, Egnor and Marks discuss questions like: Why do two identical snowflakes seem more meaningful than one snowflake?
This portion begins at 01:02 min. A partial transcript and notes, Show Notes, and Additional Resources follow.
Michael Egnor: I know that information is a topic in which you have a strong professional interest and a great deal of professional expertise. Probably the best way to start is to ask: What is information?
Robert J. Marks: It turns out that before talking about information, you really have to define it. For example, if I burn a book to ashes, and scatter the ashes around, have I destroyed information?
Does it make any difference if there’s another copy of the book? If I take a picture — everybody knows the pictures on your cell phones require so many megabytes of storage — am I creating information? The answers depend on your definition of information.
If I’m given a page that has Japanese text on it, and I don’t read Japanese, does it have less information for me than it does for somebody who reads Japanese natively?
Claude Shannon recognized that there were different definitions, in the mathematical sense, of information. And he said, “It seems to me that we all define information as we choose. And depending on what field we are working in, we will choose different definitions. My own model of information theory was framed precisely to work with the problem of communication.”
Robert J. Marks: Claude Shannon, of course, was the guy who wrote the 1948 paper that defined information. It was the first paper to use the word bit, a contraction of binary digit. And he laid the foundation for the communication networks that we use today on our cell phones. He was an extraordinary man. The work that he did in founding Shannon information has had more profound effects on our lives than, say, the work of Einstein. He was just a really brilliant guy who worked for Bell Labs.
Robert J. Marks: So to answer your question, what is the definition of information, one has to go to the different models of information that there are.
There is the Shannon model of information. Shannon modeled information as bits and was interested in how we use information to communicate.
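To make “information measured in bits” concrete, here is a minimal Python sketch of Shannon’s self-information (an illustration added for clarity, not something from the interview): an outcome with probability p carries −log2 p bits, so a fair coin flip carries exactly one bit.

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information of an outcome with probability p, in bits."""
    return -math.log2(p)

# A fair coin flip carries exactly one bit; rarer outcomes carry more.
print(self_information(1 / 2))   # 1.0 bit
print(self_information(1 / 8))   # 3.0 bits
print(self_information(1 / 26))  # ~4.7 bits (one letter drawn uniformly from A to Z)
```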
There’s also something called Kolmogorov information. Andrey Kolmogorov (1903–1987) was a Russian scientist. This approach to information was founded by three people: Kolmogorov, Chaitin, and Solomonoff. Kolmogorov’s name became attached to it because of the Matthew principle (“to him who has, more will be given”), since Kolmogorov was the most famous of these mathematicians at the time. It has to do with structure and the description length of an object.
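Kolmogorov description length is uncomputable in general, but compressed file size is a common rough stand-in. The following Python sketch (an editorial illustration, not from the interview) shows the idea: a highly patterned string has a very short description, while random bytes have no description much shorter than themselves.

```python
import os
import zlib

def description_length_bits(data: bytes) -> int:
    """Compressed size in bits: a crude, computable stand-in for
    Kolmogorov description length, which is itself uncomputable."""
    return 8 * len(zlib.compress(data, 9))

structured = b"AB" * 5000        # short description: "repeat 'AB' 5000 times"
random_ish = os.urandom(10000)   # admits no description much shorter than itself

print(description_length_bits(structured))  # small: highly compressible
print(description_length_bits(random_ish))  # roughly 80,000 bits: incompressible
```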
Then there’s the physical definition. A great physicist, Rolf Landauer, said that all information is physical. Now, I would argue with that. I think that all information is physical only if you constrain yourself to the physics definition. And this is in total contrast to the founder of cybernetics, Norbert Wiener (1894–1964), one of the great names. He is famous for saying that information is information; it is neither matter nor energy.
We can think of, for example, information written in a book, on the printed page. That’s information etched on matter.
But we also know that information can be etched on energy, in the form of the cell phone signals that you receive. And so information is neither matter nor energy, but both can be media on which information is placed and represented.
And then there’s the fourth type of information (the three so far are Shannon information, Kolmogorov information, and physical information). The fourth is specified complexity — specifically algorithmic specified complexity.
The models of information that I have just shared with you don’t really measure meaning. The purpose of specified complexity — specifically, the mathematics of algorithmic specified complexity — is to measure, in bits, the meaning of an object…
Note: Specified complexity: “A single letter of the alphabet is specified without being complex (i.e., it conforms to an independently given pattern but is simple). A long sequence of random letters is complex without being specified (i.e., it requires a complicated instruction-set to characterize but conforms to no independently given pattern). A Shakespearean sonnet is both complex and specified.” – William Dembski, Metanexus
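In the algorithmic specified complexity papers listed under Additional Resources, the measure is, roughly, an object’s improbability under a chance hypothesis minus its conditional description length given a context: ASC(x, C, P) = −log2 P(x) − K(x | C). The Python sketch below is only a rough illustration of that idea; it drops the context and uses zlib compression as a computable stand-in for the uncomputable Kolmogorov term.

```python
import os
import zlib

def asc_estimate(x: bytes, log2_p: float) -> float:
    """Rough estimate of algorithmic specified complexity, in bits:
    approximately -log2 P(x) - K(x), with compressed size standing in for
    the uncomputable Kolmogorov term K(x). Large positive values flag objects
    that are both improbable under the chance hypothesis and simple to describe."""
    shannon_surprise = -log2_p                         # improbability under chance, in bits
    description_bits = 8 * len(zlib.compress(x, 9))    # crude specification cost
    return shannon_surprise - description_bits

def log2_p_uniform(s: bytes) -> float:
    """Chance hypothesis used in this example: every byte drawn uniformly at random."""
    return -8.0 * len(s)

meaningful = b"to be or not to be, that is the question " * 100
random_ish = os.urandom(len(meaningful))

print(asc_estimate(meaningful, log2_p_uniform(meaningful)))  # strongly positive
print(asc_estimate(random_ish, log2_p_uniform(random_ish)))  # near zero or slightly negative
```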
Michael Egnor: If you consider the information that is put into a system by an intelligent agency, is that one of those particular kinds of information? Or could it be any kind of information?
Robert J. Marks: It does turn out that one of the challenges with naturalistic processes is that they cannot be creative. Therefore, an act that is creative requires external information to be infused into the process in order to guide whatever is being designed to its final design.
For example, in biology, information is everywhere. Shannon information is there. Kolmogorov information is there, because the description of the human body would take volumes to write down. Specified complexity, that is, algorithmic specified complexity, is also there, because the body and what it does carry a lot of meaning.
Why biology uses digital, not analog information
Robert J. Marks: So I think in all cases, the relevance to biology is really significant… Shannon showed that if you had a digital representation, then you could communicate exactly. [By contrast] Continuous or analog computing or processes degrade. If you took a photocopy of a picture of your mother, then a photocopy of the photocopy, then a photocopy of that, and so on, in about 10 or 12 generations your copy would look nothing like the original.
One of the beautiful things about all of creation is that it uses DNA, which is digital. And we know we can take a digital image of your mother and send it to your wife, who sends it to her sister, who sends it to her son. And each one of these pictures is exactly the same; there is no degradation. So there is a beauty in the fact that our reproduction is guided by a digital process. This is the reason, for example, that we no longer use VHS tape, which was an analog, or continuous, process; we went to DVDs and ultimately to digital streaming. And it’s the same reason we no longer use cassettes; we went to CDs and ultimately to streaming music on Spotify, which is all done digitally. So it’s a wonderful testimony to biological design that we are fundamentally digital in terms of our reproduction.
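The photocopy-of-a-photocopy point can be simulated directly. In the Python sketch below (an editorial illustration, not part of the interview), an analog copy accumulates a little noise with every generation, while a digital copy snaps each value back to 0 or 1 after each generation, so the noise never accumulates.

```python
import random

def analog_copy(signal, noise=0.02):
    """An analog generation adds a little noise that is never removed."""
    return [s + random.gauss(0, noise) for s in signal]

def digital_copy(signal, noise=0.02):
    """A digital generation adds the same noise, then snaps each value back to 0 or 1."""
    return [round(s + random.gauss(0, noise)) for s in signal]

original = [random.randint(0, 1) for _ in range(1000)]  # a 1000-bit "photograph"
analog, digital = list(original), list(original)
for _ in range(12):                                     # 12 generations of copies
    analog, digital = analog_copy(analog), digital_copy(digital)

analog_error = sum(abs(a - o) for a, o in zip(analog, original)) / len(original)
digital_error = sum(d != o for d, o in zip(digital, original)) / len(original)
print(f"analog drift after 12 copies:   {analog_error:.3f}")   # grows every generation
print(f"digital errors after 12 copies: {digital_error:.3f}")  # stays at 0.000 while noise is below the 0/1 threshold
```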
Michael Egnor: Wow. Sometimes when it’s difficult to define or understand a concept, it’s helpful to imagine its absence. What would characterize a system that had minimal information?
Robert J. Marks: Well, this dates back to the mathematician Jacob Bernoulli (1655–1705), who came up with Bernoulli’s principle of insufficient reason. His concept was that if we have no a priori information about anything, the best we can do is assume that everything is uniformly distributed. It’s a very useful principle. But in terms of absent information, if everything followed Bernoulli’s principle, we would just be dust, kind of spread around.
Note: “Jakob Bernoulli’s pioneering work Ars Conjectandi (published posthumously, 1713; “The Art of Conjecturing”) contained many of his finest concepts: his theory of permutations and combinations; the so-called Bernoulli numbers, by which he derived the exponential series; his treatment of mathematical and moral predictability; and the subject of probability—containing what is now called the Bernoulli law of large numbers, basic to all modern sampling theory. His works were published as Opera Jacobi Bernoulli, 2 vol. (1744).” – Britannica
Robert J. Marks: Mathematically, Bernoulli’s principle is exactly the same as maximum entropy. And we know that maximum entropy is what spreads the gas around the room. So if there was no organization, if there was no informational structure, that’s where we would go, according to Bernoulli and to thermodynamics.
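The equivalence Marks mentions can be checked numerically: over a fixed set of outcomes, the uniform distribution that the principle of insufficient reason prescribes is exactly the distribution with maximum Shannon entropy. A brief Python sketch, added here for clarity:

```python
import math

def entropy_bits(probs) -> float:
    """Shannon entropy of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Principle of insufficient reason: with no prior information about a
# six-sided die, assume all faces are equally likely.
uniform = [1 / 6] * 6
loaded = [0.5, 0.3, 0.1, 0.05, 0.03, 0.02]

print(entropy_bits(uniform))  # ~2.585 bits, the maximum possible (log2 of 6)
print(entropy_bits(loaded))   # ~1.83 bits: any structure or bias lowers the entropy
```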
Michael Egnor: What fascinates me, I believe, is not only that living things quite obviously contain enormous amounts of information, but that even the ordinary laws of nature manifest information and manifest an intelligent cause. I think you can see information in snowflakes, in the laws of physics, in all sorts of things.
Robert J. Marks: Yes, exactly. In fact, one of the interesting things is that things with high Shannon information, or even high Kolmogorov information, happen all the time. We know, for example, that the generation of a single snowflake, as you mentioned, requires a lot of information in terms of either bits or Kolmogorov description length…
I think a more interesting question is: What is the meaning of two identical snowflakes? Then all of a sudden you get into the idea of meaning — two identical snowflakes have a greater meaning than a single observation of a single snowflake.
Next: How does information relate to creativity?
Here are all the episodes in the series. Browse and enjoy:
- How information becomes everything, including life. Without the information that holds us together, we would just be dust floating around the room. As computer engineer Robert J. Marks explains, our DNA is fundamentally digital, not analog, in how it keeps us being what we are.
- Does creativity just mean Bigger Data? Or something else? Michael Egnor and Robert J. Marks look at claims that artificial intelligence can somehow be taught to be creative. The problem with getting AI to understand causation, as opposed to correlation, has led to many spurious correlations in data-driven papers.
- Does Mt Rushmore contain no more information than Mt Fuji? That is, does intelligent intervention increase information? Is that intervention detectable by scientific methods? With two DVDs of the same storage capacity, one containing random noise and the other a film (Braveheart, for example), how do we detect a difference?
- How do we know Lincoln contained more information than his bust? Life forms strive to be more of what they are. Grains of sand don’t. You need more information to strive than to just exist. Even bacteria, not intelligent in the sense we usually think of, strive. Grains of sand, the same size as bacteria, don’t. Life entails much more information.
- Why AI can’t really filter out “hate news.” As Robert J. Marks explains, the No Free Lunch theorem establishes that computer programs without bias are like ice cubes without cold. Marks and Egnor review worrying developments from large data harvesting algorithms — unexplainable, unknowable, and unaccountable — with underestimated risks.
- Can wholly random processes produce information? Can information result, without intention, from a series of accidents? Some have tried it with computers… Dr. Marks: We could measure in bits the amount of information that the programmer put into a computer program to get a (random) search process to succeed.
- How even random numbers show evidence of design. Random number generators are actually pseudo-random number generators because they depend on designed algorithms. The only true randomness, Robert J. Marks explains, is quantum collapse. Claims for randomness in, say, evolution don’t withstand information theory scrutiny.
You may also wish to read: How information realism subverts materialism. Within informational realism, what defines things is their capacity for communicating or exchanging information with other things.
Show Notes
- 00:00:09 | Introducing Dr. Robert J. Marks
- 00:01:02 | What is information?
- 00:06:42 | Exact representations of data
- 00:08:22 | A system with minimal information
- 00:09:31 | Information in nature
- 00:10:46 | Comparing biological information and information in non-living things
- 00:11:32 | Creation of information
- 00:12:53 | Will artificial intelligence ever be creative?
- 00:17:40 | Correlation vs. causation
- 00:24:22 | Mount Rushmore vs. Mount Fuji
- 00:26:32 | Specified complexity
- 00:29:49 | How does a statue of Abraham Lincoln differ from Abraham Lincoln himself?
- 00:37:21 | Achieving goals
- 00:38:26 | Robots improving themselves
- 00:43:13 | Bias and concealment in artificial intelligence
- 00:44:42 | Mimetic contagion
- 00:50:14 | Dangers of artificial intelligence
- 00:54:01 | The role of information in AI evolutionary computing
- 01:00:15 | The Dead Man Syndrome
- 01:02:46 | Randomness requires information and intelligence
- 01:08:58 | Scientific critics of Intelligent Design
- 01:09:40 | The controversy between Darwinian theory and ID theory
- 01:15:07 | The Anthropic Principle
Additional Resources
- Robert J. Marks at Discovery.org
- Michael Egnor at Discovery.org
- Claude Shannon at Encyclopædia Britannica
- Andrey Kolmogorov at Wikipedia
- Spurious Correlations website
- Chapter 7 of: R.J. Marks II, W.A. Dembski, W. Ewert, Introduction to Evolutionary Informatics (World Scientific, Singapore, 2017).
- Winston Ewert, William A. Dembski, and Robert J. Marks II, “Algorithmic Specified Complexity in the Game of Life,” IEEE Transactions on Systems, Man and Cybernetics: Systems, Vol. 45, No. 4, April 2015, pp. 584-594.
- Winston Ewert, William A. Dembski, and Robert J. Marks II, “On the Improbability of Algorithmically Specified Complexity,” Proceedings of the 2013 IEEE 45th Southeastern Symposium on Systems Theory (SSST), March 11, 2013, pp. 68-70.
- Winston Ewert, William A. Dembski, and Robert J. Marks II, “Measuring meaningful information in images: algorithmic specified complexity,” IET Computer Vision, Vol. 9, No. 6, 2015, pp. 884-894.