
Could Our Minds Be Bigger Than Even a Multiverse?

The relationship between information, entropy, and probability suggests startling possibilities. If you find the math hard, a face-in-the-clouds illustration works too

My last article explaining how our minds are bigger than the universe might leave a reader with many questions, such as:

● Will I need a bigger hat?
● Does the universe revolve around me?
● How will I fit through the door?

I won’t be covering any of those questions today.

Instead, I’ll answer another question that was posed: What is the relationship between information, entropy, and probability?

It’s a very good and important question because there is a deep relationship among these three concepts. Understanding that relationship helps us see why our ability to write has anything to do with the size of our mind.

First, let’s get some definitions out of the way.

Probability is a concept that most of us are familiar with from school. It tells us, when a certain event occurs N times, what portion of those occurrences we can expect to turn out a certain way. For instance, if I roll a six-sided die 60 times, then we expect each side to show up, on average, about 10 times.
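
If you want to see that average emerge for yourself, here is a quick Python sketch (the exact counts wobble from run to run, but each face hovers around 10):

```python
import random
from collections import Counter

# Roll a fair six-sided die 60 times and count how often each face appears.
random.seed(0)
rolls = [random.randint(1, 6) for _ in range(60)]
counts = Counter(rolls)

for face in range(1, 7):
    print(f"Face {face}: {counts[face]} rolls")   # each face lands near 10 on average
```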

Entropy is calculated from probability. We take the logarithm of the probability of each possible outcome, then multiply each logarithm by that same probability and add up the results, giving the expected logarithm (with a sign flip so the number comes out positive). At this point, the connection to information becomes clearer.

By taking the logarithm in base 2, we convert the probabilities into bits. Thus, we can calculate that, among N bits of 1s and 0s, a specific sequence of bits has a 1 in 2^N chance of occurring. If we want to generate that sequence randomly, we must generate on the order of 2^N bits before we can expect the sequence to occur once.
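
Here is a rough sketch of both points in Python, using the standard Shannon formula (entropy as the expected value of the negative base-2 logarithm of the probability):

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy in bits: the expected value of -log2(p)."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin (one random bit) carries 1 bit of entropy per flip.
print(entropy_bits([0.5, 0.5]))          # 1.0

# A specific sequence of N fair bits has probability 1 / 2**N,
# which works out to N bits of surprise.
N = 10
print(-math.log2(1 / 2 ** N))            # 10.0
```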

But entropy is not quite the same thing as what we call information. Measured simply as a string of characters, this article has the same entropy as gibberish letters of the same length, yet the article contains information and the gibberish does not.

What sets the letters in the article apart from gibberish is that the letters correspond to something. These sentences you are reading are related to meaning in your mind, unlike the gibberish letters.

The correspondence between the letters and your mind is called mutual information. It is calculated similarly to entropy, using the expected logarithm of two probability distributions instead of one. The upper limit of the mutual information between two distributions is the entropy of whichever distribution has the smaller entropy. Like entropy, mutual information corresponds to a probability: for N bits of mutual information, there is only a 1 in 2^N chance that the two distributions would generate the exact same sequence if they were not related. Let that sink in for a moment.

This means that if N is very big, then the two distributions are not related by chance or accident, but by purpose.
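
To make the idea concrete, here is a rough toy illustration in Python. It estimates the mutual information per symbol between two bit sequences, one pair unrelated and one pair where the second sequence mostly copies the first; multiplying the per-symbol figure by the sequence length gives the total N bits discussed above.

```python
import math
import random
from collections import Counter

def mutual_information_bits(xs, ys):
    """Empirical mutual information, in bits per symbol, between two sequences."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    total = 0.0
    for (x, y), count in joint.items():
        p_xy = count / n
        total += p_xy * math.log2(p_xy / ((px[x] / n) * (py[y] / n)))
    return total

random.seed(0)
a = [random.randint(0, 1) for _ in range(100_000)]
b = [random.randint(0, 1) for _ in range(100_000)]              # unrelated to a
c = [bit if random.random() < 0.95 else 1 - bit for bit in a]   # mostly copies a

print(round(mutual_information_bits(a, b), 3))   # close to 0: no correspondence
print(round(mutual_information_bits(a, c), 3))   # roughly 0.7 bits per symbol
```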

Faces in the Clouds vs. Numbers in the Clouds

If math is not your best subject, think about it this way. Imagine seeing a face in the clouds. It happens all the time. We don’t think anyone purposefully put a face up there. The clouds randomly form faces frequently because the pattern that registers as a face is very easy for the wind to create. So there is only a small amount of mutual information between the clouds and what we consider a face; N is small, and the probability of 1 in 2^N is quite large.

Now imagine someone wrote the winning lottery numbers in the clouds before they were drawn. The mutual information between the clouds and the winning numbers is enormous, and a chance match of 1 in 2^N is practically impossible. So we infer that the writing — however it got there — is purposeful.

[Image: numbers formed by clouds in a blue sky]

Suppose we want to assume, conversely, that the clouds generated the winning lottery numbers randomly. We would need randomly formed clouds 2^N times in order to expect the numbers to be generated once. Clouds take up space, and forming 2^N clouds takes up a lot of space!

Mutual information is the basis of electronic communication. We can reliably transmit information from point A to point B because clever algorithms ensure that there is a large amount of mutual information between the two ends.
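
As a toy illustration of the principle, here is a crude repetition code in Python, far simpler than the codes real systems use. Redundancy is what keeps the mutual information between sender and receiver high even when the channel is noisy:

```python
import random

def transmit(bits, flip_probability=0.1):
    """Simulate a noisy channel that flips each bit with some probability."""
    return [bit ^ (random.random() < flip_probability) for bit in bits]

def encode(bits, repeat=5):
    """Repetition code: send each bit five times."""
    return [bit for bit in bits for _ in range(repeat)]

def decode(received, repeat=5):
    """Majority vote over each group of repeated bits."""
    return [int(sum(received[i:i + repeat]) > repeat // 2)
            for i in range(0, len(received), repeat)]

random.seed(1)
message = [random.randint(0, 1) for _ in range(1000)]

raw_errors = sum(m != r for m, r in zip(message, transmit(message)))
coded_errors = sum(m != r for m, r in zip(message, decode(transmit(encode(message)))))
print(raw_errors, coded_errors)   # the coded version arrives with far fewer errors
```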

At this point, the relationship to my earlier observation that your mind is bigger than the universe, in information terms, becomes clear. When we write out text, there is a lot of mutual information involved. To have generated the text randomly would require on the order of 2^N characters, so for modest values of N, such as 1,000, we quickly exceed how many characters can be generated within the confines of our universe. Yet we humans write gobs and gobs of text, which means that N is a very large number. Thus the number of trials needed to generate the text randomly must be exponentially large, larger than the size of our universe can provide.
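
For a sense of scale, here is a rough back-of-the-envelope comparison, assuming the commonly cited estimate of about 10^80 atoms in the observable universe:

```python
# Rough scale check. The 10**80 figure is a commonly cited estimate of the
# number of atoms in the observable universe, used here only for comparison.
N = 1000
trials_needed = 2 ** N           # expected random attempts before one hit
atoms_in_universe = 10 ** 80

print(len(str(trials_needed)))          # 302 digits: about 10**301 attempts
print(len(str(atoms_in_universe)))      # 81 digits: about 10**80 atoms
```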

But what if…

Yet there is an objection to this line of thought. In our analysis, each trial is assumed to be independent of all the others: no roll of the die depends on earlier rolls. However, if each trial depends on the previous one instead, the calculation changes. Once trials become dependent, the entropy decreases, and the probability of occurrence increases.

This seems like a way out of the idea of purpose in the universe. We can appeal to the idea that events in the universe are all based on what came before. Life is not a random dice roll, but the result of many previous occurrences. That assumption dramatically lowers the entropy. Consequently, the text we generate not only becomes possible but quite probable.
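
Here is a rough sketch of that effect: a "sticky" chain of bits, in which each bit usually repeats the previous one, has a much lower entropy rate than independent fair coin flips.

```python
import math

def binary_entropy(p):
    """Entropy in bits of a single binary outcome with probability p."""
    if p in (0, 1):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Independent trials: a fair coin gives 1 bit of entropy per flip.
print(binary_entropy(0.5))     # 1.0

# Dependent trials: a "sticky" chain that repeats the previous bit 90% of the time.
# Its entropy rate is just the entropy of the 10% switch decision.
print(binary_entropy(0.1))     # about 0.469 bits per symbol, a much lower entropy
```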

However, dependence is not a get-out-of-jail-free card. Lowering entropy means that the typical outcomes become more probable. But that doesn’t mean that the event of interest becomes more probable. In fact, lowering entropy can have the opposite effect of making the event of interest impossible.
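
In the extreme case, a fully dependent process has zero entropy but can only ever produce one output, so a prespecified target that differs from that output never occurs. A toy sketch:

```python
def fully_dependent_sequence(first_bit, length):
    """Zero-entropy process: every bit simply repeats the first one."""
    return [first_bit] * length

print(fully_dependent_sequence(1, 8))   # [1, 1, 1, 1, 1, 1, 1, 1]
# A target "event of interest" such as 01010101 can never occur, no matter how
# many trials we run. Low entropy made it impossible, not more probable.
```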

So, dependence only helps if it skews things toward the event of interest. To make that happen requires an even greater infusion of information than was required originally. That implies that our minds are not just bigger than the universe, but also bigger than all the multiverses put together. Now that’s definitely going to require a bigger hat!

You may also wish to read: Is your mind bigger than the universe? Well, look at it this way… Surprisingly, there is a way to measure the mind that shows it IS bigger than the universe — information. To generate a modest piece of text like Lincoln’s Gettysburg Address, the storage capacity exceeds that of the entire history of the physical multiverse.


Eric Holloway

Senior Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Eric Holloway is a Senior Fellow with the Walter Bradley Center for Natural & Artificial Intelligence, and holds a PhD in Electrical & Computer Engineering from Baylor University. A Captain in the United States Air Force, he served in the US and Afghanistan. He is the co-editor of Naturalism and Its Alternatives in Scientific Methodologies.
