
They Say This Is An Information Economy. So What Is Information?

How, exactly, is an article in the news different from a random string of letters and punctuation marks?

We know information when we see it. An article contains information. A photograph contains information. The thoughts in our mind contain information. So does a computer program and so do our genomes.

Yet other things we see around us clearly do not contain information. A handful of toothpicks dropped on the ground does not. Nor do the swirling tea leaves in a cup. Neither does a pair of tossed dice nor a sequence of 100 coin flips. But disorder is not the distinguishing mark: an intricate, highly ordered snowflake does not contain information either.

Can we state the difference between the article and the scattered toothpicks precisely? That’s tricky. Both Claude Shannon and Andrey Kolmogorov came up with information metrics. But the fundamental problem with both Shannon information and Kolmogorov complexity is that they equate information with complexity.
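To make the problem concrete, here is a minimal Python sketch (my own illustration, not from the original article) that scores a snippet of English against a random string of the same length, using single-character frequencies as a Shannon-style estimate and zlib-compressed length as a crude stand-in for Kolmogorov complexity. The helper names are invented for the example. The random string typically scores at least as high on both measures, even though it tells us nothing.

```python
import math
import random
import string
import zlib

def entropy_per_char(text):
    """Shannon-style estimate: bits per character from single-character frequencies."""
    counts = {}
    for ch in text:
        counts[ch] = counts.get(ch, 0) + 1
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def compressed_size(text):
    """Length after zlib compression, a crude stand-in for Kolmogorov complexity."""
    return len(zlib.compress(text.encode("utf-8")))

article = ("We know information when we see it. An article contains information. "
           "A photograph contains information. The thoughts in our mind contain information.")
noise = "".join(random.choice(string.ascii_lowercase + " .,") for _ in range(len(article)))

print("article:", round(entropy_per_char(article), 2), "bits/char,",
      compressed_size(article), "bytes compressed")
print("noise:  ", round(entropy_per_char(noise), 2), "bits/char,",
      compressed_size(noise), "bytes compressed")
# The random string typically scores at least as high on both measures,
# even though it tells the reader nothing: complexity alone is not information.
```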

This approach makes sense at first because very simple things do not contain information. The regularity of crystals does not contain information. A streak of water down a windowpane does not contain information. The number 1 does not contain information.

Clearly, complexity is a necessary feature of an entity that contains information—but it is not sufficient. The exact disposition of the dropped toothpicks may be very complex but the complexity is not information. What is the missing ingredient that separates the class of uninformative things from the class of informative things?

The answer is staring us right in the face. Let’s look at the word “information” and break it into pieces: in-form-ation. The word is itself telling us the nature of information. It is saying that a thing is informative when it has been formed.

What does it mean to be formed? Let’s think about things that are formed. We can form a lump of clay. We can carve a piece of wood, giving it form. We can cut a piece of paper into a form. In each of these cases, we take raw matter and shape it according to an external pattern. This must be what information means: a raw medium has been shaped by an external pattern.

Let us return to our original two sets of informative and uninformative things and apply our criterion. Consider the article, a piece of information expressed in a medium. The raw matter of an article is letters and punctuation. If we scatter letters and punctuation randomly across a page, without applying an external pattern, we get something patternless and uninformative. If instead we arrange those letters and punctuation marks to express our thoughts (an external, specified pattern), the arrangement suddenly becomes informative to a reader. It looks as though our criterion works.
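The same kind of sketch makes the letters-on-a-page example concrete. Shuffling the characters of a sentence leaves the single-character frequencies, and hence the frequency-based Shannon estimate, essentially unchanged, yet it destroys the external pattern that made the sentence readable. Again, this is my own illustration with an invented entropy helper, not code from the article.

```python
import math
import random

def entropy_per_char(text):
    """Shannon-style estimate: bits per character from single-character frequencies."""
    counts = {}
    for ch in text:
        counts[ch] = counts.get(ch, 0) + 1
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

sentence = "Arrange letters and punctuation to express a thought and the page informs a reader."
chars = list(sentence)
random.shuffle(chars)            # same characters, external pattern destroyed
scrambled = "".join(chars)

print(scrambled)
print(round(entropy_per_char(sentence), 4), round(entropy_per_char(scrambled), 4))
# The two estimates match (up to floating-point rounding): a frequency-based measure
# cannot tell the meaningful arrangement from the scrambled one.
```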

Let’s try again. Consider the toothpicks. If we drop them on the ground and let them scatter randomly, they will have no shape and will thus be uninformative. On the other hand, we can carefully place the toothpicks to spell out a message, which is an external, specified pattern, and then the toothpicks become informative. Success again!

So we can now begin to understand what information is: information is produced when raw material is arranged according to an external pattern. An information economy, then, depends not so much on accumulating raw material as on shaping that material into information.


You may also enjoy these articles on information by Eric Holloway:

How can we measure meaningful information? Neither randomness nor order alone creates meaning. So how can we identify communications?

and

Does information theory support design in nature? William Dembski makes a convincing case, using accepted information theory principles relevant to computer science.


Eric Holloway

Senior Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Eric Holloway is a Senior Fellow with the Walter Bradley Center for Natural & Artificial Intelligence, and holds a PhD in Electrical & Computer Engineering from Baylor University. A Captain in the United States Air Force, he served in the US and Afghanistan. He is the co-editor of Naturalism and Its Alternatives in Scientific Methodologies.
