Is Information Physical? It Depends On What You Mean by Physical…

Information makes things happen but, curiously, it erases its own history

University of Pittsburgh physics professor David Snoke has thought a lot about the relationship between information and physical reality. For example, why does a zip drive full of critical information, information that will cause many changes when people read it, weigh no more than an empty one?

Here’s an excerpt from a lecture he gave (podcast) in 2015 on whether information is physical. Although the talk was intended for a group of scientists, it is lay-friendly and enjoyable:

People say, “Well, information is not a real thing,” or “It’s only between humans” or something like that. That’s not the way physicists typically talk. So I want to connect you to some of the work that’s been done over several decades in thermodynamics.

I realize that thermodynamics is probably not the most exciting course you ever took. It tends to be taught in a tedious way but, actually, it has seen some of the hottest controversies. Boltzmann, one of the founders of the field, committed suicide because his beliefs were not accepted.

It has to do with enormous things that relate to us all, such as: Why is there even time? Why do we have an experience of time? Why do we have an experience of memory? And so on.

So let me start with this question: Is information a physical thing? That gets to something that in physics we call extensive quantities. A lot of times, people have great difficulty with information. They think of it like “Is it a thing I can hold in my hand, like an apple or a rock?” “Well, no, then it must not be a physical thing. It must be in the spirit world.” That’s not quite the way that we work in physics.

In general, extensive quantities can only be defined for a large ensemble. They, in some sense, have no meaning for small ensembles. We really define them only for a large-scale thing. So “heat” is one; I’ll come back to that. “Entropy” is another, and so is free energy. These are all things that are defined for a whole system.

So in some sense, you could say that they’re relational, that they’re defined only for the system and the collective property of the whole. It’s really a description of the relationship of the parts, which can certainly be a real physical thing.

Another thing about them is that they’re proportional to the size of the system. You can have more or less heat. You can flow heat from here to there, and so on. The same is true of entropy and free energy.

Entropy: Houses do not tidy themselves

One that you’re probably familiar with is heat. Heat is mathematically defined this way: the sum of the energies of the particles’ motions relative to the center of mass. Notice the relational nature of it. It’s related to what all the other particles are doing.
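
[In symbols, a standard textbook version of that definition, not spelled out in the talk itself, is the total kinetic energy of the particles measured relative to the center-of-mass velocity:

$$ E_{\text{thermal}} = \sum_i \frac{1}{2} m_i \left| \mathbf{v}_i - \mathbf{v}_{\text{cm}} \right|^2 $$

Each particle’s contribution is defined relative to the motion of the system as a whole, which is the relational character Snoke is pointing to.]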

But the thing is, even though we might define the mathematics this way, heat is a real thing, right? You have heat sensors in your fingers. You can detect heat; you know that it’s real. So even though we might say it’s only a property of a large number of particles, it’s still very real and tangible. It’s something that you can feel, literally, with your fingers.

On its own, it could only get messier…

Now, entropy in physics is thought of in just the same way. You can define entropy mathematically as the log of the total number of equivalent states. That is going to be proportional to the size of the system because the bigger the system, the more states you have, in general. “Equivalent” is defined only macroscopically (for the whole system)…
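
[In symbols, this is Boltzmann’s famous formula, where Ω counts the equivalent microscopic states and the constant k_B just sets the units:

$$ S = k_B \ln \Omega $$

Since Ω multiplies as you enlarge the system, its logarithm adds, which is why entropy is an extensive quantity.]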

In general, though, entropy is not conserved. It can get larger. A lot of people have difficulty visualizing what entropy is. Entropy can be thought of as disorder; it’s a measure of the disorder in the system. We have the second law of thermodynamics: disorder always increases. You can say your house gets spontaneously dirty. You don’t need to work to get it dirty, but you need to work to get it clean. So things do not spontaneously get more ordered. Mathematically, though, entropy just counts the number of equivalent states.
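
[In symbols, the second law says that for an isolated system the total entropy can only stay the same or grow:

$$ \Delta S \geq 0 $$

Work, like tidying the house, can lower entropy locally, but only by raising it at least as much somewhere else.]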

The reason we associate that number of equivalent states with disorder is that, in general, things that are disorderly can be rearranged and remain disorderly. So if I have a junk pile, I could rearrange the junk pile, and I could rearrange it in many ways, and it would still be a junk pile. Whereas if I took your brain and rearranged it in many ways, it would no longer be a working brain, right? So things that can be in a large number of equivalent states are assumed to be very disorderly.

Actually, I’m not going to talk about this at all, but you could give a whole talk on the arrow of time. The fact that we have the second law implies that we have an experience of time going forward. If I made a movie in which I drop an egg on the carpet and it breaks, and then I showed you that movie, you could tell which way the arrow of time ran. Because eggs don’t spontaneously gather themselves together and jump off the carpet. So you know which way time is going.

Reversed film can appear to do what nature, left to itself, does not: reverse entropy.

The importance of contingency to information

Now let’s talk about information. A lot of work has been done on the connections between information and entropy, going back to the early 20th century. There’s a natural connection between entropy and information. Shannon information can be defined as the log of the number of possible states that the existing state was chosen from. That looks a lot like our definition of entropy, right? Our entropy definition was the log of the number of equivalent states. Here we’ve got the log of a number of possible states. So there’s a similarity, but they’re not identical. They’re not the same thing. And it’s not adequate to simply say that information is the negative of entropy; we have to do a little more work. Here’s an example of this:

We’ve heard about bits, of course. If I have a memory bank with eight bits, it has 256 possible states. If there’s nothing demanding them to be in one state or another, then they’re all equivalent. We don’t care which state it’s in. So we would say there are 256 possibilities. If I take the log base two, I get the number eight. That’s my Shannon information: it has eight bits of information.
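
[Written out, the arithmetic is simply

$$ H = \log_2 N = \log_2 2^8 = \log_2 256 = 8 \text{ bits.} $$

Set beside Boltzmann’s S = k_B ln Ω, the family resemblance is plain: both are logarithms of a count of states, and both grow with the size of the system.]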

Here’s a point that I want to stress: In order for something to be a good information-holding system, it has to have what’s called contingency, which is to say that no force determines one state over another.

Information is created by reducing the possibilities.

If there’s only one possible state, then there’s no information value in that. But if I say this is one of 256 possibilities, you could view information as a reduction of possibilities. You’re saying that we started with 256 possibilities and now we’re picking just one of them. It is only valid to say I chose it from those possibilities if it was actually possible to have any of them. If physics demanded that it be just one of them, there would be no information quantity. You’d say, “I just know what’s going to happen.” There’s no way around it.
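
[One way to make “reduction of possibilities” concrete is a few lines of code. This is a minimal sketch of our own, not Snoke’s; the function name and numbers are illustrative:]

```python
import math

def bits_gained(possibilities_before: int, possibilities_after: int) -> float:
    """Information gained, in bits, when the set of possibilities shrinks:
    the log (base 2) of the reduction factor."""
    return math.log2(possibilities_before / possibilities_after)

# An 8-bit memory bank starts with 2**8 = 256 equally possible states.
# Learning its exact contents narrows 256 possibilities down to 1:
print(bits_gained(256, 1))  # 8.0 bits

# If physics forced a single state (no contingency), nothing is learned:
print(bits_gained(1, 1))    # 0.0 bits
```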

So my first point here is that information, like entropy, is a physical thing. It’s an extensive property…

As I said, good information systems look like this. You’ve got a lot of equivalent positions; you’ve got all kinds of places to store bits. This is a nanoscale view of what’s inside the memory chips in your computers. The same is true of DNA. DNA makes a good information store because if I were to change it to a different code, it wouldn’t be energetically any more costly. I can have any number of codes represented, with no energy cost difference between them.

Information as erasing its own history

Essentially by definition, good information-carrying systems are intrinsically history-erasing systems. Let me expand on this. Deterministic processes favor one state over another and remove contingency. Deterministic processes say only this outcome is possible, and the other ones are not. So a deterministic process forces things into a certain state.

[Dr. Snoke goes on to talk about the difficulty that information systems create when we want to infer design in a system. Information erases its history, so if you don’t know the history, you can only infer the design. But that doesn’t show that there is no design.]

For example, if you are given a book and you read it, you will not be able to deduce the order in which the chapters were written. You might assume the author wrote them in order, but they might not have. They might have started out with a really good idea for Chapter 5, then gone back and said, “I need to write Chapter 1 to introduce this.” I write papers like this all the time: I write the conclusion first, then I go back and write the other parts, and so on. Or I’ll describe the figures, then I’ll generate the figures.

There is, almost by definition, no way to deduce from a set of information how it was generated, because the information-carrying medium erases all history. Because it’s non-deterministic. Because it allows for contingency. The very fact that it allows many states as possibilities means that it doesn’t carry a history of how it was generated. In general, that history is not something you get just from the physical aspects of the book itself.


You may also wish to read: Information theory: Evolution as the transfer of information. Information follows different rules from matter and energy, which might change the way we see evolution. A pair of researchers have introduced an Information Continuum Model of Evolution (ICM), which takes into account that information is immaterial.

