Entropy and Information

Recently I had a long discussion on the everything list about entropy and information, where I defended the view that the thermodynamic entropy and information are not related. A summary of the discussion is on my blog:

http://blog.rudnyi.ru/2012/02/entropy-and-information.html

Right now there is a follow-up discussion on the embryophysics list, where the main topic has shifted to information in biology:

http://groups.google.com/group/embryophysics/t/419d3c1fec30e3b5

I also had a private discussion with Vasily Ogryzko; it is published below.

14.02.2012 18:47 Vasily Ogryzko:

I looked briefly at the discussion. Could the problem be the following: if we consider the notion of entropy purely thermodynamically, apart from its statistical-mechanical explanation, then indeed no concept of information is needed. However, as soon as we try to justify and explain it via statistical-mechanical concepts, the notion of probability comes into play, and then the idea of information, as something that can affect our estimate of the probability of a macrostate, could become relevant.

16.02.2012 07:24 Evgenii Rudnyi:

I find such logic a bit strange. In my view, it is best to consider this problem from the viewpoint of experimental science. Otherwise it seems that either we abuse the language, or the language abuses us.

So, if we look at this problem as something that can be measured, then we have information in informatics (measured in bytes) and the thermodynamic entropy. One can formally convert the latter to bytes, but the numerical values will still be completely different. Hence, I am not sure that I understand your point.
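
For concreteness, a minimal sketch of this formal conversion in Python (my illustration: the conversion factor k_B ln 2 is standard, and the standard molar entropy of liquid water, about 70 J/(mol K), is used only as an example value):

    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def entropy_to_bits(s_joule_per_kelvin):
        """Formally express a thermodynamic entropy S (J/K) in bits: S / (k_B ln 2)."""
        return s_joule_per_kelvin / (K_B * math.log(2))

    # Standard molar entropy of liquid water near 298 K, roughly 70 J/(mol K).
    bits = entropy_to_bits(70.0)
    print(f"{bits:.2e} bits, i.e. {bits / 8:.2e} bytes per mole")
    # About 7e24 bits per mole: formally expressible in bytes, but
    # numerically nothing like the quantities one meets in informatics.

The point of the sketch is the scale mismatch, not the conversion itself.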

16.02.2012 12:39 Vasily Ogryzko:

I am not sure that there is only one way to measure information. I think that our understanding of the concept of information is at a stage somewhat similar to the early days of classical mechanics, when people tried to measure the ‘amount of mechanical motion’ and eventually came up with two different measures: ‘kinetic energy’ and ‘momentum’.
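
For reference, the two measures in question (standard definitions, added here only for clarity):

    E_k = \tfrac{1}{2} m v^2 \quad \text{(kinetic energy)}, \qquad p = m v \quad \text{(momentum)}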

So the point might be that we are still struggling with the notion of ‘information’ and its different aspects, and still looking for more rigorous and better-definable concepts related to it. Eventually we may have to accept that there are different ways to use this notion and to measure ‘information content’.

In this respect, physics is now moving towards the nano-scale, where not all the usual concepts of physics apply. This will require more conceptual work, the development of new concepts, and an attempt to understand how they relate to each other. From this point of view, I think it is very useful that there is a connection between the notions of entropy and information. In other words, you might be right that, at the moment, as long as we are dealing with ‘classical’ experimental systems, we do not need to relate entropy and information; but when we move to the nano-scale, this connection might prove very helpful. Maxwell’s demon is an example, I think.

16.02.2012 18:44 Evgenii Rudnyi:

Two questions.

1) Does a theory relating the thermodynamic entropy and information still have to be developed, or has it already been developed?

2) Does the information related to the thermodynamic entropy have anything to do with information in informatics? Should the thermodynamic entropy then explain the numerical values of information in informatics?
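
For reference, the formal parallel behind the second question is the textbook correspondence between the Gibbs and Shannon entropies (a standard identity, added here for clarity rather than taken from the exchange):

    S = -k_B \sum_i p_i \ln p_i \quad \text{(Gibbs)}, \qquad H = -\sum_i p_i \log_2 p_i \quad \text{(Shannon, in bits)}, \qquad S = (k_B \ln 2)\, H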

I personally believe that something goes wrong with the definitions. I would expect a definition of a physical quantity to be such that one could make measurements. Yet with information this does not seem to work. Interestingly enough, no one cares about this.

I have never understood the problem with Maxwell’s demon. Why is it not enough to say that it does not exist? Why, for example, does Maxwell’s demon capture your imagination while the idea of God does not?

17.02.2012 10:58 Vasily Ogryzko:

I am not sure that I can give you satisfactory answers.

I think that the answer to your first question depends on the level of rigor that you expect :-)

To answer your second question: I think that we are at a very early stage of understanding the concept of ‘information’ and its role in physics. But in any case I feel that there is a connection there worth exploring. And biology might be very useful here because, in a sense, most of what molecular biology studies is information processing. Moreover, this is information processing at the molecular level, where the issues of thermal noise become important.

In the same vein, I think the problem of Maxwell’s demon will become useful in what I called ‘systems nano-biology’ in my Second Life seminar. The idea here is that not only might we (scientists, humans) be considered as observers; living things (e.g., cells) could be considered as observers too. Most interestingly, given the differences in the recognition capabilities of different observers, the notion of relevant information will differ from observer to observer, and this should be taken into account in any generalization of thermodynamics. So, when science is ready to tackle the nano-scale, the notion of Maxwell’s demon could become very useful.
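
For context, the usual modern treatment of Maxwell’s demon invokes Landauer’s principle (a standard result, added here as background rather than taken from the correspondence): erasing one bit of the demon’s memory at temperature T dissipates at least

    E \ge k_B T \ln 2

so the demon’s information processing is charged against the overall entropy balance.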

17.02.2012 13:58 Dr. Richard Gordon:

I’ve never actually found a use for the concept of information in my work, so I have not delved deeply into it. Two publications you might want to look into are:

Frieden, B.R. (1999). Physics from Fisher Information: A Unification. Cambridge: Cambridge University Press.

Grössing, G. (2000). Quantum Cybernetics: Towards a Unification of Relativity and Quantum Theory via Circularly Causal Modeling. New York: Springer Verlag.

At the nanoscale, concepts in thermodynamics are altered. See:

Hill, T.L. (1994). Thermodynamics of Small Systems. New York: Dover.

by my PhD mentor, Terrell Hill.


Comments

  1. Guy Lukes says:

    You should look at Terrence W. Deacon. His latest book clarifies these issues in the context of work and constraint.

    See: Incomplete Nature: How Mind Emerged from Matter

  2. Thanks for the link. I found a review in The Wall Street Journal:

    http://online.wsj.com/article/SB10001424052970204618704576642991109496396.html

    Do you know how Terrence W. Deacon defines information?

  3. I have found his definition of information:

    http://www.teleodynamics.com/wp-content/PDF/ShannonBoltzmannDarwinPt1.pdf

    I guess that Terrence W. Deacon missed a class in experimental thermodynamics. If you like his book, please try to find the information in my examples from thermodynamics:

    http://blog.rudnyi.ru/2012/02/entropy-and-information.html

    It is easy to say that the thermodynamic entropy is information. It is not that easy, though, to find the information in typical thermodynamics problems.