[net.physics] information and entropy

lew (10/09/82)

If we imagine a bit stream embodied as a set of physical systems,
each with two states separated by some energy difference, the
informational and physical entropies have essentially the same
definition: basically, the "log of the available states". The
application of the concept is quite different in each case, though.
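
To make the common definition concrete, here's a toy sketch of my own
(arbitrary numbers, and ignoring how the energy difference weights the
populations): for N such two-state systems there are 2^N available
states, and both entropies are just logarithms of that count.

    from math import log

    k_B = 1.380649e-23               # Boltzmann's constant, J/K
    N = 8                            # number of two-state systems (arbitrary)
    omega = 2 ** N                   # count of available states

    S_physical = k_B * log(omega)    # physical entropy, J/K (= N k_B ln 2)
    H_informational = log(omega, 2)  # informational entropy, bits (= N)

    print(S_physical, H_informational)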

For the physical entropy to be meaningful, the systems must be in
thermal equilibrium. This means that the ensemble of systems
(called the microcanonical ensemble in this approach) must have an
equal probability of being in any one of the available states. This
is a fundamental assumption, which lies close behind all the conclusions
of statistical physics.
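
As a quick numerical check of what that equal-probability assumption
buys (my own sketch), the Gibbs/Shannon sum -SUM p ln p over a fixed
set of states is largest exactly when the probabilities are equal,
which is what reduces the entropy to a simple "log of the available
states":

    from math import log

    def gibbs_entropy(ps):
        # Gibbs/Shannon entropy in nats; assumes the ps sum to 1.
        return -sum(p * log(p) for p in ps if p > 0)

    uniform = [0.25] * 4                # the equal-probability assumption
    skewed = [0.70, 0.10, 0.10, 0.10]   # same states, unequal probabilities

    print(gibbs_entropy(uniform))       # ln 4 = 1.386..., the maximum
    print(gibbs_entropy(skewed))        # about 0.940, strictly smaller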

For the informational entropy to be meaningful, the implicit assumption
is that some agency has the freedom to choose which state to place each
system in. Each such assignment of states is a message. The receiver of
the message has an equal a priori expectation of receiving any one of
them, so the "available states" are interpreted as "possible messages".

IT IS MEANINGLESS TO SPEAK OF THE ENTROPY OF A PARTICULAR MESSAGE.
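
In concrete terms (my own sketch again): the entropy is a function of
the whole ensemble, i.e. of the probability distribution over possible
messages, so the only thing there is to compute is a property of that
distribution.

    from math import log

    def ensemble_entropy(n_messages):
        # Entropy in bits of an ensemble of n equally likely messages.
        return log(n_messages, 2)

    print(ensemble_entropy(2 ** 3))   # 3-bit messages: 8 possible, 3.0 bits

    # A particular received message, say "101", is a single outcome; by
    # itself it defines no probability distribution and hence no entropy.
    # At best one can quote -log2 p("101") = 3 bits, but that is its
    # "surprisal", a different quantity from an entropy.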

I feel that this mistake is made by Lila Gatlin in "Information Theory
and the Living System". I recommend reading it anyway; I'm having quite
a time trying to decide whether the author is all wet or whether I'm just
being thick (a distinct possibility). Maybe someone can help me on this.

Some other thoughts:

Sometimes the entropy-information connection is formed through Maxwell's
demon, who operates a tiny shutter letting faster-than-average atoms
of a cold gas into a hot gas and vice versa, thus apparently violating
the Second Law of Thermodynamics. The question is: "Must the demon
necessarily create more entropy in his decision-making process than he
destroys by his operations?" Richard Feynman makes a convincing argument
for a "yes" answer with his analysis of a "demon" consisting of a ratchet
and pawl attached to a paddle wheel. This is in Vol. I of The Feynman
Lectures on Physics. The point is that any realization of the demon has to
interact with the gases.
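
To put a rough number on what the demon destroys (a back-of-the-envelope
sketch of my own, not Feynman's calculation): passing a molecule of
energy E from the cold gas at T_cold to the hot gas at T_hot changes
the entropy of the two gases by E/T_hot - E/T_cold, which is negative.

    k_B = 1.380649e-23                  # Boltzmann's constant, J/K

    T_hot, T_cold = 400.0, 300.0        # reservoir temperatures, K (arbitrary)
    E = 1.5 * k_B * T_cold              # energy of one passed molecule, J

    # Cold gas loses E at T_cold, hot gas gains E at T_hot:
    dS_gases = E / T_hot - E / T_cold   # J/K, negative

    print(dS_gases)   # the deficit any physical demon must make up
                      # (and then some) in working the shutter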

Ilya Prigogine in "From Being to Becoming" talks about the "order out
of chaos" question. He introduces the concept of "dissipative structures"
which are systems driven far from equilibrium by a strong energy flow.
(A Voyager photo of Jupiter's swirling bands is on the cover.) Entropy
is only definable near equilibrium (Being), so with systems
driven far from equilibrium we are faced with new rules (Becoming).
Note that TTL logic uses dynamic, entropy-producing states (current sinks)
to define "bits". So you'd better think long and hard before making
any links between information processing and equilibrium thermodynamics.

Lew Mammel, Jr. ihuxv!lew