[net.physics] Concerning Physical versus Informational Entropy

gjphw (10/09/82)

   Before attempting to tackle the question about the information content of
the Universe, allow me to set up the background needed for understanding
entropy.
   The entropy that appears in the physical realm, as used in classical
thermodynamics and statistical mechanics, is not the same property as the one
labeled entropy in information theory.  The two derive from quite different
bases.  In thermodynamics, entropy is an empirical (experimentally derived)
quantity that is needed to explain the behavior of ideal gases.  It is
associated with the heat capacity and temperature of the substance under
consideration.  Statistical mechanics, which attempts to explain the results
of thermodynamics using dynamics (Newtonian physics, special relativity, etc.),
first introduces the concept of a distribution; here, the distribution is one
of molecules (the existence of which provoked a most acrimonious debate at the
turn of the century).  Entropy enters as a measure of this distribution of
molecules.
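   For those who want the standard textbook forms behind these words, here is
my shorthand summary (the equations are standard; the notation is mine, not
part of the argument above):

```latex
% Thermodynamic entropy: an empirical state function defined through
% reversible heat exchange, tied to the heat capacity of the substance.
\[
  dS = \frac{\delta Q_{\mathrm{rev}}}{T},
  \qquad
  S(T_2) - S(T_1) = \int_{T_1}^{T_2} \frac{C_V(T)}{T}\, dT
  \quad \text{(constant volume)} .
\]

% Statistical-mechanical entropy: a measure of the distribution of
% molecules over microstates.
\[
  S = k_B \ln W ,
  \qquad
  S = -k_B \sum_i p_i \ln p_i
  \quad \text{(general distribution } p_i\text{)} .
\]
```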
   Entropy in the information realm is again a measure of a distribution, but
in this case it is a distribution of "bits" (the basic unit of information).
Bits have no physical reality, though their widespread use may yet earn them an
ontological (or metaphysical) status.  At worst, bits are mathematical
constructs designed to measure the amount of information.
   In both statistical mechanics and information theory, entropy is a measure
of a distribution.  Their mathematical forms (equations) are almost identical.
But the two entropies are not the same thing (quantity).  One is the entropy of
the distribution of physical molecules; the other, the entropy of the
distribution of "metaphysical" bits.
   A second requirement for the appreciation of thermodynamics and statistical
mechanics is the principle of thermal equilibrium.  Equilibrium is a condition
where the temperature is uniform within the system under study (temperature is
another empirical quantity defined only within the realm of thermodynamics).
Classical thermodynamics and statistical mechanics are only applicable at, or
very near, thermal equilibrium.  Outside these conditions, entropy, which
depends upon the prerequisites of thermodynamics, is not well defined.  And,
fortunately for us all, our immediate environment and most of
the known universe fail the thermal equilibrium requirement (the planet is not
always at one temperature, space has isolated hot spots and vast expanses of
very cold volume, etc.).
   Your difficulty with entropy and gaining new knowledge is a problem of
appreciating the physics, not the mathematics.  Philosophers may have a field
day with associating information theory and thermodynamics through their
common definition of entropy.  The two entropies are quite different, and
the entropy defined in physics is difficult to quantify in the absence of the
appropriate conditions for thermodynamics (thermal equilibrium).  With the
understanding presented here, your questions concerning information and the
entropy of the universe have no meaning.
   Nevertheless, if I press on, I am not sure of the significance of new
information.  Is the universe devoid of information unless it is perceived
(does a tree falling in the forest make a sound)?  Is the only measure of
information recorded in our textbooks?  What about other civilizations?
What happened before the appearance of "Homo sapiens sapiens"?  The discovery
of new "information", or the refinement of a physical law, in no way affects
the thermodynamic entropy of the universe.  This discovery certainly reduces
information entropy, making information theory quite anthropocentric.  I do
not know of any way to store information in such a way as to overcome physical
entropy.  Notice too that binary storage is the least efficient means for
storing information; binary is just the fastest way we humans currently have
for manipulating information.
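   One way of reading that "least efficient" remark (again my own illustration,
not a claim made above): a digit in radix b carries log2(b) bits, so a binary
digit carries the least information per symbol of any usable radix, and binary
representations of a number are correspondingly the longest.  A small sketch:

```python
import math

def bits_per_digit(radix):
    """Information carried by a single digit in the given radix, in bits."""
    return math.log2(radix)

def digits_needed(n, radix):
    """Exact count of radix digits needed to write the positive integer n."""
    count = 0
    while n:
        n //= radix
        count += 1
    return count

n = 1_000_000
for radix in (2, 8, 10, 16):
    print(radix, round(bits_per_digit(radix), 2), digits_needed(n, radix))
# Radix 2 carries only 1 bit per digit and needs the most digits (20 here);
# larger radices pack more information into every stored symbol.
```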
   'Nough said.