[net.origins] Now more than ever. PART II

hua@cmu-cs-edu1.ARPA (Ernest Hua) (05/18/85)

______________________________________________________________________

> { from: miller@uiucdcsb.Uiuc.ARPA (A Ray Miller) }
> ...
>
> Sorry, but thermodynamic models have been shown to hold in other
> areas as well.  One of the fields which can make use of this know-
> ledge is information theory.  It is here that we can see the the-
> oretical grounding for evolution crumble.  For example, Dr. Ian
> McDowell, an information engineer, wrote:
>
>      "Communication engineers faced with the problem of coding
> and transmitting a maximum of information on a given channel have
> defined quantitatively the information content of a message.  The
> amount of information to be supplied to transmit any given message
> using symbol x where the probability of any symbol occurring is
> P(x) = H(x) = SIGMA P(x) . log2 P(x) which is the negative of the
> usual entropy formula of thermodynamics.  This represents a defin-
> ite relationship, and it has been found that the equivalence be-
> tween entropy in thermodynamics and information in a binary message
> code is given by the equation:
>
>    1 nit [unit of information] = 1.37 * 10^-16 erg / degree C.
>
>      The degree of order [nonrandomness] in a closed system may
> be described uniquely, and this description contains a measurable
> amount of information.  As the amount of energy available to do
> useful work within a system decreases, entropy increases and the
> information needed to describe the remaining order in the system
> decreases at precisely the negative of the entropy increase.
> Imagine the traditional `Maxwell Demon' who opens and closes a
> little door in the wall of a closed vessel containing gas under
> pressure every time a molecule of gas within a certain velocity
> range approaches the door, thus sorting out molecules in terms
> of velocity and decreasing the entropy of the system.  Obviously
> the `demon' must be preprogrammed to do as he does.  The informa-
> tion needed to specify his operation of the door is equivalent to
> the decrease in entropy within the system which he achieves by
> that operation.  Similarly, the vast amount of information needed
> to pre-program the decrease in entropy which all living creatures
> bring into the closed system of the universe has been precoded
> upon the genes and could, conceivably, be measured.  Evolution,
> said to begin without any such pre-programming whatsoever, runs
> counter to the findings of every thermodynamicist and communica-
> tions engineer.  Every thermodynamic closed system approaches
> the heat death; and no communications engineer ever sent a mean-
> ingful message with a monkey at the keyboard."

Obviously, this engineer is not a scientist.  Here is the gross error:

What exactly is the definition of "information"?  Given the following
set of numbers, could you tell me which one(s) contains information
and which one(s) does not?

      (36, 48, 00, 02)

      (29, 81, 04, 06)

      (12, 25, 19, 65)

You can check your responses with the answers at the end of this
article.  The point of this quiz is quite simple.  Just as in the
design argument, the fundamental subject here is highly subjective.
Therefore, it is not a valid argument.
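
To make that concrete, here is a small illustrative sketch (in
Python; the two probability models are assumptions made up for
this example) showing that the Shannon information assigned to
the very same four numbers depends entirely on the model you
assume for them:

      import math

      def information_bits(sequence, prob_of):
          # Shannon information, in bits, of a sequence under an
          # ASSUMED symbol model: the sum of -log2 P(x) over its
          # symbols.
          return sum(-math.log2(prob_of(sym)) for sym in sequence)

      numbers = (36, 48, 0, 2)

      # Model A: each value is an arbitrary byte, uniform over 0..255.
      uniform_byte = lambda sym: 1.0 / 256.0

      # Model B: the receiver already expects exactly these numbers,
      # so each one is certain and carries no surprise at all.
      already_known = lambda sym: 1.0

      print(information_bits(numbers, uniform_byte))   # 32.0 bits
      print(information_bits(numbers, already_known))  # 0.0 bits

Same numbers, completely different amounts of "information",
because the answer depends entirely on what the receiver is
assumed to know already.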

By the way, engineering principles do not necessarily apply to
biological processes.  An essential part of engineering is the
practical application.  Subjective descriptions are tolerated,
even necessary, since something "useful" must eventually be
produced.  Information is a good example.  The passage quoted
above deals with information content, but it fails to give any
details of exactly what those "vast amounts" of information
really are.  It also fails to describe how that information
(which is generally considered to be strictly abstract) gets
translated into the physical molecular structures inside a
cell.  Furthermore, it fails to give a precise definition of a
"nit".  (What does a "nit" correspond to?  What information
does a "nit" supply?)  Of course, the biggest error is the
failure to define the units of information, of which there are
supposedly "vast amounts".

In any case, this argument is just the old design argument with
a new twist.  Unless you really want to pursue this (despite its
invalidity), I am not going to waste my time on further
elaborations of it.
______________________________________________________________________

Correct answers:

      (36, 48, 00, 02)  This is the numeric net address for the host
                        "lots-b" at Stanford University, California:

                                [36.48.0.2]

      (29, 81, 04, 06)  This is a set of random numbers and does not
                        contain any information.

      (12, 25, 19, 65)  These are the digit pairs that form my birth-
                        date:

                                12/25/1965
______________________________________________________________________

Keebler { hua@cmu-cs-gandalf.arpa }

js2j@mhuxt.UUCP (sonntag) (05/20/85)

> > { from: miller@uiucdcsb.Uiuc.ARPA (A Ray Miller) }
> > ...
> > Sorry, but thermodynamic models have been shown to hold in other
> > areas as well.  One of the fields which can make use of this know-
> > ledge is information theory.  It is here that we can see the the-
> > oretical grounding for evolution crumble.  For example, Dr. Ian
> > McDowell, an information engineer, wrote:
> >
> >      "Communication engineers faced with the problem of coding
> > and transmitting a maximum of information on a given channel have
> > defined quantitatively the information content of a message.  The
> > amount of information to be supplied to transmit any given message
> > using symbol x where the probability of any symbol occurring is
> > P(x) = H(x) = SIGMA P(x) . log2 P(x) which is the negative of the
> > usual entropy formula of thermodynamics.

    The formula you've given for the information content of a message
seems to have been slightly garbled.  Since I'm one of those 'communication
engineers' you reference, allow me:
    The information content of a symbol is:  -log2 P(x).  Thus if you have
only two symbols, occurring with equal likelihood, the probability of either
symbol occurring is .5, and the information content in an occurrence of a
symbol is -log2(.5), or one bit.  If you have 16 equally likely symbols,
an occurrence of any symbol carries -log2(1/16), or 4 bits.  The information
carried by a message is simply the sum of the information carried by each
symbol occurring in the message.
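    For anyone who wants to check the arithmetic, here is a minimal
sketch of those two examples (Python, purely for illustration; the
symbol probabilities are the assumed equal ones from the text above):

      import math

      def symbol_information(p):
          # Information, in bits, carried by one occurrence of a
          # symbol whose probability is p:  -log2 p.
          return -math.log2(p)

      def message_information(message, probs):
          # Information carried by a message: the sum of the
          # information carried by each symbol occurring in it.
          return sum(symbol_information(probs[sym]) for sym in message)

      print(symbol_information(0.5))       # 2 equally likely symbols: 1.0 bit
      print(symbol_information(1.0 / 16))  # 16 equally likely symbols: 4.0 bits

      probs = {'0': 0.5, '1': 0.5}         # assumed binary alphabet
      print(message_information("0110", probs))   # 4.0 bits
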
    I don't have the definition of entropy at hand right now (though I'm
the one who posted it earlier), but the definition I've seen bears no
resemblance to the basic formulas of information theory.  Could you please
supply the 'usual entropy formula of thermodynamics' which you reference?
-- 
Jeff Sonntag
ihnp4!mhuxt!js2j
    "Time has passed, and now it seems that everybody's having those dreams.
     Everybody sees himself walking around with no one else." - Dylan