[net.philosophy] Electrons, etc., may TOO be deterministic.

shiue@h-sc1.UUCP (steve shiue) (10/30/85)

> 
>      Electrons, atoms, molecules, biochemical processes, etc..  are not
>      causally deterministic mechanisms, at least if the advances in
>      knowledge about natural phenomena since 1930 are allowable as evidence.
>         
>      Therefore, the idea that "our behavior might be deterministic" is
>      profoundly unscientific.
>      
> -michael


	This is incorrect.  In fact, the chief philosophical
controversy among physicists in quantum mechanics is the argument over
whether quantum mechanics allows a deterministic interpretation of the
evidence.  The standard interpretation of quantum mechanics, which most
scientists subscribe to, is non-deterministic.  However, there have
been so-called "hidden variable" theories proposed in recent years by
some physicists (notably David Bohm) that allow for a deterministic
scheme if the existence of unobservable variables is allowed for.  One
subset of these theories, called local hidden variable theories (local
HVT's), was recently discredited by experiments in France.  The other
subset, called nonlocal HVT's, is supported by all current evidence - but
so is the standard interpretation, and there are no experiments
currently devised that can distinguish between the two interpretations,
hence nonlocal HVT's are regarded as nothing more than a philosophical
"high ground".
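	The local-vs-quantum distinction that the French experiments tested
can be sketched numerically.  What follows is my own minimal illustration of
the CHSH form of Bell's inequality, not an account of the actual experiment;
the singlet correlation E(a,b) = -cos(a-b) and the angle choices are the
standard textbook ones.

```python
import math

def chsh(E):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')."""
    a, ap, b, bp = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
    return E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)

# Quantum-mechanical prediction for the spin-singlet state.
def E_quantum(a, b):
    return -math.cos(a - b)

# A simple local hidden variable model: each pair carries a hidden angle
# lam, and each side's outcome is a deterministic function of its OWN
# setting and lam alone (that is the locality assumption).  We average
# over lam on a fine grid.
def E_local(a, b, n=100_000):
    total = 0.0
    for i in range(n):
        lam = 2 * math.pi * i / n
        A = 1 if math.cos(a - lam) >= 0 else -1
        B = -1 if math.cos(b - lam) >= 0 else 1
        total += A * B
    return total / n

print(abs(chsh(E_quantum)))   # 2*sqrt(2) ~ 2.83: violates the bound
print(abs(chsh(E_local)))     # ~2: any local model obeys |S| <= 2
```

Any theory of this local form is stuck at |S| <= 2, while quantum mechanics
predicts (and Aspect's group measured) violations up to 2*sqrt(2); that is
what rules out the local HVT's while leaving nonlocal ones untouched.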
	It might be argued that nonlocal HVT's should be excluded on
the basis of Occam's Razor:  they assume extra variables that we have
no reason to believe in.  However, it could also be argued that
nonlocal HVT's more than compensate for this failing by adding
determinism to the picture.  There is a good review of this topic in
SCIENCE about 1-2 years back by a physics professor from Syracuse - I
can't seem to dig up my copy or remember his name.  Anyway, I'll post
the reference when I look it up.  The article is good because it
discusses all of the major philosophical points (and the French
experiments), and the math isn't too way out.

			-Steve Shiue

H.L. Mencken's definition of puritanism:  The sneaking suspicion that
somewhere, somehow, someone is having a good time.

ellis@spar.UUCP (Michael Ellis) (11/06/85)

> [Steve Shiue]  >> [Michael Ellis]

>>    Electrons, atoms, molecules, biochemical processes, etc..  are not
>>    causally deterministic mechanisms, at least if the advances in
>>    knowledge about natural phenomena since 1930 are allowable as evidence.
>>         
>>    Therefore, the idea that "our behavior might be deterministic" is
>>    profoundly unscientific.
>>      
>> -michael

>	This is incorrect.  

    Really? 

    Please note in the first paragraph, I have specified `causally
    deterministic mechanisms'. In accord with Hume's notion of `causality'
    as a spatial and temporal conjunction between cause and effect, or
    Einstein's notion as the propagation of influences locally through
    spacetime, I believe it's fair to say that modern science has thrown the
    classical doctrine of determinism (that the present state is causally
    determined by the past) into the Humean flames.  Unless you prefer to
    turn the word `cause' into swiss cheese, that is. Take your pick.

    The 1982 Aspect experiment indicates we can toss more into those flames
    than causal determinism. In particular, reductionism (the notion that
    all phenomena can be completely understood by recursive analysis into
    progressively smaller spatial and temporal elements) must be thrown into
    the Humean flames, even on the macroscopic level, unless it can be
    demonstrated that individual quantum events do not affect the high-level
    behavior under study. Note that most of our machines can be understood
    reductionistically BECAUSE WE DESIGNED THEM as hierarchical structures
    whose behavior is determined by strictly causal connections and thus
    relatively free of `unwanted' noncausal effects.
    
    Living things are notoriously nonhierarchical in their design, and I am
    hardly alone in supposing that, during the evolution of life, nonlocal
    interactions may have been put to use in very central organizing roles.
    Frankly, I am not surprised that the problems encountered in the life
    sciences have proven intractable to any primitive cause-and-effect
    analysis that sees everything mechanistically.

    Whether in anticipation of the results of the 1982 Aspect experiment or
    not, during the past few decades or so, the sciences have been
    liberating themselves from the sterility of 17th century dogma and rigid
    reductionistic constraints that set the norms of classical physics as
    THE standard of scientific excellence. QM paved the way.

    In thermodynamics, where we once heard of entropy and gloomy but
    predictable heat-death, we now hear new theories of dissipative
    structures, spontaneous emergence of higher levels of order from chaotic
    substrata, and chemical evolution (Prigogine, Stengers).  Note that
    these are the very processes that underlie macro-level biochemical
    behavior, which are in turn decided by whatever fluctuations occur at
    bifurcation points.
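    Prigogine's dissipative structures are continuous driven systems, but
    the flavor of a bifurcation point is easy to show with the textbook
    logistic map (my illustration, not anything from Prigogine): a small
    change in the control parameter r splits one stable state into two,
    and which branch the system's history follows is decided by the
    fluctuation present at the split.

```python
def settle(r, x=0.5, burn=2000, keep=4):
    """Iterate the logistic map x -> r*x*(1-x) past its transient,
    then return the next few values (the attractor it settled onto)."""
    for _ in range(burn):
        x = r * x * (1 - x)
    out = []
    for _ in range(keep):
        out.append(round(x, 6))
        x = r * x * (1 - x)
    return out

print(settle(2.8))   # below the bifurcation: one fixed point, all equal
print(settle(3.2))   # past it: the orbit alternates between two values
```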
    
    One encounters even greater indeterminism in the recent findings from
    evolutionary biology. The natural autonomy of living organisms,
    evolution driven from within, feedback loops among different levels of
    organization, the natural emergence of subjectivisms like intention,
    perception, and consciousness, and even blasphemies like future
    causality (J. Campbell, Sperry) all point to the demise of determinism in
    modern science. You might check out "Evolution at a Crossroads", MIT
    Press (1985, contributors include Mayr, Grene, Ayala..).
    
    I suppose it's understandable that computer professionals would
    number among the last holdouts. Computers are, after all, totally
    deterministic -- that is, until they break. But we shouldn't be blinded
    to the world outside of our deterministic digital world.
    
>In fact, the chief philosophical controversy among physicists in quantum
>mechanics is the argument over whether quantum mechanics allows a
>deterministic interpretation of the evidence.  The standard interpretation
>of quantum mechanics, which most scientists subscribe to, is
>non-deterministic.  However, there have been so-called "hidden variable"
>theories proposed in recent years by some physicists (notably David Bohm)
>that allow for a deterministic scheme if the existence of unobservable
>variables is allowed for.

    David Bohm, in two small but amazing books (Causality and Chance in
    Physics (1957) and Wholeness and the Implicate Order (1980)) proposes
    several visionary theories. Although I cannot claim to really understand
    his ideas, the little that makes sense to me is most pertinent to
    several recent topics. Please flame at any of my misconceptions..
    
    As you have mentioned, Bohm has proposed a "hidden variable" theory in
    which he postulates a substratum beneath the quantum level which, though
    indistinguishable in its predictions from the orthodox Copenhagen theory
    (at least in the experiments to date), diverges substantially from
    Bohr's metaphysical conclusions:
    
    (1) The dynamical qualities (position, momentum, spin) are always fully
        defined, rather than being undefined when not measured, or partially
        determined at the point of measurement.
    (2) The wave function represents a real physical entity, rather than
        being a predictive mathematical device.
    (3) The quantum world is seen as possessing objective dynamic properties
        in its own terms, rather than as a netherworld of dubious ontological
        status that interacts with a suitable classical (macroscopic)
        measuring device.
    (4) The apparent irreducible lawlessness of quantum randomness
        as well as the nonseparability of correlated particles are both
        explained as manifestations of a deeper `implicate order' of
        higher dimensionality projected into our superficially {3+1}-
        dimensional space.
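    For the record, the deterministic core of the scheme is compact enough
    to state (this is the standard textbook form of Bohm's 1952 theory, not
    a quotation from his books).  Writing the wave function in polar form,

```latex
\psi = R\, e^{iS/\hbar}, \qquad
\frac{d\mathbf{x}}{dt} = \frac{\nabla S(\mathbf{x},t)}{m}, \qquad
Q = -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2} R}{R}
```

    the Schrodinger equation splits into a continuity equation plus a
    Hamilton-Jacobi equation carrying the extra "quantum potential" Q.
    Particles then follow definite trajectories given by the middle
    (guidance) equation; the initial positions are the hidden variables,
    and Q is what transmits the nonlocal influences.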

    Some have misinterpreted Bohm's theories as a rearguard attempt to
    reinstate the deterministic world of our forefathers -- this is
    mistaken. Bohm insists that nature possesses an inexhaustible depth of
    properties and qualities that no finite system of laws and categories
    can ever express entirely; in particular, the universe has an infinite
    number of nearly autonomous levels of explanation in which the natural
    laws at any level must admit irreducible fluctuations which are only
    explicable by the laws of that level's substratum; likewise, the
    emergent properties of a higher level exist on their own terms and are
    not totally inferable from the laws of the substratum:
    
	 Each level will to some extent be influenced directly by all other
	 levels, in a way that cannot fully be expressed in terms of their
	 effects on the next lower level quantities alone. Thus, while each
	 level is strongly correlated to the mean behavior of the next lower
	 level, it has some degree of independence.. - David Bohm
    
    Reminiscent of Eastern worldviews, Bohm sees the universe as an
    undivided whole in which indeterminism necessarily results from the
    fragmentation introduced by any attempt to impose a rational scientific
    structure. Nonetheless, rather than discarding the Western analytic
    approach of science, Bohm has worked to expand science (I believe his
    work in the 50's was instrumental in the evolution of Bell's
    interconnectedness principle) beyond its historical mechanistic
    conceptions:
	 
	 Undivided wholeness means we must drop the mechanistic order.. the
	 easily accessible explicit content of consciousness is included
	 within a much greater implicit background. This in turn evidently
	 has to be contained in a yet greater background which may include
	 not only neuro-physical processes at levels which are not generally
	 conscious but also a yet greater background of unknown (and indeed
	 ultimately unknowable) depths of inwardness that may be analogous
	 to the `sea' of energy that fills the sensibly perceived `empty'
	 space. - David Bohm

>One subset of these theories, called local hidden variable theories (local
>HVT's), was recently discredited by experiments in France.
>The other subset, called nonlocal HVT, is supported by all current evidence
>- but so is the standard interpretation, and there are no experiments
>currently devised that can distinguish between the two interpretations,
>hence nonlocal HVT's are regarded as nothing more than a philosophical "high
>ground".
>
>It might be argued that nonlocal HVT's should be excluded on the basis of
>Occam's Razor:  they assume extra variables that we have no reason to
>believe in.  However, it could also be argued that nonlocal HVT's more than
>compensate for this failing by adding determinism to the picture.. 

    I would be interested in hearing about other nonlocal hidden variable
    theories besides Bohm's. I believe that De Broglie's `pilot wave'
    model (which was a starting point in the evolution of Bohm's theory)
    is another HVT that remains viable after the 1982 Aspect experiment.
    Does anyone know about De Broglie's theory?

    Several final comments about the Bohm theory:

      * It does not claim that the world is deterministic, rather it only
        offers deterministic laws within the quantum domain; indeterminism
	occurs in all domains due to the partial autonomy of levels.
	
      * Its predictions indeed concur with the experiments to date; however,
	Bohm claims that his theory would yield results that differ from
	the standard quantum theory if, say, the apparatus for an EPR-style
	experiment were rotated very rapidly.
	
      * Having shown that all attempted proofs of the uniqueness of the
	Copenhagen orthodoxy are invalid, Bohm has forged ahead to
	create the foundations of a theory which attempts to fully
        embrace the microscopic domain in its own terms.

    I do not think that hard-boiled Occam-wielding scientists would care
    much at all for his ideas.  What is unusual about Bohm's holism is
    his realism (he has been referred to as a cross between Krishnamurti and
    Einstein, both of whom were personal friends of his). 
    
    Critics refer to his model universe as a giant, undifferentiated,
    mystical blob, a "Bohm monster", to which Bohm answers "unless we
    understand the subtleties of wholeness, we will not only divide what
    cannot be divided, we'll try to unite what cannot be united".

-michael

gwyn@brl-tgr.ARPA (Doug Gwyn <gwyn>) (11/08/85)

>     Please note in the first paragraph, I have specified `causally
>     deterministic mechanisms'. In accord with Hume's notion of `causality'
>     as a spatial and temporal conjunction between cause and effect, or
>     Einstein's notion as the propagation of influences locally through
>     spacetime, I believe it's fair to say that modern science has thrown the
>     classical doctrine of determinism (that the present state is causally
>     determined by the past) into the Humean flames.  Unless you prefer to
>     turn the word `cause' into swiss cheese, that is. Take your pick.

Since when is Hume relevant to physics?  His idea of causality
is not necessarily correct; there have been serious criticisms
of it.  Certainly few if any theoretical physicists pay any
attention to Hume's work.

I was unaware that Einstein had such a notion of causality;
could you provide a reference?  Einstein believed in a form
of determinism in physics, but I doubt he would have agreed
with your attempt to define what it means.  He was quite
aware of the degree to which physical "laws" (theories,
actually) constrain the behavior of systems and to what
extent systems are not constrained by the laws.  In the
case of classical (non-quantum) field theories, he even
developed a specific technique for quantifying this.

Neither of the two items you mention preclude deterministic
evolution of physical systems.  However, quantum theory (if
correct!) does appear to do so.

>     The 1982 Aspect experiment indicates we can toss more into those flames
>     than causal determinism. In particular, reductionism (the notion that
>     all phenomena can be completely understood by recursive analysis into
>     progressively smaller spatial and temporal elements) must be thrown into
>     the Humean flames, even on the macroscopic level, unless it can be
>     demonstrated that individual quantum events do not affect the high-level
>     behavior under study. Note that most of our machines can be understood
>     reductionistically BECAUSE WE DESIGNED THEM as hierarchical structures
>     whose behavior is determined by strictly causal connections and thus
>     relatively free of `unwanted' noncausal effects.

No serious thinker proposes reductionism as you state it.
Statistical physicists are quite aware of the problems.

>     Living things are notoriously nonhierarchical in their design, and I am
>     hardly alone in supposing that, during the evolution of life, nonlocal
>     interactions may have been put to use in very central organizing roles.
>     Frankly, I am not surprised that the problems encountered in the life
>     sciences have proven intractable to any primitive cause-and-effect
>     analysis that sees everything mechanistically.

Mysticism has had even less success in the life sciences.

Cause-and-effect does not imply mechanistic.  To say that
certain things happen acausally is tantamount to giving
up any attempt to understand them.  You may of course do
so, but you should not call that "science".

>     Whether in anticipation of the results of the 1982 Aspect experiment or
>     not, during the past few decades or so, the sciences have been
>     liberating themselves from the sterility of 17th century dogma and rigid
>     reductionistic constraints that set the norms of classical physics as
>     THE standard of scientific excellence. QM paved the way.

I doubt very much that many scientists from 1900 onward
know what those dogmas and constraints are.  The so-called
philosophers of science have been very slow to catch up.

>     I suppose it's understandable that computer professionals would
>     number among the last holdouts. Computers are, after all, totally
>     deterministic -- that is, until they break. But we shouldn't be blinded
>     to the world outside of our deterministic digital world.

Dijkstra's classic "A Discipline of Programming" makes
explicit use of nondeterminism as a tool for constructing
correctness proofs for programs.
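Dijkstra's point is easy to illustrate.  In his guarded-command language an
alternative construct may execute ANY one command whose guard is true, and a
correctness proof must cover every permissible choice.  A rough rendering in
Python (Dijkstra's book uses his own notation, not Python):

```python
import random

def guarded_if(*clauses):
    """Dijkstra-style alternative construct: evaluate all guards, then
    execute ONE arbitrarily chosen command whose guard holds.
    Aborts (here: raises) if no guard is true."""
    ready = [cmd for guard, cmd in clauses if guard]
    if not ready:
        raise RuntimeError("abort: no guard true")
    return random.choice(ready)()

def maximum(x, y):
    # Overlapping guards: when x == y either branch may run, and the
    # result is correct either way.  That is exactly the proof
    # obligation: every enabled choice must establish the postcondition.
    return guarded_if(
        (x >= y, lambda: x),
        (y >= x, lambda: y),
    )

print(maximum(3, 3))  # 3, whichever guard the construct picks
```

The nondeterminism here is a specification device, not physics: the program
is proved correct without ever pinning down which choice the machine makes.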

>     Some have misinterpreted Bohm's theories as a rearguard attempt to
>     reinstate the deterministic world of our forefathers -- this is
>     mistaken. Bohm insists that nature possesses an inexhaustible depth of
>     properties and qualities that no finite system of laws and categories
>     can ever express entirely; in particular, the universe has an infinite
>     number of nearly autonomous levels of explanation in which the natural
>     laws at any level must admit irreducible fluctuations which are only
>     explicable by the laws of that level's substratum; likewise, the
>     emergent properties of a higher level exist on their own terms and are
>     not totally inferable from the laws of the substratum:

This has been obvious to nearly everybody; no reputable
biologist tries to ignore those attributes of reality
that pertain specifically to his field but not to the
supporting fields of chemistry or physics.

Bohm really was trying to salvage determinism (but not
that of our forefathers).

There is much more that could be said to refute much
of the mystical nonsense in the latter part of the
article, but I tire of this.  Could you "philosophers"
PLEASE quit cross-posting this sort of stuff to the
technical newsgroups such as net.math and net.physics?