[comp.ai] stochastic anthropoid principle

silber@sbphy.ucsb.edu (04/03/89)

A recent sequence of submissions has discussed the "anthropic"
principle.  This 'anthropic principle' has been said to have two
forms: 'weak' (we are here, so the universe supports the evolution
of intelligent life) and 'strong' (the laws of the universe MUST
support the emergence of intelligent life).

Why not add a third (& ...), the stochastic anthropoid principle,
according to which, in a statistical ensemble of universes, some
universes are so constructed as to admit, grudgingly, the emergence
of intelligent life?