[mod.ai] Seminar - A Stochastic Genetic Search Method

David.Ackley@C.CS.CMU.EDU.UUCP (03/02/87)

                          David H. Ackley
       Carnegie Mellon Computer Science doctoral dissertation defense
                  Tuesday, February 24, 1987 at 1pm
                          Wean Hall 5409

              "Stochastic iterated genetic hillclimbing"

                               Abstract

In the "black box function optimization" problem, a search strategy is
required to find an extremal point of a function without knowing the
structure of the function or the range of possible function values.
Solving such problems efficiently requires two abilities.  On the one
hand, a strategy must be capable of "learning while searching": It must
gather global information about the space and concentrate the search in
the most promising regions.  On the other hand, a strategy must be
capable of "sustained exploration": If a search of the most promising
region does not uncover a satisfactory point, the strategy must redirect
its efforts into other regions of the space.
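To make the black-box setting concrete, here is a minimal sketch (not from the dissertation; the names and the baseline strategy are illustrative assumptions): the searcher may only evaluate the function at chosen points, and knows nothing about its structure or range. Pure random sampling is the simplest such strategy, and the baseline that "learning while searching" is meant to beat.

```python
import random

def random_search(f, dim, steps, rng=None):
    """Baseline black-box search: sample random bit strings, keep the best.

    The searcher sees only the values f returns -- it exploits no
    knowledge of f's structure or range, matching the black-box setting.
    (Illustrative sketch only, not the dissertation's method.)
    """
    rng = rng or random.Random(0)
    best_x, best_v = None, float("-inf")
    for _ in range(steps):
        x = tuple(rng.randint(0, 1) for _ in range(dim))
        v = f(x)                      # the only information available
        if v > best_v:
            best_x, best_v = x, v
    return best_x, best_v

# Example black box: "one-max" counts 1-bits; its maximum is all ones.
one_max = lambda bits: sum(bits)
```

Random sampling never concentrates effort, so it illustrates by contrast why a practical strategy must gather global information about the space.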

This dissertation describes a connectionist learning machine that
produces a search strategy called "stochastic iterated genetic
hillclimbing" (SIGH).  Viewed over a short period of time, SIGH displays
a coarse-to-fine searching strategy, like simulated annealing and
genetic algorithms.  However, in SIGH the convergence process is
reversible.  The connectionist implementation makes it possible to
"diverge" the search after it has converged, and to recover
coarse-grained information about the space that was suppressed during
convergence.  The successful optimization of a complex function by SIGH
usually involves a series of such converge/diverge cycles.
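As a loose analogy to such converge/diverge cycles (this is an illustrative sketch, not the SIGH algorithm or its connectionist implementation), one can maintain a probability vector over bit values: sharpening it toward good samples models convergence, and relaxing it back toward uniform while retaining a coarse bias models reversible divergence.

```python
import random

def converge_diverge_search(f, dim, cycles=4, samples=60, rng=None):
    """Converge/diverge cycles over a bit-probability vector.

    Analogy only (assumed setup, not SIGH itself): within a cycle, p
    sharpens toward the best point found ("convergence"); between
    cycles, p is partially reset toward uniform ("divergence") while
    keeping a coarse bias from the previous cycle.
    """
    rng = rng or random.Random(1)
    p = [0.5] * dim                       # uniform: maximum uncertainty
    best_x, best_v = None, float("-inf")
    for _ in range(cycles):
        for _ in range(samples):
            x = [1 if rng.random() < pi else 0 for pi in p]
            v = f(x)
            if v > best_v:
                best_x, best_v = x, v
            # converge: nudge p toward the best point seen so far
            p = [0.9 * pi + 0.1 * xi for pi, xi in zip(p, best_x)]
        # diverge: relax halfway back toward uniform (0.5), so the
        # search can escape while remembering where it has been
        p = [0.5 * pi + 0.25 for pi in p]
    return best_x, best_v
```

The key property mirrored here is reversibility: unlike a one-way annealing schedule, the distribution can widen again after it has narrowed.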

SIGH can be viewed as a generalization of a genetic algorithm and a
stochastic hillclimbing algorithm, in which genetic search discovers
starting points for subsequent hillclimbing, and hillclimbing biases the
population for subsequent genetic search.  Several search
strategies---including SIGH, hillclimbers, genetic algorithms, and
simulated annealing---are tested on a set of illustrative functions and
on a series of graph partitioning problems.  SIGH is competitive with
genetic algorithms and simulated annealing in most cases, and markedly
superior on a function where the uphill directions usually lead \away/
from the global maximum.  In that case, SIGH's ability to pass
information from one coarse-to-fine search to the next is crucial.
Combinations of genetic and hillclimbing techniques can offer dramatic
performance improvements over either technique alone.
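A generic hybrid in the spirit described above (an assumed sketch, not SIGH: names, operators, and parameters are invented for illustration) can be built by letting genetic recombination propose starting points and letting hillclimbing refine each one, with the refined points forming the next population.

```python
import random

def hillclimb(f, x):
    """Greedy bit-flip hillclimbing: accept any single-bit flip that
    improves f, repeating until no flip helps (a local maximum)."""
    x, v = list(x), f(x)
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            x[i] ^= 1
            v2 = f(x)
            if v2 > v:
                v, improved = v2, True
            else:
                x[i] ^= 1             # revert the unhelpful flip
    return x, v

def genetic_hillclimb(f, dim, pop_size=8, generations=10, rng=None):
    """Hybrid sketch: genetic search discovers starting points for
    hillclimbing, and the hillclimbed results bias the population
    used by subsequent genetic search."""
    rng = rng or random.Random(2)
    pop = [[rng.randint(0, 1) for _ in range(dim)] for _ in range(pop_size)]
    pop = [hillclimb(f, x)[0] for x in pop]
    for _ in range(generations):
        children = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)
            cut = rng.randrange(1, dim)            # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(dim)] ^= 1         # point mutation
            children.append(hillclimb(f, child)[0])
        # selection: keep the best pop_size of parents and children
        pop = sorted(pop + children, key=f, reverse=True)[:pop_size]
    best = max(pop, key=f)
    return best, f(best)
```

Each half compensates for the other's weakness: crossover jumps between basins that hillclimbing alone cannot leave, while hillclimbing supplies the fine-grained refinement that genetic search alone performs slowly.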