[comp.ai.neural-nets] Seminar Announcement

hudak@siemens.UUCP (Michael J. Hudak) (12/03/87)

                         SEMINAR ANNOUNCEMENT


                       Professor David Rumelhart
                       Department of Psychology
                       Stanford University
                       Palo Alto, CA

Title:           Learning and Generalization in PDP Networks

Location:        Siemens Corporate Research & Support, Inc.
                 Princeton Forrestal Center
                 105 College Road East
                 Princeton, NJ   08540-6668      (609/734-3373)

                 3rd floor Multi-Purpose Room


Date:            Wednesday, December 9, 1987

Time:            10:00 am   (refreshments: 9:45)

hudak@siemens.UUCP (Michael J. Hudak) (12/11/87)

                         SEMINAR ANNOUNCEMENT


                       Alan Lapedes
                       Theoretical Division
                       Los Alamos National Laboratory

Title:           Non-Linear Signal Processing with Neural Networks

Location:        Siemens Corporate Research & Support, Inc.
                 Princeton Forrestal Center
                 105 College Road East
                 Princeton, NJ   08540-6668      


                 3rd floor Multi-Purpose Room


Date:            Friday, 11 December 1987

Time:            2:00 pm   (refreshments: 1:45)

For more information, call Mike Hudak:  609/734-3373

hudak@siemens.UUCP (Michael J. Hudak) (01/07/88)

                         SEMINAR ANNOUNCEMENT

Speaker:       Peter Cariani
               Systems Science Dept., Thomas J. Watson School of Engineering
               State University of New York at Binghamton

Title:         Structural Preconditions for Open-Ended Learning
               through Machine Evolution

Location:      Siemens Corporate Research & Support, Inc.
               3rd floor Multi-Purpose Room
               Princeton Forrestal Center
               105 College Road East
               Princeton, NJ   08540-6668      

Date:          Thursday, 14 January 1988

Time:          10:00 am   (refreshments: 9:45)

For more information, call Mike Hudak:  609/734-3373

                               Abstract

One of the basic problems confronting artificial life simulations is the
apparent open-ended nature of structural evolution, classically known as
the problem of emergence.  Were it possible to construct devices with
open-ended behaviors and capabilities, fundamentally new learning
technologies would become possible.  At present, none of our devices or
models are open-ended, due to the nature of their design and construction.

The best devices we have, in the form of trainable machines, neural net
simulations, Boltzmann machines, and Holland-type adaptive machines,
exhibit learning within the categories fixed by their feature spaces.
Learning occurs through the performance-dependent optimization of
alternative I/O functions.  Within the adaptive-machine paradigm of these
devices, the measuring devices, the feature spaces, and hence the
real-world semantics of such devices are stable.  Such machines cannot
create new primitive categories; they will not expand their feature and
behavior spaces.
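
As a purely illustrative sketch of this fixed-feature-space regime (the
detector set and function names below are hypothetical and are not taken
from any of the systems named above), a learner of this kind re-weights
a fixed set of feature detectors to optimize its I/O mapping, but never
adds a detector:

    # Minimal sketch, in Python: adaptation within a FIXED feature space.
    # The detectors are given a priori; learning only re-weights them.

    def features(x):
        # Fixed, designer-chosen "measuring devices" (primitive categories).
        return [1.0, x, x * x]

    def train(samples, epochs=200, rate=0.05):
        w = [0.0, 0.0, 0.0]              # weights over the fixed features
        for _ in range(epochs):
            for x, target in samples:
                phi = features(x)
                out = sum(wi * fi for wi, fi in zip(w, phi))
                err = target - out       # performance-dependent signal
                w = [wi + rate * err * fi for wi, fi in zip(w, phi)]
        return w  # only the I/O mapping changed; the feature space did not

    # e.g. train([(0.0, 1.0), (1.0, 2.0), (2.0, 5.0)]) approximates
    # y = x*x + 1 within these features, but no amount of training can
    # make the device perceive an aspect of x its detectors do not measure.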

Over phylogenetic time spans, however, organisms have evolved new sensors
and effectors, allowing them to perceive more and more aspects of their
environments and to act in more and more ways upon those environments.
This involves a whole new level of learning: the learning of new primitive
cognitive and behavioral categories.  In terms of constructible devices,
this level of learning encompasses machines which construct and select
their own sensors and effectors based upon their real-world performance.
The semantics of the feature and behavior spaces of such devices thus
changes so as to optimize their effectiveness as categories of perception
and action.  Such devices construct their own primitive categories, their
own primitive concepts.  Evolutionary devices could be combined with
adaptive ones to optimize both the primitive categories and the I/O
mappings within those categories.
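
A loose, purely computational caricature of this two-level scheme (the
names CANDIDATE_DETECTORS, fit_and_score, mutate, and evolve below are
hypothetical, not the speaker's machinery) wraps an adaptive inner loop,
like the sketch above, in an outer loop that varies which detectors exist
and selects variants by performance.  The following paragraph explains
why such a simulation, which can only select among detectors that were
encodable in advance, still falls short of a true evolutionary device:

    import random

    # Hypothetical pool of detectors the outer loop may draw upon.
    CANDIDATE_DETECTORS = {
        "bias":   lambda x: 1.0,
        "linear": lambda x: x,
        "square": lambda x: x * x,
        "cube":   lambda x: x ** 3,
    }

    def fit_and_score(names, samples, epochs=200, rate=0.02):
        # Inner, adaptive level: re-weight the current detectors (as in
        # the earlier sketch) and report the squared error achieved.
        w = {n: 0.0 for n in names}

        def predict(x):
            return sum(w[n] * CANDIDATE_DETECTORS[n](x) for n in names)

        for _ in range(epochs):
            for x, target in samples:
                err = target - predict(x)
                for n in names:
                    w[n] += rate * err * CANDIDATE_DETECTORS[n](x)
        return sum((target - predict(x)) ** 2 for x, target in samples)

    def mutate(names):
        # Toggle one candidate detector in or out, keeping at least one.
        variant = list(names)
        pick = random.choice(list(CANDIDATE_DETECTORS))
        if pick in variant and len(variant) > 1:
            variant.remove(pick)
        elif pick not in variant:
            variant.append(pick)
        return variant

    def evolve(samples, generations=30):
        # Outer, "evolutionary" level: vary the detector set itself and
        # keep whichever variant performs better on the task.
        current = ["bias"]
        best = fit_and_score(current, samples)
        for _ in range(generations):
            variant = mutate(current)
            score = fit_and_score(variant, samples)
            if score < best:
                current, best = variant, score
        return current, best

    # e.g. evolve([(0.0, 1.0), (1.0, 2.0), (2.0, 5.0)]) typically settles
    # on a detector set that can drive the fitting error near zero.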

Evolutionary machines cannot be constructed through computations alone.
The construction of new primitive categories necessitates that new
physical measuring structures and controls come into being.  While the
behavior of such devices can be represented to a limited degree by formal
models, those models cannot themselves create new categories vis-a-vis
the real world, and hence are insufficient as category-creating devices
in their own right.  Computations must be augmented by the physical
construction of new sensors and effectors, implementing processes of
measurement and control, respectively.  This construction process must be
inheritable and replicable, hence encodable into symbolic form, yet it
must also involve the autonomous, unencoded dynamics of the matter itself.

The paradigmatic example of a natural construction process is protein
folding.  A one-dimensional string of nucleotides, itself a discrete,
rate-independent symbolic structure, is transformed into continuous,
rate-dependent dynamics having biological function through the action of
the physical properties inherent in the protein chain itself.  The
functional properties of speed, specificity, and reliability of action
are thus achieved with symbolic constraints but without the explicit
direction of rules.