[comp.ai.neural-nets] Neuron Digest V6 #24

neuron-request@HPLABS.HP.COM ("Neuron-Digest Moderator Peter Marvit") (04/10/90)

Neuron Digest	Monday,  9 Apr 1990
		Volume 6 : Issue 24

Today's Topics:
		      re: sensory input vs. memory
			    Neural Net Chips?
			   New Book announced
	      PSYCOLOQUY: 2 Positions Available (97 lines)
			     Call for Papers
			     CALL FOR PAPERS
			    TR announcements
		       Technical Report available
		       Post-Doc position available


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: re: sensory input vs. memory
From:    ashley@cheops.eecs.unsw.oz.au (Ashley Aitken)
Date:    Mon, 02 Apr 90 12:14:07 -0500


Stephen Smoliar asks "... can a phenotype born without any sensory input
possibly have any memory?"

As always, there are many ways to interpret terms such as "memories".
Certainly, if you have no sensory input (and you haven't initialized the
system from a previously sensory-capable system, as others have
suggested), then there can be no memories of sensory or external events,
i.e. there can be no sensory memories.

However, when one considers how reflexive the brain is and, in
particular, the fact that well over 90% of inputs to the cortex are from
the cortex itself [Braitenberg V., any of his articles or books, almost],
one is tempted to think that the process would continue and form memories
(categorize, and do all the other neat things the brain does) with
internal events (and by this I do not mean visceral inputs). Perhaps in
the same way we can remember "when we thought about ...."

Clearly, there would be no understanding (via memory) of motor events,
because the brain would be completely oblivious to them as external
actions (i.e. they would not exist!).

Lastly, I should add that I find it very difficult to comprehend this
world of internal sensation and how it could yield any *useful*
memories.  However, I can also understand why one with sensory input
would find it difficult to understand one without, much as we have
trouble understanding a bat's picture of the world.

Regards,
Ashley Aitken.

E-MAIL  ashley%cheops.eecs.unsw.oz@uunet.uu.net			   ARPAnet
	ashley@cheops.eecs.unsw.oz.au				   ARPAnet

POSTAL	School of Electrical Engineering and Computer Science,
	University of New South Wales,		AUSTRALIA.			

[ 
  "Without sensation and the ability to process sensation, there can be no
   memory."
   I assume here that "the ability to process sensation" refers to somewhat
   early processing of sensory input. The other extreme would seem to imply
   that even the memory process itself is processing sensation, and clearly
   without that there can be no memories.
]


------------------------------

Subject: Neural Net Chips? 
From:    shriver@usl.edu (Shriver Bruce D)
Date:    Sat, 07 Apr 90 07:51:19 -0500


Please recommend other bulletin boards that you think are also
appropriate.
===============================================================

I am interested in learning what experiences people have had using neural
network chips.  In an article that Colin Johnson did for PC AI's
January/February 1990 issue, he listed the information given below about
a number of NN chips (I've rearranged it in alphabetical order by company
name).  This list is undoubtedly incomplete (no efforts at universities
and industrial research laboratories are listed, for example) and may
have inaccuracies in it.

Such a list would be more useful if it identified the name, address,
phone number, FAX number, and electronic mail address of a contact
person at each company.

Information about the hardware and software support (interface and
coprocessor boards, prototype development kits, simulators, development
software, etc.) is missing.

Additionally, pointers to researchers who are planning to or have
actually been using these or similar chips would be extremely useful. I
am interested in finding out the range of intended applications.

Could you please send me:

  a) updates and corrections to the list
  b) company contact information
  c) hardware and software support information
  d) information about plans to use or experiences with having used 
     any of these chips (or chips that are not listed)

In a few weeks, if I get a sufficient response, I will resubmit an
enhanced listing of this information to the bulletin boards to which I
originally sent this note.  Thanks,

Bruce Shriver (shriver@usl.edu) 
=================================================================

Company:       Accotech
Chip Name:     AK107
Description:   an Intel 8051 digital microprocessor with its
               on-chip ROM coded for neural networks
Availability:  available now

Company:       Fujitsu Ltd.
Chip Name:     MB4442
Description:   one neuron chip capable of 70,000 connections per
               second
Availability:  available in Japan now

Company:       Hitachi Ltd.
Chip Name:     none yet
Description:   information encoded in pulse trains
Availability:  experimental

Company:       HNC Inc.
Chip Name:     HNC-100X
Description:   100 million connections per second
Availability:  Army battlefield computer

Company:       HNC
Chip Name:     HNC-200X
Description:   2.5 billion connections per second
Availability:  Defense Advanced Research Projects Agency (DARPA)
               contract

Company:       Intel Corp
Chip Name:     N64
Description:   2.5 connections per second 64-by-64-by-64 with
               10,000 synapses
Availability:  available now

Company:       Micro Devices
Chip Name:     MD1210
Description:   fuzzy logic combined with neural networks in its
               fuzzy comparator chip
Availability:  available now

Company:       Motorola Inc.
Chip Name:     none yet
Description:   "whole brain" chip models senses, reflex, instinct-
               the "old brain"
Availability:  late in 1990

Company:       NASA, Jet Propulsion Laboratory (JPL)
Chip Name:     none yet
Description:   synapse is charge on capacitors that are refreshed
               from RAM
Availability:  experimental

Company:       NEC Corp.
Chip Name:     uPD7281
Description:   a data-flow chip set that NEC sells on PC board
               with neural software
Availability:  available in Japan

Company:       Nestor Inc.
Chip Name:     NNC
Description:   150 million connections per second, 150,000
               connections
Availability:  Defense Dept. contract due in 1991

Company:       Nippon Telephone and Telegraph (NTT)
Chip Name:     none yet
Description:   massive array of 65,536 one-bit processors on 1024
               chips
Availability:  experimental

Company:       Science Applications International Corp.
Chip Name:     none yet
Description:   information encoded in pulse trains
Availability:  Defense Advanced Research Projects Agency (DARPA)
               contract

Company:       Syntonic Systems Inc.
Chip Name:     Dendros-1
               Dendros-2
Description:   each has 22 synapses; at least two are required, and
               any number can be used
Availability:  available now


------------------------------

Subject: New Book announced
From:    Matthew Zeidenberg <zeiden@cs.wisc.edu>
Date:    Fri, 30 Mar 90 11:48:22 -0600

My book "Neural Networks in Artificial Intelligence" has recently been
published by Ellis Horwood Ltd., Chichester UK. They are now a division
of Simon and Schuster, who will be distributing the book in the U.S.  The
price is $39.95, and the ISBN is 0-13-612185-3. It is available from your
bookseller, or direct from Simon and Schuster Mail Order Sales Dept., 200
Old Tappan Rd., Old Tappan NJ 07675. European ordering address: Ellis
Horwood Ltd., Market Cross House, Cooper St., Chichester, West Sussex,
PO19 1EB, England (Price in pounds: 24.95)

The book is a relatively concise (268 pp.) introduction to network
models, yet manages to cover most of the best-known network paradigms,
and is applications-oriented, with a chapter devoted to each of several
areas in AI.
					Matt Zeidenberg
P.S.
Here is the table of contents: 

Chapter 1
Issues in Neural Network Modeling	15
1.1.	Introduction	15
1.2.	The Statistical Nature of Connectionist Models	17
1.3.	Relevance of the Brain	19
1.4.	Distributed vs. Local Connectionism	20
1.5.	Distributed Models: A Critique	26
1.6.	Connectionist Models and the Fuzzy
	Propositional Approach	27
1.7.	Philosophical Issues	29
1.8.	Smolensky's "Proper Treatment" of 
	Connectionism	29
1.9.	Connectionism: A New Form of
	Associationism?	35

Chapter 2
Neural Network Methods for Learning and Relaxation	41
2.1.	Introduction	41
2.2.	Types of Model Neurons	46
2.3.	Types of Activation Rules	48
2.4.	Early Learning Models	49
2.5.	Hebbian and Associative Learning	51
2.6.	Kohonen's Work on Associative Learning	54
2.7.	Willshaw's Binary Associator	56
2.8.	Hopfield's Non-linear Auto-associator	56
2.9.	Modeling Neurons with Differential Equations	60
2.10.	Simulated Annealing in the Boltzmann
	Machine	62
2.11.	Learning Weights in the Boltzmann Machine	64
2.12.	Error Back-Propagation	67
2.13.	Applications of Back-propagation	70
2.14.	Learning Family Relationships	71
2.15.	Competitive Learning	73
2.16. 	Competitive Learning using
	Feed-forward Networks	73
2.17.	Competitive Learning using
	Adaptive Resonance Theory	78
2.18.	Kohonen's Self-organizing Topological Maps	82
2.19.	A Population Biology
	Approach to Connectionism	86
2.20.	Genetic Algorithms	91
2.21.	Reinforcement Algorithms	94
2.22.	Temporal Difference Methods	98
2.23.	Problem-Solving Using Reinforcement and
	 Back-propagation
2.24.	Problem-Solving Networks	107
2.25.	Extensions to Learning Algorithms	111
2.26.	Escaping From Local Minima	112
2.27.	Creating Bottlenecks	113
2.28.	Sequential Learning	115
2.29.	Remembering Old Knowledge	117
2.30.	Sequential Processing	120
2.31.	Image Compression Using a
	Back-propagation Auto-associator	122
2.32. 	Representing Recursive Structures	123

Chapter 3
Production Systems and Expert Systems	127
3.1.	Introduction	127
3.2.	A Connectionist Production System	128
3.3.	Saito and Nakano's Connectionist Expert
	System	131
3.4.	Gallant's Connectionist Expert System	134

Chapter 4
Knowledge Representation	138
4.1.	Introduction	138
4.2.	Storing Schemata in Neural Networks	139
4.3.	Storing Frames in Neural Networks	140
4.4.	Storing Schemata with a
	Complex Neural Architecture	144
4.5.	Learning Microfeatures for
	Knowledge Representation	148
4.6.	Implementing Evidential Reasoning
	and Inheritance Hierarchies	151

Chapter 5
Speech Recognition and Synthesis	157
5.1.	Introduction	157
5.2.	Comparing Algorithms for Speech
	Recognition	158
5.3.	Speech Recognition as Sequence Comparison	160
5.4.	The Temporal Flow Model	163
5.5.	The TRACE model	165
5.6.	A Model of the Print-to-speech
	Transformation Process	168
5.7.	NETtalk: Reading Aloud with
 	a Three-Layer Perceptron	172

Chapter 6
Visual Perception and Pattern Recognition	177
6.1.	Introduction	177
6.2.	Interpreting Origami Figures	178
6.3.	Recognition Cones	183
6.4.	Separating Figure from Ground	185
6.5.	Determining "What" and "Where" 
	in a Visual Scene	188
6.6.	Linking Visual and Verbal Semantics	192
6.7.	Recognizing Image-schemas	193

Chapter 7
Language Understanding	195
7.1.	Introduction	195
7.2.	Processing Finite State Grammars
	Sequentially	200
7.3.	Sentence Interpretation	205
7.4.	Word Sense Disambiguation	210
7.5.	Making Case Role Assignments	212
7.6.	The MPNP Parsing System	215
7.7.	Parsing Strings from Context-Free Grammars	218

7.8.	PARSNIP: A Parsing System
	Based on Back-propagation	221
7.9.	A Quasi-Context-Free Parsing System	223
7.10.	Parsing Using a Boltzmann Machine	225
7.11.	Learning the Past Tense	227
7.12.	A Critique of "Learning the Past Tense"	230
7.13.	Letter and Word Recognition	232

------------------------------

Subject: PSYCOLOQUY: 2 Positions Available (97 lines)
From:    Stevan Harnad <harnad%Princeton.EDU@VM.TCS.Tulane.EDU>
Date:    Sun, 01 Apr 90 14:22:35 -0400

*** PSYCOLOQUY: Sponsored on an experimental basis by the Science
Directorate of the American Psychological Association 202/955-7653 ***

1.  Postdoctoral position in cognitive neuroscience -- San Diego
2.  One-Year Position in Cognitive Psychology -- Haverford College
 -------------------------------------------------------------------------
1. Subject: POSTDOCTORAL POSITION IN COGNITIVE NEUROSCIENCE -- SAN DIEGO
From: trejo@nprdc.navy.mil (Leonard J. Trejo)

 POSTDOCTORAL POSITION IN COGNITIVE NEUROSCIENCE -- SUMMER/FALL 1990

	  The Neurosciences Division of the  Navy  Personnel  Research  and
Development Center (NPRDC), San Diego, is looking for a recent Ph. D.  to
study electrophysiological correlates of human cognition.  Ongoing
research includes neuroelectric (EEG and ERP) and neuromagnetic (evoked
field) technology.  The primary emphasis is on the improvement of on-job
performance prediction and training; however, considerable emphasis is
given to basic research issues.  Another area of interest is in real-time
electrophysiological signal processing using adaptive filters and neural
networks.  The well-equipped Neuroscience Laboratory includes two
Concurrent computer systems, several '386 PC systems, a Macintosh SE,
and other equipment, as well as extensive stimulus presentation, data
acquisition and analysis software.  Access privileges to VAX 11/780, IBM
4341, and SUN 4 systems, and the INTERNET network are also available.
An associate investigator role will be assumed by the successful
candidate and he/she will be expected to develop a line of research in
concert with Center goals.

	 Qualifications include:
	   1.  U. S. Citizenship
	   2.  Ph. D., Sc. D., or equivalent in psychology or neuroscience
	       received not more than 7 years before the date of award

	 Additional experience desired:
	   1.  Cognitive psychophysiology training / experience
	   2.  Experimental design / methodology
	   3.  Multivariate / univariate statistics
	   4.  Proficiency with UNIX and C programming

	  The position is available  through  the  Postdoctoral  Fellowship
Program funded by the U.S. Navy Office of Naval Technology (ONT) and
administered by the American Society for Engineering Education (ASEE).
Duration of the appointment is for one year, and may be renewed for up to
two additional years.  Stipends range from $34,000 to $38,000 per annum
depending upon experience.  A relocation allowance may be negotiated;
the amount is based on the personal situation of the participant.
Funds will be available for limited professional travel.

	  NPRDC is located on top of Pt. Loma, overlooking San Diego Harbor
and downtown San Diego.  Reasonably priced rental housing is available
within a 5-mile radius of the Center.  San Diego offers an excellent
climate and environment as well as a wide range of academic, military,
and industrial research institutions.

	  The application deadlines are April 1, 1990, for terms  beginning
in the summer, and July 1, 1990, for terms beginning in the fall.  For
information about the ONT Postdoctoral Fellowship Program and an
application form, please contact:

	 American Society for Engineering Education
	 Projects Office, Attention:  Bob Davis
	 11 Dupont Circle, Suite 200
	 Washington, DC 20036
	 (202) 293-7080

For information about the NPRDC Neurosciences Division, contact:

	 Dr. Leonard J. Trejo
	 Neuroscience Division, Code 141
	 Navy Personnel Research and Development Center
	 San Diego, CA 92152-6800
	 (619) 553-7711

INTERNET: trejo@nprdc.navy.mil	  UUCP: ucsd!nprdc!trejo
 ----------------------------------------------------------------

2. One-Year Position in Cognitive Psychology -- Haverford College
From: D_DAVIS%hvrford.bitnet (Douglas Davis)

Haverford College is seeking a one-year sabbatical leave replacement for
the 1990-91 academic year in the area of cognitive psychology. An
emphasis on ecological approaches would be especially appropriate.
Candidates should be able to teach Introductory Cognitive Psychology,
Experimental Design, Memory & Cognition, and an advanced course in some
other area of cognition such as Psycholinguistics, as well as supervise
senior thesis research.  Haverford College is a small, highly selective
liberal arts college in the Philadelphia suburbs. Salaries are
competitive, and fringe benefits are excellent. Haverford College is an
Equal Opportunity/Affirmative Action employer.

Interested candidates should send a vita and 3 letters of reference to:
Douglas Davis, Chair, Department of Psychology, Haverford College,
Haverford PA 19041. (215) 896-1236

------------------------------

Subject: Call for Papers
From:    Elan Moritz <71620.3203@CompuServe.COM>
Date:    01 Apr 90 23:07:09 -0400

please post and distribute to investigators of:
========================== 

        * human and machine intelligence
        * knowledge systems
        * computational linguistics
        * natural languages
        * theoretical biology
        * population genetics
        * ethology / cultural ecology
        * information storage and transfer
        * learning and teaching systems
        * philosophy of knowledge
        * philosophy and history of science

 
        ++++++++++++++++++++++++++++++++++++++++++++++++++

                        NEW JOURNAL ANNOUNCEMENT

                                   &

                            CALL FOR PAPERS

                         ......................
                         .                    .
                         .  JOURNAL of IDEAS  .
                         ......................



         IMR, BOX 16327, PANAMA CITY, FLORIDA 32406, USA

        ++++++++++++++++++++++++++++++++++++++++++++++++++


     The Institute for Memetic Research [IMR] is publishing a new journal
called 'Journal of Ideas'. The main purpose of the journal is to provide
an archival forum for discussion of the genesis, evolution, competition
and death of 'ideas' and 'memes'. The term 'idea' is one that requires
careful discussion. The original term 'meme' [pronounced: meem] is a
conceptual construct introduced by Richard Dawkins to describe units of
cultural transmission and imitation. IMR uses the term 'meme' as a point
of departure for an area we call 'Memetic Science'. Ultimately, 'meme'
requires further definition and clarification. The primary thesis of
Memetic Science is that 'ideas' and 'memes' are entities that are
functionally similar to biological genes in their ability to replicate,
mutate, and undergo natural selection.  Memetic Science seeks rigorous
quantitative foundations, theory, and experimental methodology and
measurements.

     The history of the study of 'ideas'-as-entities-by-themselves is
ancient. From philosophy we have a variety of qualitative theories
and speculations. Logic theory, philology, modern linguistics, and
computer-oriented technologies have provided a start toward understanding
the structures, grammars, and truth conditions of sentences and small
collections of sentences. Population geneticists and biologists have
provided initial models for spread of 'cultural' constructs. These models
incorporate the techniques of dominant/recessive allele spreading in
genetic pools and epidemiological approaches. Some models use compound
constructs of 'gene + culture' elements as the particulate elements that
replicate and propagate. While the contributions from these diverse
disciplines are useful, there are needs for systematic, robust and, most
importantly, quantitative approaches.
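
The epidemiological models mentioned above can be sketched in a few
lines. What follows is a loose illustration with invented parameter names
and values (nothing here comes from the announcement): an SIR-style model
in which an idea spreads by contact and active spreaders eventually lose
interest.

```python
# A loose illustration of the epidemiological approach mentioned above,
# with invented parameters: an SIR-style model in which an idea spreads
# by contact and active spreaders eventually lose interest.

def spread_idea(population, beta, gamma, steps):
    """Track S (never heard the idea), I (spreading it), R (lost interest)."""
    s, i, r = population - 1.0, 1.0, 0.0   # one initial spreader
    history = []
    for _ in range(steps):
        new_adopters = beta * s * i / population  # contact-driven adoption
        new_dropouts = gamma * i                  # spreaders losing interest
        s -= new_adopters
        i += new_adopters - new_dropouts
        r += new_dropouts
        history.append((s, i, r))
    return history

# With beta/gamma = 3, the idea reaches a large share of the population.
history = spread_idea(population=1000, beta=0.3, gamma=0.1, steps=200)
```

The ratio beta/gamma plays the role of the basic reproduction number: an
idea with beta/gamma below 1 dies out before reaching most of the
population.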

     Present day applications of Memetic Science include both human
aspects of replication, mutation, competition, spread and death of ideas
and memes, as well as their electronic analogs. The 'electronic memes'
are beneficial messages, reusable subroutines, programs that are freely
[or surreptitiously] copied and modified, computer viruses, worms, trojan
horses, etc.


     To address the needs stated above the Institute for Memetic Research
is launching the Journal of Ideas (first issue printing, September 1990).
The detailed statement of scope, pivotal references, subscription
information, and instruction for authors is available upon written
request from:

        Elan Moritz, Editor
        Journal of Ideas
        The Institute for Memetic Research
        Box 16327
        Panama City, Florida  32406, USA


        email address:    INTERNET: 71620.3203@compuserve.com
        or                    INET: 71620.3203@compuserve.com

     The Journal of Ideas will appear [initially] quarterly, and will
contain the following regular sections:

1) Invited papers, 2) Research Contributions, 3) Rapid Publications
and 4) Discussion of persistence and spread of existing 'Major Ideas'.


Only previously unpublished papers will be accepted.

Page charges for invited papers will be waived.

Brevity, and jargon accessible to interdisciplinary researchers, are encouraged.

Standard transfer of copyrights is required prior to printing.
    
     To encourage participation and discussion of this new area, IMR/JoI will 
experiment with two categories of papers. One category will be strictly
reviewed and refereed, while another will be reviewed by the editor but
not refereed. Non-refereed papers will be so marked; they will have the
advantage of rapid publication and the possible disadvantage of
archiving errors.

      To expedite processing, authors can immediately submit papers prepared
according to a standard professional society [e.g. IEEE, AIP, APS]
journal manuscript format. Three copies are required. On an experimental
basis, authors who would like to submit papers for rapid publication
using email may submit papers using the internet address [INTERNET:
71620.3203@compuserve.com]. These papers should consist of ASCII text
only, with equations built up carefully using ASCII text. Papers
submitted through email should be followed up by submitting a written
version via regular postal channels.
 

      Readers of this message are encouraged to suggest topics and individuals
[including themselves] to be considered for invited papers.

------------------------------

Subject: CALL FOR PAPERS
From:    "Centro de Inteligencia Artificial(ITESM)" <ISAI@TECMTYVM.MTY.ITESM.MX>
Organization: Instituto Tecnologico y de Estudios Superiores de Monterrey
Date:    Wed, 04 Apr 90 12:11:38 -0600


[[ Editor's Note:  Since this has been posted previously, I'm not
including the lengthy list of program participants. -PM ]]

  Call for Papers
             Third International Symposium on
               Artificial Intelligence:
 Applications of Engineering Design, Manufacturing & Management in
             Industrialized and Developing Countries

             October 22-26, 1990
                ITESM, MEXICO

   The Third International Symposium on Artificial Intelligence will
   be held in Monterrey, N.L. Mexico on October 22-26, 1990.
   The Symposium is sponsored by the ITESM (Instituto Tecnologico y
   de Estudios Superiores de Monterrey)  in cooperation with the
   International Joint Conferences on Artificial Intelligence Inc.,
   the American Association for Artificial Intelligence, the Sociedad
   Mexicana de Inteligencia Artificial and IBM of Mexico.

   Papers from all countries are sought that (1) present innovative
   applications of artificial intelligence technology to the solution
   of industrial problems in engineering design, manufacturing and
   management; (2) explore its relevance for developing countries;
   and  (3)  describe research on techniques to accomplish such
   applications.

   AREAS OF APPLICATION include but are not limited to:

  * Production planning,* resource management, * quality management,
  * automated assembly, * machine loads, * inventory control,
  * computer aided product design, *computer aided product manufacturing
  * human resources management, * forecasting, *client/customer support,
  * process control and ES, * automatic process inspection, * use of
    industrial robots, * market and competition analysis, * strategic
    planning of manufacturing, * technology management and social impact
    of AI technology in industrial environments.

  AI TECHNIQUES include but are not limited to:
   * Knowledge acquisition and representation, * natural language
     processing, * robotics , * speech recognition, * computer vision,
   * neural networks and genetic algorithms, * parallel architectures,
   * automated learning, * automated reasoning, * search and problem
     solving, * knowledge engineering tools and methodologies,
   * uncertainty management and AI programming languages.


   Persons wishing to submit a paper should send five copies written
   in English to:
                      HUGO TERASHIMA
                      PROGRAM CHAIR
                      CENTRO DE INTELIGENCIA ARTIFICIAL, ITESM
                      SUCURSAL DE CORREOS "J", C.P. 64849
                      MONTERREY, N.L. MEXICO

  The paper should identify the area and technique to which it belongs.
  An extended abstract is not required. Use a font similar to Times, size
  12, single-spaced, with a maximum of 10 pages. No papers will be accepted
  by electronic means.

  IMPORTANT DATES:
  Papers must be received by April 30, 1990. Papers received after the
  deadline will be returned unopened. Authors will be notified by
  June 30, 1990. A final copy of each accepted paper, camera ready
  for inclusion in the Symposium proceedings will be due by July 30,
  1990.
                   INFORMATION.-
                   CENTRO DE INTELIGENCIA ARTIFICIAL, ITESM.
              SUC. DE CORREOS "J", C.P. 64849 MONTERREY, N.L. MEXICO.
              TEL (52-83) 58-20-00 EXT.5134.
              TELEFAX (52-83) 58-07-71, (52-83) 58-89-31,
              NET ADDRESS:
                          ISAI AT TECMTYVM.BITNET
                          ISAI AT TECMTYVM.MTY.ITESM.MX

     GENERAL CHAIR:
             Francisco J. Cantu-Ortiz, ITESM, Mexico


------------------------------

Subject: TR announcements
From:    "Songnian Qian" <sqian%demos@lanl.gov>
Date:    Thu, 05 Apr 90 10:16:48 -0600


                        ANNOUNCEMENT

The following technical reports are available to the public:

   1) FUNCTION APPROXIMATION WITH AN ORTHOGONAL BASIS NET

                         Abstract

        An orthogonal basis net (OrthoNet) is studied for function 
        approximation.  The network transfers input space to a new 
        space in which the orthogonal basis function is easy to construct.
        This net has the advantages of fast and accurate learning,
        the ability to deal with high dimensional systems and
        has only one minimum so that local minima are not attractors
        for the learning algorithm.
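
Since the report itself must be requested, here is only a generic sketch
of the idea named in the title, not the OrthoNet architecture: fitting a
function by projecting it onto an orthonormal cosine basis. The basis
choice and numerical details are assumptions; the point is that a model
linear in its coefficients has a quadratic error surface with a single
minimum, the property the abstract emphasizes.

```python
import math

# A generic sketch of function approximation with an orthogonal basis
# (not the OrthoNet architecture itself): project the target onto an
# orthonormal cosine basis on [0, 1].  A model linear in its coefficients
# has a quadratic error surface with a single minimum.

def cosine_basis(k, x):
    """Orthonormal cosine basis on [0, 1]."""
    return 1.0 if k == 0 else math.sqrt(2.0) * math.cos(math.pi * k * x)

def fit_orthogonal(f, n_basis, n_samples=1000):
    """Coefficients c_k = <f, phi_k>, estimated by midpoint quadrature.
    Orthogonality makes each c_k a simple projection: no matrix inversion
    and no iterative search, hence fast, accurate learning."""
    xs = [(j + 0.5) / n_samples for j in range(n_samples)]
    return [sum(f(x) * cosine_basis(k, x) for x in xs) / n_samples
            for k in range(n_basis)]

def evaluate(coeffs, x):
    return sum(c * cosine_basis(k, x) for k, c in enumerate(coeffs))

f = lambda x: x * (1.0 - x)
coeffs = fit_orthogonal(f, n_basis=12)
# evaluate(coeffs, 0.5) closely approximates f(0.5) = 0.25
```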

    2) Adaptive Stochastic Cellular Automata: Theory

                          Abstract

        The mathematical concept of cellular automata has been generalized
        to allow for the possibility that the uniform local interaction
        rules that govern  conventional cellular automata are replaced by
        nonuniform local interaction rules which are drawn from the same
        probability distribution function, in order to guarantee the
        statistical homogeneity of the cellular automata system. Adaptation
        and learning in such a system can be accomplished by evolving the
        probability distribution function along the steepest descent
        direction of some objective function in a statistically unbiased
        way to ensure that the cellular automata's dynamical behavior
        approaches the desired behavior asymptotically.
        The proposed CA model has been shown mathematically to possess
        the requisite convergence property under general conditions.
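
A loose, invented illustration of the abstract's idea (not the report's
formulation): every cell draws its local update rule from a shared
probability distribution, and that distribution is adapted by stochastic
descent on an objective so the automaton's behavior approaches a target.

```python
import random

# Invented illustration: each cell applies the 'majority' rule with a
# shared probability p, otherwise a 'flip' rule; p is adapted by crude
# finite-difference descent, a stand-in for the report's statistically
# unbiased steepest-descent update.

def step(cells, p_majority, rng):
    """One CA update; each cell picks 'majority' with prob p, else 'flip'."""
    n = len(cells)
    out = []
    for i in range(n):
        left, mid, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        if rng.random() < p_majority:
            out.append(1 if left + mid + right >= 2 else 0)  # majority rule
        else:
            out.append(1 - mid)                              # flip rule
    return out

def mean_activity(p, rng, n=64, steps=30):
    """Objective statistic: fraction of live cells after a burn-in."""
    cells = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        cells = step(cells, p, rng)
    return sum(cells) / n

def adapt(target=0.5, p=0.9, lr=0.5, eps=0.05, iters=60, seed=0):
    """Descend (activity - target)^2 over the shared rule probability p."""
    rng = random.Random(seed)
    for _ in range(iters):
        hi = (mean_activity(min(p + eps, 1.0), rng) - target) ** 2
        lo = (mean_activity(max(p - eps, 0.0), rng) - target) ** 2
        p = min(1.0, max(0.0, p - lr * (hi - lo) / (2 * eps)))
    return p
```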

     3) Adaptive Stochastic Cellular Automata: Applications

                          Abstract

         The stochastic learning cellular automata model has been applied
         to the problem of controlling unstable systems.  Two example unstable 
         systems are controlled by an adaptive stochastic cellular 
         automata algorithm with an adaptive critic.  The reinforcement 
         learning algorithm and the architecture of the stochastic CA 
         controller are presented.  Learning to balance a single pole is 
         discussed in detail.  Balancing an inverted double pendulum 
         highlights the power of the stochastic CA approach.  The stochastic 
         CA model is compared to conventional adaptive control and 
         artificial neural network approaches.


      Please email your request to sqian@merlin.lanl.gov or mail your
request to
                       Songnian Qian
                       Center for Nonlinear Studies
                       Mail Stop B-258
                       Los Alamos National Laboratory
                       Los Alamos, New Mexico 87545



------------------------------

Subject: Technical Report available
From:    Melanie Mitchell <mm@cogsci.indiana.edu>
Date:    Fri, 06 Apr 90 16:28:52 -0500

The following technical report is available from the Center for Research on
Concepts and Cognition at Indiana University:

     The Right Concept at the Right Time:  How Concepts Emerge as Relevant 
                  in Response to Context-Dependent Pressures
		            (CRCC Report 42)

                Melanie Mitchell and Douglas R. Hofstadter
              Center for Research on Concepts and Cognition
                            Indiana University

                                  Abstract

     A central question about cognition is how, when faced with a situation, 
     one explores possible ways of understanding and responding to it.  In 
     particular, how do concepts initially considered to be irrelevant, or 
     not even considered at all, become relevant in response to pressures 
     evoked by the understanding process itself?  We describe a model of 
     concepts and high-level perception in which concepts consist of a central
     region surrounded by a dynamic nondeterministic "halo" of potential 
     associations, in which relevance and degree of association change as 
     processing proceeds.  As the representation of a situation is 
     constructed, associations arise and are considered in a probabilistic 
     fashion according to a "parallel terraced scan", in which many routes 
     toward understanding the situation are tested in parallel, each at a rate
     and to a depth reflecting ongoing evaluations of its promise.  We 
     describe Copycat, a computer program that implements this model in the 
     context of analogy-making, and illustrate how the program's ability to 
     flexibly bring in appropriate concepts for a given situation emerges from 
     the mechanisms that we are proposing.
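
As a very rough sketch of the "parallel terraced scan" (Copycat itself is
far richer; the route qualities and update rule below are invented for
illustration): candidate routes are explored in parallel, each receiving
processing effort in proportion to a continually revised, noisily
evaluated estimate of its promise.

```python
import random

# Invented sketch of a parallel terraced scan: effort is allocated to
# routes in proportion to an ongoing estimate of each route's promise,
# so more promising routes are explored to greater depth.

def terraced_scan(route_quality, total_steps=500, seed=1):
    """Return the index of the route explored most deeply."""
    rng = random.Random(seed)
    n = len(route_quality)
    promise = [1.0] * n  # start optimistic so every route gets some effort
    depth = [0] * n      # effort spent per route so far
    for _ in range(total_steps):
        # choose a route with probability proportional to its promise
        r, acc, chosen = rng.random() * sum(promise), 0.0, n - 1
        for i, p in enumerate(promise):
            acc += p
            if r < acc:
                chosen = i
                break
        depth[chosen] += 1
        # a noisy evaluation nudges the promise estimate toward the
        # route's hidden quality; better routes keep attracting effort
        sample = route_quality[chosen] + rng.gauss(0.0, 0.1)
        promise[chosen] = max(0.01,
                              promise[chosen] + 0.1 * (sample - promise[chosen]))
    return depth.index(max(depth))

best = terraced_scan([0.2, 0.9, 0.4])
```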

(This paper has been submitted to the 1990 Cognitive Science Society 
conference.)

To request copies of this report, send mail to 

mm@cogsci.indiana.edu or mm@iuvax.cs.indiana.edu

or 

Melanie Mitchell
Center For Research on Concepts and Cognition
Indiana University
510 N. Fess Street
Bloomington, Indiana 47408

------------------------------

Subject: Post-Doc position available
From:    Dan Kersten <kersten@eye.psych.UMN.EDU>
Date:    Mon, 09 Apr 90 10:08:48 -0500


                    UNIVERSITY OF MINNESOTA

               POST-DOCTORAL RESEARCH POSITIONS

Two research positions are available to study the linkages between the
initial stages of human perception and later recognition. The research
uses psychophysical and computational methods to understand these
problems.

Applicants must have a Ph.D.  Background in computer modeling,
psychoacoustics, visual psychophysics, perception, or supercomputers is
highly desirable.  Applicants capable of forging links between audition
and vision will be given consideration. The research will be conducted at
the Center for the Analyses of Perceptual Representations (CAPER) at the
University of Minnesota.  This Center encompasses four vision
laboratories and one hearing laboratory in the Psychology and Computer
Science departments, and includes ample facilities for simulation and
experimental studies. Center faculty members are: Irving Biederman,
Gordon Legge, Neal Viemeister, William Thompson, and Daniel Kersten.
Salary level: $26,000 to $32,000 depending on the candidate's
qualifications and experience.  The appointment is 100% time for 12
months as a post-doctoral fellow. (It may be renewable,
contingent on satisfactory performance and AFOSR funding.)  Starting date
is July 1, 1990 or as soon as possible.

Candidates should submit a vita, three letters of reference,
representative reprints and preprints, and a statement of long-term
research interests to:

	Professor Irving Biederman, 
	Department of Psychology, 
	University of Minnesota, 
	75 East River Road, 
	Minneapolis, Minnesota, 55455. 

Applications must be received by June 15, 1990.

The University of Minnesota is an equal opportunity educator and employer
and specifically invites and encourages applications from women and
minorities.

------------------------------

End of Neuron Digest [Volume 6 Issue 24]
****************************************