[comp.ai.neural-nets] Neuron Digest V5 #34

neuron-request@HPLABS.HP.COM ("Neuron-Digest Moderator Peter Marvit") (08/22/89)

Neuron Digest	Monday, 21 Aug 1989
		Volume 5 : Issue 34

Today's Topics:
			 Conjugate Gradient Methods
			   non-iterative training
			      RCS...now what?
				  IJCNN89
			       Post-Doc in AI
		      Re: neural nets in manufacturing
			     Goles and Vichniac
		    Simulator software for PC or VAX???
		     Scaling Performance Data Requested
		   Alkon's SA article on NN: any papers?
	       Comments Requested : NNs in Stochastic Control
			 Questions about delta rule
		       Re: Questions about delta rule
		 References for the Broom Balancing Problem
	       Re: References for the Broom Balancing Problem
	       Re: References for the Broom Balancing Problem
			   COMPLEX SYSTEMS (1988)
			 COMPLEX SYSTEMS (Feb 1989)


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: Conjugate Gradient Methods
From:    tedwards@cmsun.nrl.navy.mil (Thomas Edwards)
Date:    Fri, 14 Jul 89 10:53:22 -0400 


In my research into neural methods on a massively parallel SIMD machine, I
came across an article ("Efficient Parallel Learning Algorithms for Neural
Networks" Kramer, Sangiovanni-Vincentelli) which had a comparison of neural
techniques on the Connection Machine.

What first struck me was how much faster Conjugate Gradient methods reached
the proper answer in comparison to "vanilla" Back Propagation methods.  C-G
methods were often an order of magnitude faster than Steepest Descent, and
several orders of magnitude faster than Backprop for large learning
problems.

If we examine the strategy of Backprop vs. C-G, we can see why the C-G
methods are clearly superior (see _Numerical_Recipes_).  However, there
seems to be a dearth of articles which describe in detail how to set up
neural systems using C-G.

The article I mentioned discusses C-G implementation to a point.  It
describes how to obtain the proper direction of search based upon the
gradient and past search directions, but does not describe the line
minimization of the function in the search direction in detail.
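
To make the question concrete, here is a rough sketch (in Python with
numpy -- my own paraphrase, not the Kramer/Sangiovanni-Vincentelli
implementation) of a Polak-Ribiere C-G loop.  E and gradE stand for
whatever your network's forward and backward passes compute over the
training set, and the backtracking search is only a crude stand-in for
the bracketing-plus-Brent line minimization that _Numerical_Recipes_
covers:

    import numpy as np

    def conjugate_gradient(E, gradE, w, n_iters=1000, tol=1e-6):
        """Minimize E(w) by Polak-Ribiere conjugate gradient."""
        g = gradE(w)
        d = -g                              # first direction: steepest descent
        for _ in range(n_iters):
            alpha = line_minimize(E, w, d)  # step size along d
            w = w + alpha * d
            g_new = gradE(w)
            if np.linalg.norm(g_new) < tol:
                break
            # Polak-Ribiere beta; clipped at zero so a bad direction
            # restarts the method with plain steepest descent
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))
            d = -g_new + beta * d
            g = g_new
        return w

    def line_minimize(E, w, d, alpha=1.0, shrink=0.5, n_steps=30):
        """Crude backtracking stand-in for a true line minimization:
        halve the step until the error actually decreases."""
        E0 = E(w)
        for _ in range(n_steps):
            if E(w + alpha * d) < E0:
                break
            alpha *= shrink
        return alpha

The quality of the line minimization matters here: conjugacy of
successive directions only holds when each step really does minimize
the error along d, which is why the line search deserves more detail
than it usually gets.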

Maybe I have just had bad luck with finding articles on this subject, but it
seems to me that if we had more Connectionists skilled in the C-G methods,
we would all benefit.  Is there, somewhere out there, an article with a rich
description of C-G Neural Network implementation?

 -Thomas Edwards
 tedwards@cmsun.nrl.navy.mil
 ins_atge@jhuvms.BITNET

------------------------------

Subject: non-iterative training
From:    Douglas G. Danforth <danforth@riacs.edu>
Date:    Fri, 14 Jul 89 08:43:51 -0700 



Dave Kemp writes ...
>...Is there any published work describing applications of non-linear algebra
>to NN training, and the types of problems to which it might be suited.

     The Learning Systems Division at the Research Institute for Advanced
Computer Science (RIACS), an Institute of the Universities Space Research
Association, is dedicated to applying the theory of Sparse Distributed
Memory (SDM) developed by Pentti Kanerva to a wide range of applications
(Speech, Vision, Robotics).  SDM is a 3-layer neural network with certain
constraints that make it unnecessary to use back-propagation for training
weights.  The constraints are that the input and output patterns are binary
vectors and that the connection strengths between the first and second
layers are fixed but randomly chosen.  The size of the patterns (number of
nodes in the first and third layers) is large, e.g. 256 to 1,000 bits.  The
size of the hidden layer is even larger, e.g. 8,000 and up.  When operating
under these conditions the NN behaves as a memory which can approximate
"table lookups" with generalization.  The standard SDM model modifies its
weights in a 1-step process, simply incrementing or decrementing the weights
between the second and third layers depending upon (1) whether a node in the
second layer has been activated and (2) whether a bit in the desired output
pattern is on or off.  See Kanerva, P. (1988), Sparse Distributed Memory,
Cambridge, MA: MIT Press.
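
In outline, the standard write/read cycle looks something like the
following sketch (the sizes and the activation radius R are
illustrative free parameters here, not values prescribed by the
theory):

    import numpy as np

    rng = np.random.default_rng(0)
    N, M, R = 256, 8000, 111   # pattern bits, hidden nodes, Hamming radius

    # First-to-second layer: fixed, randomly chosen location addresses.
    loc_addr = rng.integers(0, 2, size=(M, N))
    counters = np.zeros((M, N), dtype=int)   # second-to-third layer weights

    def active(addr):
        """Hidden nodes whose address lies within Hamming distance R."""
        return np.count_nonzero(loc_addr != addr, axis=1) <= R

    def write(addr, data):
        """1-step training: increment where the desired output bit is on,
        decrement where it is off, at every activated node."""
        counters[active(addr)] += 2 * data - 1   # map {0,1} to {-1,+1}

    def read(addr):
        """Approximate table lookup with generalization: pool the
        counters of the activated nodes, threshold each bit at zero."""
        return (counters[active(addr)].sum(axis=0) >= 0).astype(int)

Writing a pattern as both address and data gives the autoassociative
version of the memory.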

    I plan to present a paper (if I can get there!) at the '89 NIPS Denver
conference describing how the standard SDM model can be modified to yield
perfect recall.  Note, however, that when one does this the neural net
loses any ability to generalize.  There is an uncertainty principle for any
NN between memory capacity and the generalizability of the net.  The rule
for obtaining perfect recall can be written down analytically and, indeed,
is given by the inverse of a matrix.  The rule can be expressed as an
algorithm stating how to modify the weights in a 1-step process.  The
manuscript for this work is in progress.  I hope this helps.
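
In the meantime, the flavor of such a rule can be had from a generic
pseudo-inverse solution (a sketch of the general idea only, not the
exact rule from the manuscript):

    import numpy as np

    def perfect_recall_weights(H, T):
        """H: hidden-layer activations, one row per stored pattern;
        T: the desired output patterns.  W = pinv(H) @ T gives
        H @ W = T exactly whenever H has full row rank -- perfect
        recall, at the price of the ability to generalize."""
        return np.linalg.pinv(H) @ T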



	Douglas G. Danforth
	RIACS M/S 230-5
	NASA Ames Research Center
	Moffett Field, CA 94035
	danforth@riacs.edu


------------------------------

Subject: RCS...now what?
From:    ddickey@nic.MR.NET (Dan Dickey)
Date:    Fri, 14 Jul 89 14:50:46 -0500 

Ok, I retrieved RCS and it works great on my Sun.  I also got it to compile
and run on my Cray.

Now what do I do?  Do any of you out there have sample/example code I could
run/test?  I'd like to learn about NNs; I've retrieved several of the back
issues of the Digest and will be looking up the references.  I'm mainly
interested in applying NNs to language and AI work.  Is this feasible?  Are
there specific references in this area?

	-Dan A. Dickey		ddickey%ferrari.cray.com@gath.cray.com

------------------------------

Subject: IJCNN89
From:    dporcino@sol.UVic.ca
Date:    02 Aug 89 16:34:00 -0700 


A number of points struck me as indicating future research directions
and trends in neurocomputing:

 - Solutions to the local minima problem of back propagation.
    eg: introducing a noise component, à la simulated annealing
        
 - Solutions to the question of how many hidden units should be used.
    eg: start with two hidden units, backprop until the error becomes less
        than some threshold, add another hidden unit, backprop, add,
        backprop, etc., until finally the last neuron offers no advantage.
        It's claimed that this method avoids local minima.  (A rough sketch
        of this loop appears after the list.)
        Ref is??? (It was a poster session, and I don't have my proceedings
        nearby...)

 - Chaos came up repeatedly in any number of different sessions.
     "I think this is a chaotic limit cycle."
     "It looks like Freeman's olfactory models."
     "I think chaos actually CREATES new information."

 - In a similar vein, Fractals also came up.
     eg: Grossberg's masking fields have a self-similar architecture
         Pellionisz's neuron growth models

 - The tools being developed for these two fields will probably become
  more and more important as time goes on.

 - There was a much greater emphasis on practical applications, and some
  of the most impressive results are, and will be, those that replace more
  traditional control approaches and inverse kinematics.

 - More and more application of real neuroscience to models and applications:
     eg: Beer et al. and their walking cockroach, based on the hypothetical
         model that appeared in Scientific American, Dec. 1976, Pearson (I
         think it was) "The Control of Walking."

 - What struck me most were the pervasive feeling that "BackProp will
  solve anything, even though there might be a nicer way to solve a
  particular problem," and the ubiquity of "chaos."

I wonder if Back Propagation and Hopfield nets might provide a new set of
Fundamentals for Sixth Generation Computing - sort of like Computer
Science's Turing Machines which can solve anything computable, but
not necessarily nicely.  This would point the way to some benchmarks,
and some rigorous proofs about neural computability...

 - Nick Porcino

p-mail: 2039 Goldsmith St. Victoria BC, Canada V8R 1T3

------------------------------

From:    COSCWKY@otago.ac.nz
Date:    Mon, 07 Aug 89 14:44:00 +0000 
Subject: Post-Doc in AI

Dear Sir,
         Would you be so kind as to include this message in the
     neuron-digest asappp (as soon as practically possible, please). 


         POST-DOCTORAL FELLOWSHIP in ARTIFICIAL INTELLIGENCE

         Applications are invited for a 2-year appointment as a
     post-doctoral research fellow to work within the Artificial
     Intelligence Group, Department of Computer Science,
     University of Otago, New Zealand.

         Candidates should preferably be experienced in one of
     the areas below, but those who have a strong programming
     background and interest in AI research are also encouraged
     to apply.

     (1) Vision research based upon psychophysical findings -
         Current work is concerned with the design of suitable
         edge detectors, the computation of depth information,
         and the use of neural networks.

     (2) Large-scale space perception - Current work is
         concerned with the development of a computational
         theory of cognitive maps and the design of planning
         algorithms using commonsense reasoning.

     (3) Expert systems - both practical and theoretical issues.
         Current work focuses on the design of intelligent
         tutoring systems, the implementation of Bayesian
         inference networks, and the construction of practical
         expert systems.

     (4) Children's understanding of natural languages - A new
         project aimed at developing a computational theory to
         explain how infants can understand natural language.

         Intending applicants should write for further
     information to the Registrar, P.O. Box 56, Dunedin, New
     Zealand.  Informal enquiries may be directed to W.K. Yeap,
     AI Laboratory, Department of Computer Science, University
     of Otago, P.O. Box 56, Dunedin, New Zealand.
     (e-mail: Internet: Coscwky@otago.ac.nz)

         -----------------------------------------------
        Many thanks, Paul Naylor (Research Assistant to Dr.Yeap)
                     Computer Science Dept.
                     University of Otago,
                     Dunedin, N.Z.

------------------------------

Subject: Re: neural nets in manufacturing
From:    Rajit Gadh <rg1w+@andrew.cmu.edu>
Date:    Wed, 09 Aug 89 00:15:16 -0400 

Hi,

I am trying to make a list of people working in Neural Nets for
manufacturing, as that is my field of interest.  I would like you to put
this letter in the neuron digest, and anywhere else that you think it might
get read by researchers everywhere.

If anyone reading this letter is interested in getting their name on this
list, they are welcome to send me their name.  I would appreciate it if they
included in the mail a topic, their univ/company/research lab/..., a
paragraph describing their research interests, and anything they think
would be useful for compiling this list.  If they wish to enclose a detailed
research plan, that is also welcome.

After this list is compiled, I will send it to each person, and also post it
on the electronic news bulletin.

Rajit Gadh
gadh@me.ri.cmu.edu   OR  gadh@andrew.cmu.edu

------------------------------

Subject: Goles and Vichniac
From:    KSEPYML%TUDRVA.TUDELFT.NL@CUNYVM.CUNY.EDU
Date:    Fri, 11 Aug 89 16:47:00 +0000 

Hi,

I am urgently in need of an article in which Goles and Vichniac showed that
the Marr-Poggio stereo algorithm can be cast in terms of an optimization
problem and in which they gave an expression for the objective function.
The reference to this article reads as follows:

E. Goles & G.Y. Vichniac, Proc. Neural Networks for Computing,
Snowbird UT, AIP Conf. Proc. 151, 165 (1986).

This article was referenced in an article by Arthur F. Gmitro and Gene R.
Gindi which I found in the IEEE proceedings of the Int. Conf. on
Neural Networks, 1987.  These authors work at the Department of Diagnostic
Radiology, Yale University, New Haven.

Obtaining this article through the usual channels would take me several
weeks, which is just too long in this case.

Is there anybody out there who can either send me a copy of this article or
give me the e-mail address of one of the persons mentioned above?

I would appreciate this very much!

Alexander G. van der Voort

Koninklijke Shell Exploratie en Produktie Laboratorium
Volmerlaan 6
2288 GD Rijswijk
The Netherlands

KSEPYML@HDETUD51.BITNET


------------------------------

Subject: Simulator software for PC or VAX???
From:    CLIFF%ATC@atc.bendix.com
Date:    Tue, 15 Aug 89 11:15:00 -0500 

Is anyone aware of public domain network simulators written in C
(particularly VAX or PC-based)?  We would prefer to avoid a major
duplication of effort.

Any responses will be summarized and posted.

Thanks in advance,

Pat Coleman  (pat@atc.bendix.com)

------------------------------

Subject: Scaling Performance Data Requested
From:    will@ida.org (Craig Will)
Date:    Thu, 17 Aug 89 18:52:52 -0400 


       Performance Data Requested on Network Scaling

For a review and technology assessment paper I am writing on
scaling issues, I would appreciate receiving pointers to
published or unpublished experimental data on the scaling
behavior of different neural network architectures for
different problems.  That is, performance such as training
time, probability of successful convergence, number of
examples required to learn, performance on training set, and
generalization ability, as a function of the complexity of
the problem (and scale of network required to solve it).

I am interested in data for back propagation and recurrent
back propagation as well as other paradigms, including
Kohonen networks, adaptive resonance theory networks,
Restricted Coulomb Energy networks, etc.  Thank you.

Craig A. Will
Institute for Defense Analyses

will@ida.org (milnet)

Craig Will
IDA - CSED
1801 N. Beauregard Street
Alexandria, VA  22311

(703) 845-3522


------------------------------

Subject: Alkon's SA article on NN: any papers?
From:    hoss@ethz.UUCP (Guido Hoss)
Organization: ETH Zuerich
Date:    Sat, 15 Jul 89 07:50:42 +0000 

July's Scientific American features an article by D.L. Alkon on neural
systems in nature and computing. He presents the outline of a computer
neural network "derived from biological systems" which seems to perform
better than "nonbiological" neural networks using conventional algorithms.
Can anyone point me to additional literature and papers detailing the
concepts and algorithms of his implementation?

Please reply by e-mail; I will post a summary to the net.

Thanks
 -Guido Hoss

------------------------------

Subject: Comments Requested : NNs in Stochastic Control
From:    rakesh@loria.crin.fr (Rakesh Nagi)
Organization: CRIN, Nancy, France
Date:    Sun, 20 Aug 89 12:13:16 +0000 

I am working on the topic of Optimal Control of Manufacturing/Production
Systems subject to disturbances (machine failures, etc.).  Having a pure
Mechanical Engineering background, I am fairly unfamiliar with the details
of Neural Nets.  However, the little I have read about NNs suggests their
applicability to Control of Continuous Processes (Chemical Plants, etc.).

I would greatly appreciate your comments on the applicability of NNs to
the domain of Discrete and Stochastic Control; specifically the field
of Optimal Control of Manufacturing Systems (Planning, Scheduling, and
real-time control). References to the indicated topic will also be
appreciated. Thanks in advance.

Rakesh Nagi.

e-mail : rakesh@loria.crin.fr
	 rakesh%loria.crin.fr@FRCICB62.bitnet
(until 29th August)
Permanent e-mail :
	nagi@ra.src.umd.edu (ARPA net)

------------------------------

Subject: Questions about delta rule
From:    "Ajay M. Patrikar" <killer!pollux!amp@AMES.ARC.NASA.GOV>
Organization: Department of Electrical Engineering; S.M.U.; Dallas, TX
Date:    23 Jun 89 06:44:10 +0000 

I was trying to solve different problems mentioned in the following
reference :

D.E. Rumelhart, G.E. Hinton, and R.J. Williams, "Learning Internal
Representations by Error Propagation," in Parallel Distributed Processing:
Explorations in the Microstructure of Cognition, vol. 1, MIT Press, 1986.

I used the backpropagation algorithm to solve problems such as exclusive OR,
the parity problem, etc.  In most cases I got different convergence rates from
the ones mentioned in the book.  Also, quite a few times the program ran into
the local minima problem.

This may be because of altogether different initial conditions.  I was
generating random numbers in the interval (-0.5, 0.5) and using them as
initial weights and thresholds.
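
In outline, the experiment is the following (sketched here in
Python/numpy; the learning rate and the stopping rule are arbitrary
choices of mine, not values from the book):

    import numpy as np

    rng = np.random.default_rng(1)

    # XOR training set
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    # initial weights and thresholds uniform in (-0.5, 0.5)
    W1 = rng.uniform(-0.5, 0.5, (2, 2)); b1 = rng.uniform(-0.5, 0.5, 2)
    W2 = rng.uniform(-0.5, 0.5, (2, 1)); b2 = rng.uniform(-0.5, 0.5, 1)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    eta = 0.5                                  # learning rate
    for epoch in range(20000):
        h = sigmoid(X @ W1 + b1)               # forward pass
        y = sigmoid(h @ W2 + b2)
        dy = (y - T) * y * (1 - y)             # generalized delta rule
        dh = (dy @ W2.T) * h * (1 - h)
        W2 -= eta * h.T @ dy; b2 -= eta * dy.sum(axis=0)
        W1 -= eta * X.T @ dh; b1 -= eta * dh.sum(axis=0)
        if np.max(np.abs(y - T)) < 0.1:        # one common criterion
            print("converged at epoch", epoch)
            break

Whether and how fast this converges varies with the random seed, which
is exactly what prompted the questions below.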

Has anyone on the net run into similar problems?  I would appreciate it if
someone could pass me information about

1. the dependence of convergence rate on initial conditions,
2. the general criterion for convergence, and
3. the performance of backpropagation on bigger problems (no. of pattern
   presentations).

Thanking you in advance.

Ajay Patrikar

uunet!dalsqnt!pollux!amp

------------------------------

Subject: Re: Questions about delta rule
From:    kolen-j@toto.cis.ohio-state.edu (john kolen)
Organization: Ohio State University Computer and Information Science
Date:    Fri, 23 Jun 89 14:48:34 +0000 

>1. the dependence of convergence rate on initial conditions,
>2. the general criterion for convergence, and
>3. the performance of backpropagation on bigger problems (no. of pattern
>   presentations).


We ran into these problems and several others (as most other
researchers have, but are reluctant to admit it).  Some answers to the
questions you pose appear in a recent Ohio State University Laboratory
for Artificial Intelligence Research technical report "Learning in
Parallel Distributed Processing Networks: Computational Complexity and
Information Content".  For ordering information contact

LAIR Technical Report Library
217 CAE
Dept. of Computer and Information Science
2036 Neil Avenue Mall
Columbus, Ohio  43210-1277



 -=-
John Kolen (kolen-j@cis.ohio-state.edu)|computer science - n. A field of study 
Computer & Info. Sci. Dept.	       |somewhere between numerology and
The Ohio State University	       |astrology, lacking the formalism of the
Columbus, Ohio	43210	(USA)	       |former and the popularity of the latter.

------------------------------

Subject: References for the Broom Balancing Problem
From:    plonski@primrose.aero.org (Mike Plonski)
Organization: The Aerospace Corporation
Date:    Mon, 14 Aug 89 21:49:47 +0000 

I am looking for references on the broom balancing (inverted pendulum)
problem.  Any help would be appreciated.
 -----------------------------------------------------------------------------
.   . .__.                             The opinions expressed herein are solely
|\./| !__!       Michael Plonski       those of the author and do not represent
|   | |         "plonski@aero.org"     those of The Aerospace Corporation.
_______________________________________________________________________________

------------------------------

Subject: Re: References for the Broom Balancing Problem
From:    Joseph Brady <brady@LOUIE.UDEL.EDU>
Organization: University of Delaware
Date:    15 Aug 89 12:09:39 +0000 

In article <55959@aerospace.AERO.ORG> plonski@primrose.aero.org 
(Mike Plonski) writes:
>the broom balancing (inverted pendulum) problem.  Any help ......

See the proceedings of this year's INNS/IEEE Neural Net Conference,
held in June.  There were two or three papers on this problem.

Joe Brady

------------------------------

Subject: Re: References for the Broom Balancing Problem
From:    David E Demers <beowulf!demers@SDCSVAX.UCSD.EDU>
Organization: EE/CS Dept. U.C. San Diego
Date:    16 Aug 89 19:19:25 +0000 

In article <55959@aerospace.AERO.ORG> plonski@primrose.aero.org 
(Mike Plonski) writes:
>the broom balancing (inverted pendulum) problem.  Any help ......

Barto, Sutton & Anderson, "Neuronlike adaptive elements that can
solve difficult learning control problems" IEEE Trans. on
Systems, Man & Cybernetics 13 p. 834 (1983).  This article is
reprinted in James Anderson's  marvelous collection of classic
papers, Neurocomputing (MIT Press, 1988).

An interesting paper on the credit assignment problem!
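
For anyone who has not seen the task: a pole is hinged to a cart on a
track, and the controller bumps the cart left or right to keep the pole
from falling over.  A quick sketch of the plant dynamics (the constants
are the ones usually quoted for the Barto et al. paper; the Euler step
and the bang-bang force magnitude are assumptions of mine):

    import math

    G, M_CART, M_POLE, L = 9.8, 1.0, 0.1, 0.5  # L = half the pole length
    DT = 0.02                                  # Euler integration step
    FORCE = 10.0                               # bang-bang push, in newtons

    def step(x, x_dot, th, th_dot, push_right):
        """Advance the cart-pole state by one Euler step."""
        f = FORCE if push_right else -FORCE
        total_m = M_CART + M_POLE
        tmp = (f + M_POLE * L * th_dot**2 * math.sin(th)) / total_m
        th_acc = (G * math.sin(th) - math.cos(th) * tmp) / (
            L * (4.0 / 3.0 - M_POLE * math.cos(th)**2 / total_m))
        x_acc = tmp - M_POLE * L * th_acc * math.cos(th) / total_m
        return (x + DT * x_dot, x_dot + DT * x_acc,
                th + DT * th_dot, th_dot + DT * th_acc)

What makes the learning problem hard is that the only training signal
is a failure signal when the pole falls or the cart runs off the track
-- hence the credit assignment problem mentioned above.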

Dave DeMers
demers@cs.ucsd.edu

------------------------------

Subject: COMPLEX SYSTEMS (1988)
From:    wli@uxh.cso.uiuc.edu
Date:    Sun, 16 Jul 89 09:19:00 +0000 



##########################################################################
The journal COMPLEX SYSTEMS is devoted to the rapid publication of research
on the science, mathematics, and engineering of systems with simple
components but complex overall behavior.
##########################################################################


	COMPLEX SYSTEMS  (VOLUME 2, 1988)


- ---------------------------------------------------------------------------
Vol 2, Number 1 (February 1988)
- ---------------------------------------------------------------------------

Klaus Sutner:	
	On sigma-Automata

Luciano R. da Silva, Hans J. Herrmann, Liacir S. Lucena:
	Simulations of Mixtures of Two Boolean Cellular Automata Rules

Gerald Tesauro, Bob Janssens:
	Scaling Relationships in Back-propagation Learning

Hwa A. Lim:
	Lattice Gas Automata of Fluid Dynamics for Unsteady Flow

Carsten Peterson, James Anderson:
	Neural Networks and NP-complete Optimization Problems;
	A Performance Study on the Graph Bisection Problem

Charles H. Goldberg:
	Parity Filter Automata


- ---------------------------------------------------------------------------
Vol 2, Number 2 (April 1988)
- ---------------------------------------------------------------------------

Michel Cosnard, Driss Moumida, Eric Goles, Thierry de St. Pierre:
	Dynamical Behavior of a Neural Automaton with Memory

Karel Culik II, Sheng Yu:
	Undecidability of CA Classification Schemes

Armin Haken, Michael Luby:
	Steepest Descent Can Take Exponential Time for Symmetric
	Connection Networks

Gerhard Grossing, Anton Zeilinger:
	Quantum Cellular Automata

Andre Barbe:
	Periodic Patterns in the Binary Difference Field

Carter Bays:
	Classification of Semitotalistic Cellular Automata
	in Three Dimensions

- ---------------------------------------------------------------------------
Vol 2, Number 3 (June 1988)
- ---------------------------------------------------------------------------

Carter Bays:
	A Note on the Discovery of a New Game of Three-dimensional Life

Hudong Chen, Shiyi Chen, Gary Doolen, Y.C. Lee:
	Simple Lattice Gas Models for Waves

Domenico Zambella, Peter Grassberger:
	Complexity of Forecasting in a Class of Simple Models

Steven Nowlan:
	Gain Variation in Recurrent Error Propagation Networks

D.S. Broomhead, David Lowe:
	Multivariable Functional Interpolation and Adaptive Networks

John Milnor:
	On the Entropy Geometry of Cellular Automata

	
- ---------------------------------------------------------------------------
Vol 2, Number 4 (August 1988)
- ---------------------------------------------------------------------------

Werner Krauth, Marc Mezard, Jean-Pierre Nadal:
	Basin of Attraction in a Perceptron-like Neural Network

Kristian Lindgren, Mats G. Nordahl:
	Complexity Measures and Cellular Automata

Jacek M. Kowalski, Ali Ansari, Paul S. Prueitt, Robert L. Dawes, Gunther Gross
	On Synchronization and Phase Locking in Strongly Coupled
	Systems of Planar Rotators

Ronald Rosenfeld, David S. Touretzky
	Coarse-Coded Symbol Memories and Their Properties

Avidan U. Neumann, Bernard Derrida, Gerard Weisbuch
	Domains and Distances in Magnetic Systems


- ---------------------------------------------------------------------------
Vol 2, Number 5 (October 1988)
- ---------------------------------------------------------------------------

Eric Goles, Andrew M. Odlyzko
	Decreasing Energy Functions and Lengths of Transients for Some
	Cellular Automata

James A. Reggia, Patric M. Marsland, Rita Sloan Berndt
	Competitive Dynamics in a Dual-route Connectionist Model of
	Print-to-sound Transformation

Lyman P. Hurd
	The Non-wandering Set of a CA Map

Tal Grossman, Ronny Meir, Eytan Domany
	Learning by Choice of Internal Representations

Berengere Dubrulle
	Method of Computation of the Reynolds Number for Two Models
	of Lattice Gas Involving Violation of Semi-detailed Balance

Gerhard Grossing, Anton Zeilinger
	Quantum Cellular Automata: A Corrigendum
	

- ---------------------------------------------------------------------------
Vol 2, Number 6 (December 1988)
- ---------------------------------------------------------------------------

Sara A. Solla, Esther Levin, Michael Fleisher
	Accelerated Learning in Layered Neural Networks

Jonathan Engel
	Teaching Feed-Forward Neural Networks by Simulated Annealing

Klaus Sutner
	Additive Automata on Graphs

David M. Chess
	Simulating the Evolution of Behavior: the Iterated Prisoners'
	Dilemma Problem

Frank J. Smieja, Gareth D. Richards
	Hard Learning the Easy Way: Backpropagation with Deformation

Stewart Wilson
	Bid Competition and Specificity Reconsidered


&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
	For more information on COMPLEX SYSTEMS, send mail to

Complex Systems Publications, Inc.
P.O.Box 6149				jcs@complex.ccsr.uiuc.edu (Arpanet)
Champaign, IL 61821-8149 USA		jcs%complex@uiucuxc.bitnet(Bitnet)
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&

------------------------------

Subject: COMPLEX SYSTEMS (Feb 1989)
From:    wli@uxh.cso.uiuc.edu
Date:    Sun, 16 Jul 89 09:19:00 +0000 



##########################################################################
The journal COMPLEX SYSTEMS is devoted to the rapid publication of research
on the science, mathematics, and engineering of systems with simple
components but complex overall behavior.
##########################################################################


	COMPLEX SYSTEMS  

- ---------------------------------------------------------------------------
Vol 3, Number 1 (February 1989)
- ---------------------------------------------------------------------------

Philippe Binder
	Abnormal Diffusion in Wind-tree Lattice Gases

Henrik Bohr, Soren Brunak
	A Traveling Salesman Approach to Protein Conformation

Stan Franklin, Max Garzon
	Global Dynamics in Neural Networks

Giorgio Mantica, Alan Sloan
	Chaotic Optimization and the Construction of Fractals:
	Solution of an Inverse Problem

Mats G. Nordahl
	Formal Languages and Finite Cellular Automata

Rudy Rucker
	Symbiotic Programming: Crossbreeding Cellular Automaton
	Rules on the CAM-6

Eduardo D. Sontag, Hector J. Sussmann
	Backpropagation Can Give Rise to Spurious Local Minima for
	Networks without Hidden Layers

Klaus Sutner
	A Note on the Culik-Yu Classes


&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
For more information on the journal COMPLEX SYSTEMS, send mail to
 
Complex Systems Publications, Inc.
P.O.Box 6149                            jcs@complex.ccsr.uiuc.edu (Arpanet) 
Champaign, IL 61821-8149 USA            jcs%complex@uiucuxc.bitnet(Bitnet)
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&

------------------------------

End of Neuron Digest
*********************