[comp.theory.cell-automata] cellular automata and neural networks

baker2@husc9.harvard.edu (James Baker) (04/12/90)

Has anyone constructed ``cellular automata'' that learn?

There seem to be some good reasons to explore this possibility:

1.  Arbitrary connections make analyzing neural networks difficult, if
not impossible.

2.  Cellular automata models are more readily simulated on hypercube
parallel architectures than conventional neural networks.

Since one would train these models, they would not be cellular automata
in the strict sense; for example, they might use some global reward
signal or noise, in addition to receiving input and target data.
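
To make the idea concrete, here is a rough sketch in C of the sort of
thing I mean: a one-dimensional binary CA whose 8-entry rule table is
mutated at random, keeping a mutation only when a global reward does
not decrease. The target pattern, reward function, and hill-climbing
scheme are all invented for illustration.

/* Sketch only: a 1-D binary CA (radius 1, 8-entry rule table) "trained"
   by a global reward signal.  Task and update scheme are hypothetical. */
#include <stdio.h>
#include <stdlib.h>

#define N     64        /* number of cells */
#define STEPS 32        /* CA steps per evaluation */

static void step(const int rule[8], const int *src, int *dst)
{
    int i;
    for (i = 0; i < N; i++) {
        int l = src[(i + N - 1) % N], c = src[i], r = src[(i + 1) % N];
        dst[i] = rule[(l << 2) | (c << 1) | r];
    }
}

/* Global reward: fraction of cells matching a fixed target pattern
   after STEPS steps from a random initial state. */
static double reward(const int rule[8], const int *target)
{
    int a[N], b[N], i, t, hits = 0;
    for (i = 0; i < N; i++) a[i] = rand() & 1;
    for (t = 0; t < STEPS; t++) {
        step(rule, a, b);
        for (i = 0; i < N; i++) a[i] = b[i];
    }
    for (i = 0; i < N; i++) hits += (a[i] == target[i]);
    return (double)hits / N;
}

int main(void)
{
    int rule[8] = {0}, target[N], i, trial;
    double best, r;
    for (i = 0; i < N; i++) target[i] = i & 1;   /* arbitrary target */
    best = reward(rule, target);
    for (trial = 0; trial < 1000; trial++) {
        int bit = rand() % 8;
        rule[bit] ^= 1;               /* flip one rule-table entry */
        r = reward(rule, target);
        if (r >= best) best = r;      /* keep the flip */
        else rule[bit] ^= 1;          /* otherwise undo it */
    }
    printf("best reward: %.3f\n", best);
    return 0;
}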

-- Jim

tenorio@EE.ECN.PURDUE.EDU (Manoel Fernando Tenorio) (04/13/90)

   From:  baker2@husc6.harvard.edu  (James Baker)
   Subject:  cellular automata and neural networks

   
   Has anyone constructed ``cellular automata'' that learn?

I think that before we can answer this question, it should be preceded by a
definition of what is and is not a CA. Certainly, learning has been done and
is possible on CA-like machines.

   There seem to be some good reasons to explore this possibility:

   1.  Arbitrary connections make analyzing neural networks difficult, if
   not impossible.

I don't understand why that is so. We have published an algorithm to do that
in the IEEE Transactions on Neural Networks. This is how the brain seems to
form certain subcircuits, especially the ones that are experience-based.

   2.  Cellular automata models are more readily simulated on hypercube
   parallel architectures than conventional neural networks.

I don't understand why that is so. When I think of CAs, I have images of
highly regular structures, but that regularity may not be necessary; then
again, the same can be said about NNs.

   Since one would train these models, they would not be cellular automata
   in the strict sense; for example, they might use some global reward
   signal or noise, in addition to receiving input and target data.

There have been a number of works on "local" learning rules for NNs that can
be applied here. You could take a look at CA-like machines in NIPS'87:
Nondeterministic Adaptive Logic Elements by Windecker.

   -- Jim

------- End of Forwarded Message

holsz@cadence.com (Wlodzimierz Holsztynski) (04/13/90)

In article <BAKER2.90Apr11140207@husc9.harvard.edu> baker2@husc9.harvard.edu (James Baker) writes:
>...	...	...
>
>2.  Cellular automata models are more readily simulated on hypercube
>parallel architectures than conventional neural networks.
>
>...	...	...
>-- Jim

Being  "Mr. GAPP"  let me inform you that my invention provides
you with a much more efficient way.  The GAPP chips are available
from  NCR (Fort Collins). They used the old 3 micron technology.
Nevertheless you get 72 cells (a 6 by 12 array) on one chip.
Each cell has its own  128  bits of memory (on the chip).
You might even get  GAPP boards  from Amber Eng. (Santa Barbara,
California).  NCR used to make some toy boards, possibly still does.

I hear that newer variations of my invention are produced (with
an 8 by 16 array) but I don't think they are commercially
available. 

GAPP  stands for  Geometric-Arithmetic Parallel Processor.
I made up this name to emphasize its two strengths.
It is used in military applications, but besides being practical,
it has a pure architecture.  Its instruction set forms a simple
language, perfect for theoretical research such as studies in
computational complexity.  (I got Conway's Life in 25 GAPP
instructions and wonder if anybody can lower that count.  About a
year after I wrote my program, NCR independently held a competition
of their own, but got only a 30-something result.)
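
For readers who do not know the rule, here is one Life generation in
plain C on a small torus, written for clarity rather than speed. This
is only the rule itself, not the GAPP program, which updates every
cell of the array simultaneously.

/* One generation of Conway's Life on a WIDTH x HEIGHT torus. */
#define WIDTH  16
#define HEIGHT 16

void life_step(const unsigned char cur[HEIGHT][WIDTH],
               unsigned char nxt[HEIGHT][WIDTH])
{
    int x, y, dx, dy;
    for (y = 0; y < HEIGHT; y++)
        for (x = 0; x < WIDTH; x++) {
            int n = 0;
            for (dy = -1; dy <= 1; dy++)
                for (dx = -1; dx <= 1; dx++)
                    if (dx != 0 || dy != 0)
                        n += cur[(y + dy + HEIGHT) % HEIGHT]
                                [(x + dx + WIDTH) % WIDTH];
            /* birth on exactly 3 live neighbors, survival on 2 or 3 */
            nxt[y][x] = (n == 3) || (n == 2 && cur[y][x]);
        }
}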

-- Wlodek

park@usceast.UUCP (Kihong Park) (04/15/90)

In article <BAKER2.90Apr11140207@husc9.harvard.edu> baker2@husc9.harvard.edu (James Baker) writes:
>Has anyone constructed ``cellular automata'' that learn?

You have to remember that certain types of neural networks can be suitably
represented as cellular automata. It helps if the NN is discrete and less
than fully connected, but these are not absolute requirements. You will need
one cell type which encodes the weights and another cell type which encodes
the neurons. It is not strictly necessary to view them as two cell types,
because you can always merge them into one --- a standard trick. If your
neurons are modeled with continuous transfer functions, this can be remedied
by incorporating a table of function values quantized to as fine a finite
resolution as one pleases.
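
A sketch of that quantization trick in C (the 256 levels and the
[-8, 8] range are arbitrary choices of mine, not anyone's published
parameters):

/* Replace a continuous sigmoid by a finite lookup table so that each
   cell's state stays discrete.  Resolution is an arbitrary choice. */
#include <math.h>

#define LEVELS 256
#define XMIN  (-8.0)
#define XMAX    8.0

static unsigned char table[LEVELS];

void build_table(void)
{
    int i;
    for (i = 0; i < LEVELS; i++) {
        double x = XMIN + (XMAX - XMIN) * i / (LEVELS - 1);
        table[i] = (unsigned char)(255.0 / (1.0 + exp(-x)));
    }
}

/* Quantized transfer function: real activation in, 8-bit state out. */
unsigned char transfer(double activation)
{
    int i = (int)((activation - XMIN) / (XMAX - XMIN) * (LEVELS - 1));
    if (i < 0) i = 0;
    if (i >= LEVELS) i = LEVELS - 1;
    return table[i];
}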

>There seem to be some good reasons to explore this possibility:
>
>1.  Arbitrary connections make analyzing neural networks difficult, if
>not impossible.

I don't quite understand your statement. Even though CAs are "simpler"
systems than NNs, it is very often the case that the latter are analytically
more tractable. That is why you have all this hype surrounding NNs: people
feel that they understand their neural nets to a certain degree, and it
makes them semi-confident designers and engineers.

>2.  Cellular automata models are more readily simulated on hypercube
>parallel architectures than conventional neural networks.

Since, as you point out, a CA's neighborhood is often quite limited, CAs
can be more easily simulated on parallel machines such as hypercubes. But
in many implementations of NNs on multi-processor machines, a mapping to a
cellular-automaton structure is in effect performed anyway. So even though
there are BIG differences between CAs and NNs, and they are studied in
different contexts, one always has to remember that they are close cousins.
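
To see that locality is what matters, consider this serial C sketch of
the block decomposition one would use (the block size and the rule are
made up): to update its own cells, each processor needs only one
boundary cell from each neighbor, which on a hypercube is a single
nearest-neighbor message per step.

/* Updating one processor's block of a 1-D CA.  Everything sits in one
   array here; on a real multi-processor the two edge reads would be
   messages exchanged with the neighboring processors. */
#define B 8            /* cells per (hypothetical) processor */
#define P 4            /* number of processors */
#define N (B * P)

void step_block(const int *cell, int *next, int p)
{
    int i;
    for (i = p * B; i < (p + 1) * B; i++) {
        int l = cell[(i + N - 1) % N];    /* neighbor's edge cell when i
                                             is the block's left edge */
        int r = cell[(i + 1) % N];        /* likewise on the right */
        next[i] = (l + cell[i] + r) & 1;  /* stand-in local rule */
    }
}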

>Since one would train these models, they would not be cellular automata
>in the strict sense; for example, they might use some global reward
>signal or noise, in addition to receiving input and target data.

There is a field called systolic arrays which deals with practical design
issues in implementing "easy"-to-parallelize algorithms in CA-like
architectures. Its terminology, such as "global broadcasting", may suggest
the terms you are looking for.
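
As a taste of the systolic style (the cell count and data layout here
are illustrative, not from any particular paper): a linear array
computing y = Wx, where the x values march through the cells one step
per tick while each cell accumulates its own row's sum.

/* Linear systolic array computing y = W x. */
#define CELLS 4

void systolic_matvec(const double W[CELLS][CELLS],
                     const double x[CELLS], double y[CELLS])
{
    double pipe[CELLS] = {0};  /* x value currently held by each cell */
    double acc[CELLS]  = {0};  /* partial sum in each cell */
    int t, c;

    /* 2*CELLS - 1 ticks let every x[j] visit every cell once. */
    for (t = 0; t < 2 * CELLS - 1; t++) {
        for (c = CELLS - 1; c > 0; c--)       /* shift the pipeline */
            pipe[c] = pipe[c - 1];
        pipe[0] = (t < CELLS) ? x[t] : 0.0;   /* feed in the next x */
        for (c = 0; c < CELLS; c++)
            if (t - c >= 0 && t - c < CELLS)  /* cell c holds x[t-c] */
                acc[c] += W[c][t - c] * pipe[c];
    }
    for (c = 0; c < CELLS; c++) y[c] = acc[c];
}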

Kihong Park. (park@cs.scarolina.edu)