[comp.ai.digest] AIList V5 #244 Neuromorphic Terminology

MINSKY@OZ.AI.MIT.EDU (10/24/87)

Terms like "neural networks" were in general use in the 1940's.  To
see their various forms I suggest looking through the Bulletin of
Mathematical Biophysics in those years.  For example, there is a 1943
paper by Landahl, McCulloch and Pitts called "A statistical
consequence of the logical calculus of Nervous Nets" and a 1945 paper
by McCulloch and Pitts called "A heterarchy of values determined by
the topology of Nervous Nets."  It is true that Papert and I confused
this with the title of another McCulloch and Pitts 1943 paper, which used
the term "nervous activity" instead.  Both papers were published
together in the same journal issue.  In any case, "neural networks"
and "nervous nets" were already in the current jargon.

In the original of my 1954 thesis, I called them "Neural-Analog
Networks," evidently being a little cautious.  But in the same year I
retitled it for publication (for University Microfilms) as "Neural
Nets and the Brain Model Problem".  My own copy has "Neural Networks
and the ..." printed on its cover.  My recollection is that we all
called them, simply, "neural nets".  A paper of Leo Verbeek has
"neuronal nets" in its title; a paper of Grey Walter used "Networks of
Neurons"; Ashby had a 1950 paper about "randomly assembled nerve
networks."  Farley and Clark wrote about "networks of neuron-like
elements".  S. C. Kleene's great 1956 paper on regular expressions was
entitled "Representation of events in Nerve Nets and Finite Automata".

Should we continue to use the term?  As Korzybski said, the map is not
the territory.  When a neurologist invents a theory of how brains learn,
and calls THAT a neural network, and complains that other theories are
and calls THAT a neural network, and complains that other theories are
not entitled to use that word, well, there is a problem.  For even a
"correct" theory would apply only to some certain type of neural
network.  Probably we shall eventually find that there are many
different kinds of biological neurons.  Some of them, no doubt, will
behave functionally very much like AND gates and OR gates; others will
behave like McCulloch-Pitts linear threshold units; yet others will
work very much like Rosenblatt's simplest perceptrons; others will
participate in various other forms of back-propagated reinforcement,
e.g., Hebb synapses; and so forth.  In any case we need a generic term
for all this.  One might prefer one like "connectionist network" that
does not appear to assert that we know the final truth about neurons.
But I don't see that as an emergency, and "connectionist" seems too
cumbersome.  (Incidentally, we used to call them "connectionistic" -
and that has condensed to "connectionist" for short.)
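
The linear threshold units mentioned above are easy to sketch.  With
unit weights, the threshold alone picks out which Boolean function a
two-input unit computes, which is what it means to say some neurons
"behave like AND gates and OR gates".  A minimal illustration (my own
sketch, not any published formulation):

```python
def threshold_unit(inputs, weights, threshold):
    """McCulloch-Pitts style unit: fire (1) iff the weighted
    sum of the binary inputs reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# With weights fixed at 1, threshold 2 gives AND, threshold 1 gives OR.
def AND(a, b):
    return threshold_unit([a, b], [1, 1], 2)

def OR(a, b):
    return threshold_unit([a, b], [1, 1], 1)

pairs = [(0, 0), (0, 1), (1, 0), (1, 1)]
print([AND(a, b) for a, b in pairs])  # [0, 0, 0, 1]
print([OR(a, b) for a, b in pairs])   # [0, 1, 1, 1]
```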