[net.arch] Computing with Neural Circuits:

hes@ecsvax.UUCP (Henry Schaffer) (08/05/86)

A paper, "Computing with Neural Circuits: A Model," by John J.
Hopfield and David W. Tank, appears in the 8 Aug. 1986 issue of
Science (pp. 625-633).

"A new conceptual framework and a minimization principle together
provide an understanding of computation in model neural circuits.
The circuits consist of nonlinear graded-response model neurons
organized into networks with effectively symmetric synaptic
connections.  The neurons represent an approximation to biological
neurons in which a simplified set of important computational properties
is retained.  Complex circuits solving problems similar to those
essential in biology can be analyzed and understood without the need
to follow the circuit dynamics in detail.  Implementation of the model
with electronic devices will provide a class of electronic circuits of
novel form and function."  (Abstract)
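
For anyone who wants to play with the idea, here is a rough numerical
sketch of the kind of graded-response network the abstract describes.
It is my own toy illustration, not code from the paper; the weights,
gain, and time constant are made up.

    # Toy graded-response network: effectively symmetric weights,
    # sigmoid neurons, simple Euler integration.  Constants arbitrary.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 8
    T = rng.normal(size=(n, n))
    T = (T + T.T) / 2            # effectively symmetric synaptic connections
    np.fill_diagonal(T, 0.0)
    I = rng.normal(size=n)       # external input currents
    u = np.zeros(n)              # internal "membrane" potentials
    dt, tau, gain = 0.01, 1.0, 5.0

    def g(u):                    # graded (analog) neuron response
        return np.tanh(gain * u)

    def energy(V):               # quantity the dynamics tend to minimize
        return -0.5 * V @ T @ V - I @ V

    for step in range(2000):
        V = g(u)
        u = u + dt * (-u / tau + T @ V + I)   # circuit dynamics

    print("final outputs:", np.round(g(u), 2))
    print("final energy :", round(float(energy(g(u))), 3))

With the connections symmetric, the energy value tends to fall as the
outputs settle, which is the sense in which the circuit "computes by
minimization" without anyone having to follow the dynamics in detail.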

brian@prism.UUCP (08/06/86)

Perhaps I'm being overly simplistic, but doesn't this neural network
stuff seem similar to ANALOG networks of non-linear devices?

Could such a network be modeled by constraint propagation over non-linear
functions and relationships?

Where is a good place to find out more about this stuff?

brian moran

----
brian  	{mit-eddie, ihnp4!inmet, wjh12, cca, datacube}!mirror!brian
Mirror Systems	2067 Massachusetts Avenue  Cambridge, MA, 02140
Telephone:	617-661-0777 extension 141
(((((((( * ))))))))
---

mangler@cit-vax.Caltech.Edu (System Mangler) (08/13/86)

In article <64300001@prism>, brian@prism.UUCP writes:
> Perhaps I'm being overly simplistic, but doesn't this neural network
> stuff seem similar to ANALOG networks of non-linear devices?

(Being even more simplistic)  As near as I can tell, neural network
circuits are made of threshold logic.  When a 'neuron' has more
excitatory inputs true than inhibitory inputs, it fires.  That's
threshold logic.  The stuff is described in all the old digital
switching theory books (such as Kohavi) but nobody seems to have
used it until Hopfield.  Doesn't look very analog to me.
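
Roughly what I mean, as a few lines of toy code (my own example, not
anything taken from Hopfield):

    # A threshold-logic "neuron": it fires when the count of true
    # excitatory inputs beats the count of true inhibitory inputs
    # by at least the threshold.  Purely illustrative.
    def fires(excitatory, inhibitory, threshold=1):
        return sum(excitatory) - sum(inhibitory) >= threshold

    print(fires([1, 1, 0], [1, 0, 0]))   # True:  2 - 1 >= 1
    print(fires([1, 0, 0], [1, 1, 0]))   # False: 1 - 2 <  1

All ones and zeros; nothing analog about it.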

Don Speck   speck@vlsi.caltech.edu  seismo!cit-vax!speck

lyang@sun.uucp (Larry Yang) (08/13/86)

In article <894@cit-vax.Caltech.Edu> mangler@cit-vax.Caltech.Edu (System Mangler) writes:
>In article <64300001@prism>, brian@prism.UUCP writes:
>> Perhaps I'm being overly simplistic, but doesn't this neural network
>> stuff seem similar to ANALOG networks of non-linear devices?
>
>(Being even more simplistic)  As near as I can tell, neural network
>circuits are made of threshold logic.  When a 'neuron' has more
>excitatory inputs true than inhibitory inputs, it fires.  That's
>threshold logic.  The stuff is described in all the old digital
>switching theory books (such as Kohavi) but nobody seems to have
>used it until Hopfield.  Doesn't look very analog to me.

[Being a little less simplistic]

A neuron receives neural input at its dendrites, the receiving end of
the nerve cell.  Although the incoming excitatory inputs are action potentials,
sort of on-off signals propagating in from the previous nerve's axon, the
signals in the dendrites are actually "graded" (i.e., analog).  It is roughly
safe to say that the faster the action potentials come in, the greater this
graded signal is.  (There are probably exceptions to this; there are exceptions
to every rule in biology.)  How this dendritic signal behaves is
modulated by the inhibitory inputs on the dendrites; these inputs also
tend to be analog; that is, the greater the inhibitory signal, the more
the dendritic signal is reduced.  The resulting signal moves through
the cell body and starts down the axon.  If the signal is above a certain
threshold, the axon fires an action potential (spike) and the signal goes
on down the axon to impinge on the next cell.  If the signal is too weak, it
just dies out (cell membranes are VERY resistive).  This threshold can also
be modified by many things, both chemical (e.g., drugs) and electrical.
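
A crude caricature of that graded-input, thresholded-output behavior, in
toy code (my own simplification, certainly not a biophysical model; the
constants are arbitrary):

    # Leaky integrate-and-fire caricature of the neuron described above:
    # analog drive accumulates, and a spike fires only when it crosses
    # threshold.
    def run(excit_rate, inhib_rate, steps=1000, dt=0.001,
            tau=0.02, threshold=1.0):
        v, spikes = 0.0, 0
        for _ in range(steps):
            drive = excit_rate - inhib_rate    # graded dendritic signal
            v += dt * (-v / tau + drive)       # leaky (resistive) integration
            if v >= threshold:                 # axon fires an action potential
                spikes += 1
                v = 0.0                        # reset after the spike
        return spikes

    print(run(excit_rate=80.0, inhib_rate=10.0))   # strong drive: many spikes
    print(run(excit_rate=20.0, inhib_rate=10.0))   # weak drive: no spikes at all

In the second call the signal never crosses threshold; it just leaks away,
as described above.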

In other parts of the nervous system, analog information is even more prevalent.
In the visual system, signals coming off of the retina are in the form
of graded potentials.  Information regarding the shapes and motion of objects is
encoded by different _rates_ of firing of action potentials in nerves, as
found by Hubel and Wiesel.  In the auditory system, volume is also
encoded by the rate at which nerves fire.

Thus, although the signals appear to be digital because of the on-off nature
of the action potentials, it is the amount of signal (in the form of both
more excitatory inputs and a higher rate of excitation) that is the
important factor.  Therefore, it is not proper to say that the nervous system
is either analog or digital; it is a happy mixture of both.
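
To make the "digital spikes carrying analog information" point concrete,
here is a toy encode/decode round trip (the numbers are invented):

    # Encode an intensity as a firing rate, then recover it by counting
    # spikes in a window -- rate coding in miniature.
    import random

    def encode(intensity, window=1.0, dt=0.001, max_rate=200.0):
        rate = max_rate * intensity        # intensity assumed in [0, 1]
        return [random.random() < rate * dt for _ in range(int(window / dt))]

    def decode(spikes, window=1.0, max_rate=200.0):
        return sum(spikes) / (window * max_rate)

    train = encode(0.6)
    print(round(decode(train), 2))         # prints something close to 0.6

Each individual spike is all-or-none, but the quantity being communicated
is the rate.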



--
-- Larry Yang
   Sun Microsystems, Inc.
   Mountain View, CA  94043

eugene@ames.UUCP (Eugene Miya) (08/14/86)

While I am a bit skeptical about neural nets, you should be aware that
Hopfield has a paper in the latest issue of Science:

%A John J. Hopfield
%A David W. Tank
%T Computing with Neural Circuits: A Model
%J Science
%V 233
%N 4764
%D 8 August 1986
%K Neural nets,
%X Low-level survey paper.

Fans of optical computing should also note a new survey paper in IEEE Spectrum.
[A bit disappointing, as it is less technical.]

From the Rock of Ages Home for Retired Hackers:
--eugene miya
  NASA Ames Research Center
  com'on do you trust Reply commands with all these different mailers?
  {hplabs,ihnp4,dual,hao,decwrl,tektronix,allegra}!ames!aurora!eugene
  eugene@ames-aurora.ARPA

jhv@houxu.UUCP (James Van Ornum) (08/14/86)

Not all neural computing research is based on threshold logic;
some of it is indeed analog - operational amplifiers with a large
number of inputs to the summing node, and a large number of
interconnected amplifiers.
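
As a first-order picture of one of those analog "neurons" (my own sketch
with made-up component values, not any particular lab's circuit), think of
an inverting summing amplifier whose output saturates at the supply rails:

    # Ideal op-amp summing "neuron": Vout = -Rf * sum(Vi / Ri), clipped
    # at the supply rails.  The saturation is the nonlinearity.
    # Component values are invented for illustration.
    def summing_neuron(inputs, input_resistances, r_feedback=100e3,
                       v_supply=15.0):
        v = -r_feedback * sum(vi / ri
                              for vi, ri in zip(inputs, input_resistances))
        return max(-v_supply, min(v_supply, v))

    # The 10k resistor weights its input 10x more heavily than the 100k ones.
    print(summing_neuron([0.5, -0.2, 0.1], [10e3, 100e3, 100e3]))   # about -4.9

The input resistances play the role of synaptic weights, and feeding the
outputs of many such amplifiers back into each other's inputs gives the
kind of interconnected network mentioned above.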

Mini-Micro Systems, August 1986, page 43, and
Business Week, June 2, 1986, page 92,
are two general articles on this subject.
-----------------------
	James Van Ornum, AT&T (Information Systems), houxu!jhv