[comp.ai.neural-nets] Biological Reality and Backpropagation

adverb@bucsb.UUCP (Josh Krieger) (05/05/89)

Does anybody know of ANY biological evidence that supports the
existence of backprop? NETtalk produced a nice demo, but the
claim that its babbling is the same as the babbling of small
children apparently doesn't hold up. In other words, it sounds
like babbling to us, but children actually babble differently.
If there is any solid evidence I would like to find out about it.

-- Josh Krieger (adverb%bucsx.BU.EDU@bu-it.bu.edu)

andrew@berlioz (Lord Snooty @ The Giant Poisoned Electric Head) (05/05/89)

In article <2518@bucsb.UUCP>, adverb@bucsb.UUCP (Josh Krieger) writes:
> Does anybody know of ANY biological evidence that supports the
> existence of backprop..

On the contrary, Stephen Grossberg ("Neural Networks and Natural
Intelligence") takes the opposite view - that weight transport, which
BP requires, is a physically implausible mechanism. We had the BP vs.
ART discussion some months ago in this group; I received some
interesting responses.

The only "pro-bio-BP" argument I have heard concerns upper-level
symbolic processing, but it seems to me that Grossberg's argument
holds anywhere in the brain.
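To make the objection concrete, here is a minimal sketch (my own
illustration, not from Grossberg's book) of a two-layer backprop
step in NumPy. The names W1, W2, etc. are made up for the example;
the point to notice is marked in the comments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-layer net; all names here are illustrative.
W1 = rng.normal(size=(4, 3))   # input -> hidden weights
W2 = rng.normal(size=(3, 2))   # hidden -> output weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = rng.normal(size=4)         # input pattern
t = np.array([1.0, 0.0])       # target pattern

# Forward pass.
h = sigmoid(x @ W1)
y = sigmoid(h @ W2)

# Backward pass: the hidden-layer error term is computed with W2.T,
# i.e. the SAME weights used on the forward pass, now read in the
# reverse direction. A neuron in the hidden layer would need exact
# knowledge of its downstream synaptic weights - this reuse is the
# "weight transport" that the biological objection targets.
delta_y = (y - t) * y * (1 - y)
delta_h = (delta_y @ W2.T) * h * (1 - h)   # <-- W2 transported backward

# Weight updates (gradients).
dW2 = np.outer(h, delta_y)
dW1 = np.outer(x, delta_h)
```

Nothing in the forward circuit gives the hidden units access to W2,
so an implementation in tissue would have to copy those weights into
a separate backward pathway - hence "transport".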
-- 
Andrew Palfreyman 		USENET: ...{this biomass}!nsc!logic!andrew
National Semiconductor M/S D3969, 2900 Semiconductor Dr., PO Box 58090,
Santa Clara, CA 95052-8090 ; 408-721-4788 		there's many a slip
							'twixt cup and lip

neves@ai.cs.wisc.edu (David M. Neves) (05/08/89)

In article <180@bach.nsc.com> andrew@berlioz (Lord Snooty @ The Giant Poisoned Electric Head) writes:
>On the contrary, Stephen Grossberg ("Neural Networks and Natural
>Intelligence") takes the opposed view - that weight transport, required
>for BP, is a physically implausible mechanism.
What weight transport is required by back propagation?  I really see
no reason to transfer weights anywhere.  If he is talking about
computation, storage, and movement within the neuron, I didn't
realize we were so knowledgeable about neurons that we could already
rule out large classes of models.

;David Neves, Computer Sciences Department, University of Wisconsin-Madison
;Usenet:  {rutgers,ucbvax}!uwvax!neves
;Arpanet: neves@cs.wisc.edu

andrew@berlioz (Lord Snooty @ The Giant Poisoned Electric Head) (05/09/89)

In article <7487@spool.cs.wisc.edu>, neves@ai.cs.wisc.edu (David M. Neves) writes:
> In article <180@bach.nsc.com> andrew@berlioz (Lord Snooty @ The Giant Poisoned Electric Head) writes:
> >On the contrary, Stephen Grossberg ("Neural Networks and Natural
> >Intelligence") takes the opposed view - that weight transport, required
> >for BP, is a physically implausible mechanism.
> What weight transport is required by back propagation?  I really see
> no reason to transfer weights anywhere.  If he is talking about
> computation and storage and movement within the neuron I didn't
> realize we were so knowledgeable about neurons that we could already
> rule out large classes of models.

Well David, in that case I will quote without permission from the tome.
From section 17 of the chapter "Competitive Learning", p. 235 et seq.:

   "Comparing Adaptive Resonance and Back Propagation Models
   [..an enumeration of characteristics favourable to ART, unfavourable to BP..]
   C. Weight Transport or Top-Down Template Learning
   In both a BP model and an ART model, both bottom-up and top-down LTM traces
   exist. In a BP model (see Fig) the top-down traces in F4->F5 pathways are
   formal transports of the learned F2->F3 traces. In an ART model (see Fig)
   the top-down traces in F2->F1 pathways are directly learned by a realtime
   associative process. These top-down LTM weights are not transports of the
   learned LTM traces in the F1->F2 pathways, and they need not equal these
   bottom-up LTM traces. Thus an ART model is designed so that both bottom-up
   and top-down learning are part of a single information processing 
   hierarchy, which can be realised by a locally computable realtime process."

The ART Fig. is well-known, and the BP Fig. is as follows:

expected outputs	         --------------------  +
       | +         --------------| differentiator F6 |<------------
       \/          |             ---------------------        ----|-------
----------------- +|                                          | actual    |
| error signals |<-|                                          | outputs F3|
|	F4      |<--------------------------------------------|------------
--------------|-| -			learning signal		   /\
	|     |-------------------------------------------------->>|
	|		(weight transport)			   |
	|<---------------------------------------------------------|
	|							   |
	|			----------------- +		-----------
	|		--------| differentiator|<--------------| hidden   |
	|		|	|	F7	|		| units    |
	\/		|	-----------------		|    F2    |
 --------------- +	| 					------------
| error signals |<------      learning signal			   /\
|    F5		|------------------------------------------------>>|
-----------------						   |
								----------
								| inputs |
								|   F1   |
								----------
"Circuit diagram of BP model: in addition to the processing levels F1, F2,
F3, there are also levels F4, F5, F6 and F7 to carry out the computations
which control the learning process. The transport of learned weights from
the F2->F3 pathways to the F4->F5 pathways shows that this algorithm
cannot represent a learning process in the brain".
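For contrast with the figure above, here is a sketch (my own, not from
the book) of how an ART-style top-down template can be learned with a
locally computable rule. The variable names and the outstar-style
update are illustrative assumptions; the point is that z_top is never
copied from z_bottom.

```python
import numpy as np

rng = np.random.default_rng(1)

# Bottom-up (F1->F2) and top-down (F2->F1) LTM traces; these are
# separate parameters and need not be equal.
z_bottom = rng.random((5, 3))     # F1 -> F2 traces
z_top = rng.random((3, 5))        # F2 -> F1 traces (templates)

x = rng.random(5)                 # activity pattern at F1
j = int(np.argmax(z_bottom.T @ x))  # winning category node at F2

# Outstar-style update of the winner's top-down template: it uses
# only presynaptic F2 activity (the winner fired) and postsynaptic
# F1 activity (x), both available locally at the synapse.
lr = 0.5
z_top[j] += lr * (x - z_top[j])   # template moves toward the input

# No weight is transported from the F1->F2 pathway; compare the
# (weight transport) arrow in the BP circuit diagram.
```

Whether this counts as a fair comparison with BP is exactly what the
thread is arguing about, but it shows what "locally computable
realtime process" means in the quoted passage.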
-- 
Andrew Palfreyman 		USENET: ...{this biomass}!nsc!logic!andrew
National Semiconductor M/S D3969, 2900 Semiconductor Dr., PO Box 58090,
Santa Clara, CA 95052-8090 ; 408-721-4788 		there's many a slip
							'twixt cup and lip