[comp.ai.neural-nets] Back Propagation Network built fully with Analog Circuits

arun@vlsi.waterloo.edu (Arun Achyuthan) (02/12/91)

B. Furman and A. A. Abidi of the Integrated Circuits and Systems Laboratory
at the University of California, Los Angeles, report in the paper titled "An Analog
CMOS Backward Error-Propagation LSI" (Proceedings of the 22nd Asilomar
Conf. on Signals, Systems and Computers, pp. 645-648) a fully analog architecture
implementing the backpropagation algorithm, including on-chip learning.
All my simulations to date indicate that the weight change computed in
the backward path needs at least 16 bits of resolution and highly linear
processing of the data in order for the algorithm to converge. As far as I know,
both of these requirements are very difficult to meet with analog circuits,
due to their inherent properties.
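To make the question concrete, here is a minimal sketch (in Python with NumPy, purely for illustration) of one way to model limited weight-update resolution in software: a tiny sigmoid network trained with backprop on XOR, where the computed weight changes can optionally be rounded to a fixed bit depth before being applied. The function names, the uniform quantizer, and the assumed full-scale range are my own illustrative assumptions, not a description of the chip in the paper or of my actual simulation setup.

```python
import numpy as np

def quantize(x, bits, scale=1.0):
    # Round x to the nearest step of a uniform (bits)-bit quantizer
    # whose full-scale range is 'scale' (an assumed, illustrative model
    # of limited weight-update resolution).
    step = scale / (2 ** (bits - 1))
    return np.round(x / step) * step

def train_xor(bits=None, epochs=5000, lr=0.5, seed=0):
    # Train a 2-4-1 sigmoid network on XOR with plain batch backprop.
    # If 'bits' is given, every weight change in the backward path is
    # quantized before being applied; bits=None means full precision.
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)              # forward pass
        o = sig(h @ W2 + b2)
        d2 = (o - y) * o * (1 - o)        # backward pass (MSE loss)
        d1 = (d2 @ W2.T) * h * (1 - h)
        dW2, db2 = h.T @ d2, d2.sum(0)
        dW1, db1 = X.T @ d1, d1.sum(0)
        if bits is not None:              # limit weight-change resolution
            dW2, db2, dW1, db1 = (quantize(g, bits)
                                  for g in (dW2, db2, dW1, db1))
        W2 -= lr * dW2; b2 -= lr * db2
        W1 -= lr * dW1; b1 -= lr * db1
    return float(np.mean((o - y) ** 2))   # final mean-squared error
```

Comparing, say, train_xor(bits=None) against train_xor(bits=8) over several seeds is the kind of experiment that led me to the 16-bit figure.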

Could the authors of the above-mentioned paper, or anybody else in this
newsgroup, explain this apparent conflict?

Arun