[comp.ai.neural-nets] Minimum accuracy for BP

bt@ap.co.umist.ac.uk (Blaise Thauvin) (03/01/91)

A few weeks ago, somebody wrote about analog VLSI implementations of back
propagation, saying that at least 16 bits of precision were needed for the
algorithm to perform well.

As a postgraduate student, my research project is based on the use of
non-floating-point variables in a neural network. I am therefore very
interested in any ideas or articles concerning the minimum accuracy needed to
implement the algorithm, how that minimum has been determined, and anything
else related to the subject.
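To make the question concrete, here is a minimal sketch (my own illustration,
in Python with NumPy, not taken from the article mentioned above) of one way
the minimum accuracy is often probed experimentally: re-quantize the weights
to b-bit fixed point after every update while training a small network, and
watch whether learning stalls as b shrinks. The network size, learning rate,
and quantization range are arbitrary choices for the sketch.

```python
import numpy as np

def quantize(x, bits, scale=4.0):
    # Snap x onto a signed fixed-point grid with `bits` total bits
    # covering [-scale, scale); the step is 2*scale / 2**bits.
    step = 2.0 * scale / (2 ** bits)
    return np.clip(np.round(x / step) * step, -scale, scale - step)

def train_xor(bits, epochs=5000, lr=0.5, seed=0):
    # Train a tiny 2-4-1 sigmoid network on XOR by plain batch
    # back propagation, keeping all weights quantized to `bits` bits.
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = quantize(rng.normal(0.0, 1.0, (2, 4)), bits)
    b1 = np.zeros((1, 4))
    W2 = quantize(rng.normal(0.0, 1.0, (4, 1)), bits)
    b2 = np.zeros((1, 1))
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)
        out = sig(h @ W2 + b2)
        d2 = (out - y) * out * (1 - out)          # output-layer delta
        d1 = (d2 @ W2.T) * h * (1 - h)            # hidden-layer delta
        # Re-quantizing after every update means any update smaller
        # than half a quantization step is simply lost -- the suspected
        # mechanism behind a minimum-precision requirement.
        W2 = quantize(W2 - lr * h.T @ d2, bits)
        b2 = quantize(b2 - lr * d2.sum(0, keepdims=True), bits)
        W1 = quantize(W1 - lr * X.T @ d1, bits)
        b1 = quantize(b1 - lr * d1.sum(0, keepdims=True), bits)
    return float(np.mean((out - y) ** 2))

# Compare generous precision with very coarse precision; at 4 bits the
# step (0.5 here) can swallow most updates and learning may stall.
print(train_xor(16), train_xor(4))
```

This only simulates fixed-point storage of the weights (the arithmetic itself
is still done in floating point), so it is an upper bound on what true
limited-precision hardware could do, but it is cheap to run for many values
of b.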

I'll mail a summary of the answers to anybody interested.

Thanks

Blaise Thauvin   E-mail bt@ap.co.umist.ac.uk
                 UMIST University of Manchester