[comp.ai.neural-nets] Request: Effects of Different Weight Updating Frequencies

jufier@daimi.aau.dk (Svend Jules Fjerdingstad) (12/05/90)

	In connection with parallel implementations of neural networks
it is often desirable to present several input patterns before a weight
update is performed. However, such infrequent weight updates can
apparently affect both the rate of convergence and the ability to
generalize.
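	To make the distinction concrete, here is a minimal sketch (my
own illustration, not taken from any of the articles mentioned below)
of the two update schedules for a single linear unit trained with
squared error: updating after every pattern versus accumulating the
gradients over a batch of patterns and applying one update, as a
parallel implementation presenting several patterns at once might.

```python
def grad(w, x, t):
    # gradient of 0.5 * (w*x - t)^2 with respect to w
    return (w * x - t) * x

def train_online(w, patterns, lr):
    # frequent updates: adjust w after every single pattern
    for x, t in patterns:
        w -= lr * grad(w, x, t)
    return w

def train_batch(w, patterns, lr):
    # infrequent updates: accumulate gradients over all patterns,
    # then apply one combined update
    g = sum(grad(w, x, t) for x, t in patterns)
    return w - lr * g

# toy data consistent with w = 2 (t = 2 * x)
patterns = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w_online = w_batch = 0.0
for _ in range(50):
    w_online = train_online(w_online, patterns, 0.05)
    w_batch = train_batch(w_batch, patterns, 0.05)
```

	On this consistent toy problem both schedules reach the same
solution; the question raised above is how the two schedules compare
on convergence speed and generalization for real feed-forward
networks, where the answer is less obvious.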

	If you know of any articles dealing with this subject, please
mail references. We are especially interested in hearing about results
concerning feed-forward networks.

	Apparently, the following article deals with the problem of
infrequent weight updates. Do you know how we might obtain a copy of
this article, or, alternatively, do you know the e-mail address of any
of the authors? Please mail responses.

- Haffner P., Waibel A., Sawai H., & Shikano K. (1988):
Fast back-propagation learning methods for neural networks in speech.
(Tech. Rep. No. TR-1-0058) Osaka, Japan: ATR Interpreting Telephony
Research Laboratories.

	Thanks a lot.

	Svend.

--
Svend Jules Fjerdingstad, jufier@daimi.aau.dk       |  "To love,
Computer Science Department, University of Aarhus   |     and to learn."
Ny Munkegade 116, DK-8000 Aarhus C, DENMARK         |