[comp.ai.neural-nets] Parsimonious error metric

al@gtx.com (Alan Filipski) (09/29/88)

At the ICNN earlier this year in San Diego, Rumelhart gave a talk in
which he discussed the brilliant idea of incorporating a measure of the
size/complexity of a net into the error criterion being minimized.  The
back-prop procedure would thus tend to seek out smaller net
configurations as well as more accurate ones.
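
For anyone unfamiliar with the idea, here is a rough sketch of how such a
criterion might look: a standard sum-squared error plus a penalty on weight
complexity, with the penalty's gradient folded into the usual weight update.
The particular penalty term and the lambda coefficient below are my own
illustrative choices from the weight-decay family, not the formulas
Rumelhart presented (which is exactly what I am asking about).

    import numpy as np

    def penalized_error(outputs, targets, weights, lam=0.01):
        # Standard sum-squared error over the training outputs.
        sse = 0.5 * np.sum((outputs - targets) ** 2)
        # Complexity penalty: sum of w^2 / (1 + w^2) over all weights.
        # Small weights get pushed toward zero; large ones are barely
        # affected.  One common choice, not necessarily his.
        complexity = np.sum(weights ** 2 / (1.0 + weights ** 2))
        return sse + lam * complexity

    def penalty_gradient(weights, lam=0.01):
        # d/dw of lam * w^2/(1+w^2)  =  lam * 2w / (1+w^2)^2.
        # Back-prop would add this to the ordinary error gradient, so
        # the update becomes  w <- w - eta * (dE_sse/dw + this term).
        return lam * 2.0 * weights / (1.0 + weights ** 2) ** 2

The intended effect, as I understood the talk, is that weights contributing
little to reducing the output error are driven toward zero and can be
pruned, which is how the procedure ends up favoring smaller configurations.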

I thought this was the most memorable talk of the conference.
Unfortunately, I did not copy down his formulas for the updating rules
under this criterion.  I thought I could look them up later, but alas,
they do not seem to be in the proceedings.  Can anyone give a reference
to a paper that covers this technique and discusses the results of
his experiments?


  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 ( Alan Filipski, GTX Corp, 8836 N. 23rd Avenue, Phoenix, Arizona 85021, USA )
 ( {allegra,decvax,hplabs,amdahl,nsc}!sun!sunburn!gtx!al       (602)870-1696 )
  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~