[comp.ai.neural-nets] trainability of neural nets

jwang@cwsys2.cwru.Edu (11/18/88)

I am currently doing research on the theory and methodology of
artificial neural nets in a general setting from a systems point of
view.  My approach to the problem is through formalization,
categorization, and characterization.  I hope a complete theory and
methodology of neural systems as a means of deriving decision rules can
be developed based on in-depth analysis and synthesis.  I have obtained
some elementary results in this direction.
I am very interested in the trainability of neural nets.  The following
is my definition of trainability, taken from a working paper under
preparation.  It is a TeX file (with some modifications); I hope it is
readable.
 **********************************************************************
     Definition 3.10 (Trainability): Given an architecture, a
propagation rule, and a learning rule, an artificial neural net (ANN)
is trainable if and only if a definite set of parameters $w$ can be
obtained; precisely, an ANN is trainable iff

 \forall \epsilon > 0,\ \exists T > 0,\ \exists w(T) \in W
 \hbox{ such that } t \ge T \Rightarrow \| w(t+\delta t) - w(t) \| \le \epsilon

     An ANN is globally trainable if it is trainable under arbitrary
initial conditions.  An ANN is globally and absolutely trainable if it
is globally trainable at the optimum parameters with respect to a given
error function E(w), i.e.
 \min_{w\in W} E(w(t)) = \sum_{p=1}^P \| t^p - o^p(w(t)) \|_p = 0,
 \quad \hbox{or} \quad \lim_{t\to\infty} E(w(t)) = 0.
*************************************************************************
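
To make the criterion concrete, here is a small numerical sketch (in
Python with NumPy -- just one possible realization, not part of the
definition).  It trains a single linear unit by gradient descent on a
toy pattern set and reports the first iteration at which
|| w(t+\delta t) - w(t) || <= \epsilon holds, together with the
residual error E(w).  The choice of unit, learning rate, \epsilon and
patterns is my own and purely illustrative; for this convex toy problem
the step size shrinks monotonically, so the first such iteration can
serve as the T in the definition.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy pattern set: P = 8 input/target pairs for a single linear unit.
    X = rng.normal(size=(8, 3))            # inputs x^p
    w_star = np.array([1.0, -2.0, 0.5])    # used only to generate targets
    targets = X @ w_star                   # targets t^p

    def E(w):
        # Total error E(w) = sum_p || t^p - o^p(w) ||, with o^p(w) = w . x^p
        return np.sum(np.abs(targets - X @ w))

    eps = 1e-6       # the epsilon of Definition 3.10
    eta = 0.01       # learning rate of the (gradient-descent) learning rule
    w = np.zeros(3)  # initial condition; vary it to probe *global* trainability

    T = None
    for step in range(1, 10001):
        grad = -2.0 * X.T @ (targets - X @ w)   # gradient of sum_p (t^p - o^p)^2
        w_next = w - eta * grad
        # The quantity in the definition: || w(t + delta t) - w(t) ||
        if T is None and np.linalg.norm(w_next - w) <= eps:
            T = step
        w = w_next

    print("first step with ||w(t+dt) - w(t)|| <= eps:", T)
    print("residual error E(w) at the end:", E(w))

On this toy problem the residual error also goes to (numerically) zero,
so the net is absolutely trainable in the sense above; a net could
equally well satisfy the step-size criterion while E(w) stays bounded
away from zero, which is exactly the distinction the definition is
meant to capture.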

If anybody has comments or suggestions on this property of neural nets,
or knows of someone who has been working on it, please contact me via
e-mail or postal mail.  Thanks.
 
              Jun Wang
              Dept. of Systems Engg.
              Case Western Reserve Univ.
              Cleveland, Ohio  44106
              jwang@cwsys2.cwru.edu
              jwang@cwcais.cwru.edu