[comp.ai.neural-nets] NNs for eigenspaces

chambers@watdcsu.waterloo.edu (Mike Chambers - Independent Studies) (09/29/89)

I recently posted a note asking for references on the use of NNs to solve
eigensystems; many thanks for the responses. For my purposes (finding the
eigenvalues/eigenvectors of a real symmetric matrix), these were a great help.
 
thanks again,
 
Mike Chambers
-------------------------- references follow ---------------------------------

                           

From: gary@desi.ucsd.edu (Gary Cottrell)

I have found that a linear encoder network
(e.g., 64x16x64; use a low learning rate of
about .01 with back prop) will converge on a
solution that spans the principal subspace,
even if it doesn't line up on the principal
components. See also Sanger's article in NIPS
88 (the book is 89), Baldi's article in the
same volume, and Linsker's for an alternative
(ed. Touretzky, Morgan Kaufmann).
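[A minimal NumPy sketch of the linear-encoder observation above. The network
shape (8x3x8), toy data, and iteration count are my own choices for a small
demo, not from the note; only the idea (linear autoencoder + backprop with a
low learning rate ~ .01 recovers the principal subspace, not necessarily the
principal components themselves) is from the note.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples in 8 dimensions, variance concentrated in a 3-D subspace.
n, d, k = 200, 8, 3
X = rng.normal(size=(n, k)) @ rng.normal(size=(k, d)) + 0.01 * rng.normal(size=(n, d))
X -= X.mean(axis=0)

# Linear encoder network (8x3x8, no nonlinearity), trained by plain
# back-propagation on squared reconstruction error, learning rate .01.
W1 = 0.1 * rng.normal(size=(d, k))   # encoder weights
W2 = 0.1 * rng.normal(size=(k, d))   # decoder weights
lr = 0.01
for epoch in range(3000):
    H = X @ W1                       # hidden code
    E = H @ W2 - X                   # reconstruction error
    gW2 = H.T @ E / n                # gradient w.r.t. decoder
    gW1 = X.T @ (E @ W2.T) / n       # gradient w.r.t. encoder
    W2 -= lr * gW2
    W1 -= lr * gW1

# The decoder rows need not equal the principal components, but they should
# span (approximately) the same k-dimensional principal subspace.
U, _, _ = np.linalg.svd(X.T @ X)
P = U[:, :k]                         # true principal subspace
Q, _ = np.linalg.qr(W2.T)            # orthonormal basis for the learned subspace
overlap = np.linalg.svd(P.T @ Q, compute_uv=False)
print(np.round(overlap, 3))          # cosines of principal angles; near 1 if subspaces agree
```

The singular values of P.T @ Q are the cosines of the principal angles
between the two subspaces, so values near 1 confirm the span matches even
when individual hidden units don't align with individual eigenvectors.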


                             ************* 


From: Mark Plumbley <mdp%digsys.engineering.cambridge.ac.uk@NSFNET-RELAY.AC.UK>

R Linsker, "Self-Organization in a Perceptual Network". IEEE Computer
21(3), 105-117, March 1988.

E Oja, "A Simplified Neuron Model as a Principal Component Analyser".
Journal of Mathematical Biology 15, 267-273, 1982.
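[For concreteness, here is a small NumPy sketch of the single-neuron rule
from the Oja (1982) reference: dw = eta * y * (x - y * w), which drives the
weight vector to the unit-norm leading eigenvector of the input covariance.
The covariance matrix, step size, and iteration count are illustrative
choices of mine.]

```python
import numpy as np

rng = np.random.default_rng(1)

# Inputs drawn with covariance C; we want its leading eigenvector.
C = np.array([[3.0, 1.0],
              [1.0, 2.0]])
L = np.linalg.cholesky(C)            # so L @ z has covariance C

w = rng.normal(size=2)
w /= np.linalg.norm(w)               # random unit start
eta = 0.005
for t in range(20000):
    x = L @ rng.normal(size=2)       # input sample
    y = w @ x                        # linear neuron output
    w += eta * y * (x - y * w)       # Oja's rule: Hebbian term minus decay

# Compare with the top eigenvector of C (eigh returns ascending eigenvalues).
evals, evecs = np.linalg.eigh(C)
v1 = evecs[:, -1]
print(abs(w @ v1))                   # near 1: w aligned with the principal component
```

The -y*y*w decay term is what keeps ||w|| near 1 without any explicit
normalization step, which is the point of the "simplified" model.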

E Oja and Juha Karhunen "On Stochastic Approximation of the Eigenvectors
and Eigenvalues of the Expectation of a Random Matrix".  Journal of
Mathematical Analysis and Applications 106, 69-84, 1985.
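[One simple way to realize the stochastic-approximation idea in the Oja and
Karhunen reference: observe a stream of random rank-one matrices A_t = x_t x_t^T
whose expectation is C, take a small stochastic power step with each sample, and
re-orthonormalize. This sketch is my own illustration, not the paper's exact
algorithm (the paper analyzes decreasing gain sequences; I use a fixed small
step for simplicity).]

```python
import numpy as np

rng = np.random.default_rng(2)

# Random matrices A_t = x_t x_t^T with expectation C; estimate the top-k
# eigenvectors of C from the stream alone.
C = np.diag([4.0, 2.0, 1.0, 0.5])
L = np.linalg.cholesky(C)
d, k = 4, 2
W = np.linalg.qr(rng.normal(size=(d, k)))[0]   # random orthonormal start

eta = 0.002
for t in range(30000):
    x = L @ rng.normal(size=d)
    W += eta * np.outer(x, x) @ W    # stochastic power step with sample A_t
    W, _ = np.linalg.qr(W)           # re-orthonormalize (subspace iteration)

# Columns of W should align with the top eigenvectors of C (here e1 and e2),
# so the top 2x2 block of |W| should be close to the identity.
print(np.round(np.abs(W[:2, :]), 2))
```

With a decaying step size the iterates converge with probability one; with a
fixed small step they hover near the eigenvectors with O(eta) fluctuations.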

M D Plumbley and F Fallside "An Information-Theoretic Approach to
Unsupervised Connectionist Models".  In Proceedings of the 1988
Connectionist Models Summer School (ed.  D.  Touretzky, G.  Hinton and T.
Sejnowski), 239-245, Morgan-Kaufmann, San Mateo, CA., 1988.


                          **************

From: Peter Foldiak <PF103%phoenix.cambridge.ac.uk@NSFNET-RELAY.AC.UK>

Peter Foldiak, "Adaptive network for optimal linear feature extraction",
Proceedings of the 1989 International Joint Conference on Neural Networks,
Vol 1, pp 401-405.