tgowrish@gmu90x.UUCP (T Gowrishankar) (06/28/89)
This posting is for a friend of mine, so please reply to the
e-mail address given below. Thank you.
I would like some information on the following numerical approximation algorithm.
Consider the correlational matrix memory by Kohonen:

    Input    Weight Matrix    Output
    (1,2)      w0   w1         (3,4)
    (5,6)      w2   w3         (7,8)
The weight matrix is a 2x2 matrix that has to map (1,2) to (3,4)
and (5,6) to (7,8). In this example it is possible to solve for
w0, w1, w2, w3 as a set of linear equations, and the solution is
w0 = -1, w1 = 2, w2 = -2, w3 = 3.
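As a sketch (not part of the original posting), the solution above can be verified with a few lines of Python; the helper `matvec` is my own name, not anything from Kohonen:

```python
# Check that W = [[-1, 2], [-2, 3]] maps (1,2) -> (3,4) and (5,6) -> (7,8).

def matvec(W, x):
    """Multiply a 2x2 matrix W by a 2-element vector x."""
    return [W[0][0] * x[0] + W[0][1] * x[1],
            W[1][0] * x[0] + W[1][1] * x[1]]

W = [[-1, 2],
     [-2, 3]]

print(matvec(W, [1, 2]))  # -> [3, 4]
print(matvec(W, [5, 6]))  # -> [7, 8]
```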
Thus the matrix holds a memory of weights that maps two 2-element
vectors onto two 2-element vectors. However, if the dimensions are
such that the number of variables is greater than the number of
equations, the weights cannot be found by solving a set of linear
equations, and a numerical algorithm must be used instead. Kohonen's
correlational matrix memory is one such numerical approximation
algorithm. I checked this algorithm and it happened to perform very poorly.
Does anyone know whether this algorithm works properly for determining
the weight matrix that maps input vectors to output vectors?
References:
Kohonen, T. Associative Memory, Springer, 1978.
The algorithm is as follows. Initialize W with zeros; fix a learning
parameter eta and a threshold value tau.

    repeat
        for each k from 1 to n do
            dW = eta * (y^k - W x^k) (x^k)^T
            W  = W + dW
    until || dW || < tau
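A minimal sketch of the iterative update above, in Python rather than anything from the original VAX run; the function name `train`, the learning rate eta = 0.01, the threshold tau = 1e-6, and the epoch cap are my assumptions, not values from Kohonen:

```python
def train(xs, ys, eta=0.01, tau=1e-6, max_epochs=100000):
    """Iterate dW = eta * (y^k - W x^k) (x^k)^T until ||dW|| < tau.

    xs, ys : lists of 2-element input/output vectors
    eta    : learning parameter
    tau    : convergence threshold on the norm of one epoch's updates
    """
    W = [[0.0, 0.0], [0.0, 0.0]]
    for _ in range(max_epochs):
        sq = 0.0  # accumulated squared update entries this epoch
        for x, y in zip(xs, ys):
            # error e = y^k - W x^k
            e = [y[i] - (W[i][0] * x[0] + W[i][1] * x[1]) for i in range(2)]
            # dW = eta * e (x^k)^T  (outer product), then W = W + dW
            for i in range(2):
                for j in range(2):
                    d = eta * e[i] * x[j]
                    W[i][j] += d
                    sq += d * d
        if sq ** 0.5 < tau:
            break
    return W

xs = [[1, 2], [5, 6]]
ys = [[3, 4], [7, 8]]
W = train(xs, ys)
```

On the 2x2 example above this does approach w0 = -1, w1 = 2, w2 = -2, w3 = 3, but only after several thousand sweeps through the data, which seems consistent with the slow convergence complained about below. Note also that a larger eta (e.g. 0.05 here, where eta * ||x||^2 exceeds 2 for the input (5,6)) makes the iteration diverge outright.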
Thus Kohonen says that one can obtain a weight matrix that acts as
an associative memory. I ran this on a VAX 8530 running UNIX, and
it is either very approximate or takes practically infinite time
to converge when the threshold tau is set very small.
If anyone happens to know about this algorithm, please send me mail.
My e-mail address is pmurali@gmuvax.gmu.edu
or pmurali@gmuvax2.gmu.edu
My phone no is (703) 591-7515 (H)
(301) 286-4073 (W)
Thanks,
Murali