[comp.sys.handhelds] matrix manipulations

YEE@rcgl1.eng.ohio-state.edu (Roger Yee) (11/10/90)

Hi there,

I just got a 48sx, and in a class we do a lot of matrix manipulations.  The
two I'm primarily interested in are eiganvalues and eiganvectors of a matrix.
I can't seem to find a book on a method for doing this, and I hate going to
campus just to do homework when my calculator can probably do it.

Can anyone help?  I know that Bill Wickes did something like this for the
28; has he done it for the 48sx?

Thanks in advance,

Roger Yee (YEE@RCGL1.ENG.OHIO-STATE.EDU)

jc@atcmp.nl (Jan Christiaan van Winkel) (11/11/90)

From article <6132@quanta.eng.ohio-state.edu>, by YEE@rcgl1.eng.ohio-state.edu (Roger Yee):
> Hi there,
> 
> I just got a 48sx, and in a class we do a lot of matrix manipulations.  The
> two I'm primarily interested in are eiganvalues and eiganvectors of a matrix.

Sorry for correcting you - the name is eigenvalue and eigenvector. The word
'eigen' stems from Dutch and means something like 'own' or 'self'. I am proud
that at least a few words from my not-so-widespread language (about 20M people
speak it) make it into international disciplines! :-)

Happy computing!
JC
-- 
___  __  ____________________________________________________________________
   |/  \   Jan Christiaan van Winkel      Tel: +31 80 566880  jc@atcmp.nl
   |       AT Computing   P.O. Box 1428   6501 BK Nijmegen    The Netherlands
__/ \__/ ____________________________________________________________________

jmorriso@ee.ubc.ca (John Paul Morrison) (11/13/90)

The eigenvalue/eigenvector problem is fairly easy.

Write this program:
	<< A I L * - DET >>
Put your matrix in A and an identity matrix of the same size in I, then
solve for L.  The program is just det(A - L*I), which is zero exactly when
L is an eigenvalue, so each root the solver finds is an eigenvalue.  It
helps to try to get a reasonable guess first.

This isn't exactly an EFFICIENT method, but it is EASY to do!
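
For example, here is one way it might be set up (only a sketch of my own --
the sample matrix, the guess of 3.5, and the use of the ROOT command rather
than the SOLVR menu are arbitrary choices):

	[[ 2 1 ][ 1 3 ]] 'A' STO           @ sample matrix, eigenvalues ~1.38 and ~3.62
	2 IDN 'I' STO                      @ matching 2x2 identity
	<< A I L * - DET >> 'L' 3.5 ROOT   @ should return the eigenvalue near 3.6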

Eigenvectors are more tricky:
you have to calculate A - L*I, where L is the eigenvalue in question.
You can't just put a zero vector on the stack and divide by A - L*I (that is,
solve (A - L*I)x = 0 with /); you will just get the trivial zero solution.

Instead, find INV(A - L*I).
You will get a weird-looking matrix with huge entries: the L you found is
only approximately an eigenvalue, so A - L*I is nearly singular and a
division by almost-zero was done somewhere.
But you can scale the matrix down, and you will have n linearly
dependent solutions.

The column vectors of that matrix are the eigenvectors.  With a bit of
common sense (i.e. fiddling) you should be able to see how many
INdependent eigenvectors there are.
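
For instance, continuing the 2x2 example (again only a sketch; it assumes
the eigenvalue found earlier is stored in L):

	A I L * - INV          @ nearly singular, so the entries are huge
	DUP RNRM /             @ scale by the row norm: each column is now
	                       @ (nearly) a multiple of the eigenvector for L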

To get the column vectors, try this short program I wrote; it is called A\->V:

%%HP: T(3)A(D)F(.);
@ A\->V: split a matrix into its column vectors.  Leaves the columns on
@ the stack (first column deepest) with the column count on level 1.
\<< TRN ARRY\-> LIST\-> DROP   @ transpose, explode into elements and dims
  DUP 1 - NEG \-> n m m1       @ n = columns, m = rows, m1 = 1-m
  \<< m n 1 - * 1 + n
    FOR i
      m \->ARRY i ROLLD        @ rebuild one column, roll it below the rest
    m1 STEP                    @ count down from m*(n-1)+1 to n in steps of 1-m
    n                          @ leave the column count
  \>>
\>>
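
For example, a 3 x 2 matrix should come apart like this:

	[[ 1 4 ][ 2 5 ][ 3 6 ]] A\->V
	@ should leave   3: [ 1 2 3 ]
	@                2: [ 4 5 6 ]
	@                1: 2          (the number of column vectors)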


And the reverse process, V\->A, takes n vectors and puts them back into a
matrix:

%%HP: T(3)A(D)F(.);
@ V\->A: rebuild a matrix from its column vectors.  Expects the n vectors
@ on the stack with the count n on level 1, as A\->V leaves them.
\<< OVER SIZE 1 GET            @ m = length of each vector
  DUP 1 - \-> n m m1           @ n = number of vectors, m1 = m-1
  \<< n m n 1 - * 1 +
    FOR i
      i ROLL ARRY\-> DROP      @ bring up the next vector, explode it
    m1 STEP                    @ count up from n to m*(n-1)+1 in steps of m-1
    { n m } \->ARRY TRN        @ reassemble n x m, transpose back to m x n
  \>>
\>>
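
And going back the other way, with the vectors and the count laid out
exactly as A\->V leaves them:

	[ 1 2 3 ] [ 4 5 6 ] 2 V\->A
	@ should rebuild [[ 1 4 ][ 2 5 ][ 3 6 ]]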


***Both these programs have trivial bugs which I have no intention of
fixing right now.  A\->V won't work for a 1 x n matrix, and V\->A won't
work for 1-dimensional vectors.  If you don't like it, don't do it.

Putting it all together:
calculate A I L * - INV A\->V

which leaves the n column vectors on the stack (plus the count on top).
Divide any vector by its absolute value (ABS) or its row norm (RNRM), and
then you have an eigenvector that is normalized or scaled to a useful value.
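
Concretely, for the 2x2 example from before (a sketch only; it assumes A and
I are stored as above and that A\->V is in the current path):

	<< A I L * - DET >> 'L' 3.5 ROOT   @ find an eigenvalue near the guess
	'L' STO                            @ store it for the next step
	A I L * - INV A\->V                @ columns of the near-singular inverse
	DROP                               @ drop the vector count
	DUP ABS /                          @ divide a column by its length:
	                                   @ a normalized eigenvector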