[sci.math] Polynomial Discriminant Method

tomh@proxftl.UUCP (Tom Holroyd) (11/30/88)

The Dec. '88 issue of Computer Language has an article on the PADALINE, a
polynomial ADALINE.  The only reference given is a Ph.D. dissertation
by Donald Specht (Stanford) written in the mid-60's.  Are there any
more recent references?

A brief description of the method:

Given two categories of patterns, A and B, construct a polynomial that yields
a positive number for a vector XA from category A and a negative number for a
vector XB from category B.  Compute the constants of the polynomial using the
general formula:

	c_{z_1, z_2, \ldots, z_p}
	    = \frac{1}{z_1! \, z_2! \cdots z_p! \; s^{2h}}
	      \left[ \frac{1}{m} \sum_{i=1}^{m} XA_{i1}^{z_1} \, XA_{i2}^{z_2} \cdots \mathrm{EXP}_{XA_i}
	           - \frac{K}{n} \sum_{i=1}^{n} XB_{i1}^{z_1} \, XB_{i2}^{z_2} \cdots \mathrm{EXP}_{XB_i} \right]

where z_1, ..., z_p are the subscripts for the constants, s^{2h} is a smoothing
factor, XA_{i1} is the 1st element of the ith vector from the set of m vectors
in category A, and XB_{i1} is the 1st element of the ith category B vector
(there are n of these).  K is computed from the ratio of the number of A
vectors to the number of B vectors.  EXP_X is exp(-L/(2s^2)), where L is the
squared length of X.
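
Here is a rough sketch in Python of how I read that formula (the names --
exp_term, coefficient, s, K -- are mine, and I'm assuming the smoothing
exponent h is the total degree z1 + ... + zp; this is not Specht's code):

import math

def exp_term(x, s):
    # EXP_X = exp(-L / (2*s^2)), where L is the squared length of X
    L = sum(v * v for v in x)
    return math.exp(-L / (2.0 * s * s))

def coefficient(z, A, B, s, K):
    # Constant c_{z1,...,zp} for multi-index z, given the m category-A
    # vectors A and the n category-B vectors B.
    m, n = len(A), len(B)
    h = sum(z)                      # assumption: h = z1 + ... + zp
    norm = math.prod(math.factorial(zi) for zi in z) * s ** (2 * h)
    def term(x):
        prod = exp_term(x, s)       # the EXP_X factor
        for xj, zj in zip(x, z):
            prod *= xj ** zj        # x1^z1 * x2^z2 * ...
        return prod
    sum_A = sum(term(x) for x in A) / m
    sum_B = sum(term(x) for x in B) / n
    return (sum_A - K * sum_B) / norm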

What you do is compute each constant using the above formula, and then
construct a polynomial which, when given a vector from either category,
will yield the correct categorization.
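
And a sketch of the whole procedure, reusing coefficient() from above:
enumerate every multi-index up to some total degree, compute its constant
once, and afterwards classify a new vector by the sign of the polynomial.
The degree cutoff max_degree and the reading K = m/n are my assumptions:

import math
from itertools import product

def train(A, B, s, max_degree):
    p = len(A[0])
    K = len(A) / len(B)   # one plausible reading of "ratio of the counts"
    coeffs = {}
    # brute-force enumeration of multi-indices; the term count grows fast with p
    for z in product(range(max_degree + 1), repeat=p):
        if sum(z) <= max_degree:
            coeffs[z] = coefficient(z, A, B, s, K)
    return coeffs

def classify(x, coeffs):
    value = sum(c * math.prod(xj ** zj for xj, zj in zip(x, z))
                for z, c in coeffs.items())
    return 'A' if value > 0 else 'B'

So training is one call to train(), and after that classify() is just a
polynomial evaluation.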

You need to have all your vectors pre-categorized, of course.
It looks nice since you only run the calculation once, and after that
the polynomial just works.  You don't have to wait for something to
converge.  Adding a new pattern vector means re-doing the whole calculation,
tho.

Anybody actually used this and care to comment?  How many terms do you
need, say, if you have 1000 10-dimensional vectors?  Can you throw away
the higher order terms without hurting the categorization too much?
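
(For scale, if I'm counting right: a full polynomial of total degree at most
r in p variables has C(p+r, r) terms, so with p = 10 that's C(12,2) = 66
terms through degree 2, C(13,3) = 286 through degree 3, and C(14,4) = 1001
through degree 4.)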

Tom Holroyd
UUCP: {uflorida,uunet}!novavax!proxftl!tomh

The white knight is talking backwards.