[comp.ai.neural-nets] Summary on "What is Conjugate Gradient Method??"

akkh@mullian.ee.mu.OZ.AU (A. Hui) (09/12/90)

Hi,

Since I last posted the query, I have received a lot of responses, BUT
most of them were requests to forward whatever replies I got.  So, here they are:

Basically, the best reference is "Numerical Recipes"

Thanks very much to all those who responded!!!

Cheers!!
================================================================

From: Summeren van Peter <pjhamvs@cs.vu.nl>

Hello Hui,
here is an answer:
1. read Numerical Recipes (in C)
2. read :
	 M. J. D. Powell, "Restart Procedures for the Conjugate Gradient
	 Method," Mathematical Programming, Vol. 12, pp. 241-254,
	 April 1977.
3. get  : the program "opt".
	 It was on this net a month ago.
	 contact: vincew@cse.ogc.edu
		  opt-dist@cse.ogc.edu
		  opt-bugs@cse.ogc.edu
		 
  These are all the same addresses; I forgot the ftp address.

-----------------------------------------------------------------------
From: ins_atge@jhunix.HCF.JHU.EDU (Thomas G Edwards)

If we look at the history of neural networks since 1975, we see that the
lack of formal numerical-analysis experience among connectionist
researchers has been a continuing hindrance to the field.  But
enough of my philosophy.

Conjugate-Gradient is explained fairly well in _Numerical_Recipes_.  Yes,
it involves lots of math to implement.  Basically, it works much like a
steepest-descent function minimizer.  Steepest descent finds the negative
gradient of the function at a point and performs a line minimization in the
direction of that negative gradient.  Line minimizations, being along a single
direction, are much easier than multidimensional minimization.  Once we have
found the minimum along the negative-gradient direction, we jump to that
point, find the new gradient direction, perform another line minimization,
and iterate again.  Eventually, the function should be minimized.
Note that using the line minimization allows us to fly over vast plains
of near-zero gradient, where fixed step-size gradient methods would get
trapped, helplessly crawling along.
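
To make that loop concrete, here is a minimal sketch in C of steepest
descent with a crude line minimization, run on a made-up 2-D quadratic
bowl.  The test function, the halving step search, and all names are
illustrative only; this is not code from Numerical Recipes.

#include <stdio.h>
#include <math.h>

#define N 2

/* Test function: an elongated quadratic bowl, f(x) = x0^2 + 10*x1^2. */
static double f(const double x[N])
{
    return x[0]*x[0] + 10.0*x[1]*x[1];
}

/* Its gradient. */
static void gradient(const double x[N], double g[N])
{
    g[0] = 2.0*x[0];
    g[1] = 20.0*x[1];
}

/* Crude line minimization: try shrinking steps t along direction d and
   keep the best one.  A real code would bracket the minimum and refine
   it (e.g. with Brent's method, as Numerical Recipes does). */
static double line_minimize(const double x[N], const double d[N])
{
    double best_t = 0.0, best_f = f(x), t;
    int i;
    for (t = 4.0; t > 1e-10; t *= 0.5) {
        double trial[N];
        for (i = 0; i < N; i++) trial[i] = x[i] + t*d[i];
        if (f(trial) < best_f) { best_f = f(trial); best_t = t; }
    }
    return best_t;
}

int main(void)
{
    double x[N] = {3.0, 1.0}, g[N], d[N], t;
    int iter, i;

    for (iter = 0; iter < 100; iter++) {
        gradient(x, g);
        if (sqrt(g[0]*g[0] + g[1]*g[1]) < 1e-8) break;   /* converged */
        for (i = 0; i < N; i++) d[i] = -g[i];            /* steepest descent */
        t = line_minimize(x, d);                         /* jump to line minimum */
        for (i = 0; i < N; i++) x[i] += t*d[i];
    }
    printf("after %d iterations: x = (%g, %g), f = %g\n",
           iter, x[0], x[1], f(x));
    return 0;
}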

Conjugate gradient is an improvement on steepest descent.  Basically,
it chooses search directions that are mutually conjugate, so that minimizing
along a new direction does not spoil the minimizations already done along
previous ones.  The effect is difficult to picture, but _Numerical_Recipes_
shows how steepest descent can criss-cross along a valley, whereas
conjugate gradient avoids re-minimizing along past directions.
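
As a sketch of how small the structural change is: the only difference
from the steepest-descent loop above is how the next search direction is
built.  Instead of taking the new negative gradient directly, the
Fletcher-Reeves form mixes in the previous direction.  The helper below
assumes the N, vector layout and variable names of the previous sketch
(g = current gradient, g_old = previous gradient, d = previous direction)
and is illustrative only.  On the first iteration, and whenever the
method is restarted (see the Powell reference above), d is simply set
to -g.

static double dot(const double a[N], const double b[N])
{
    double s = 0.0;
    int i;
    for (i = 0; i < N; i++) s += a[i]*b[i];
    return s;
}

/* Fletcher-Reeves update: beta = (g.g)/(g_old.g_old);
   new direction d = -g + beta*d_old.  Forcing beta to 0 every
   iteration recovers plain steepest descent. */
static void conjugate_direction(double d[N], const double g[N],
                                const double g_old[N])
{
    double beta = dot(g, g) / dot(g_old, g_old);
    int i;
    for (i = 0; i < N; i++)
        d[i] = -g[i] + beta*d[i];
}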

Even if you just want to use simple steepest-descent, it is a vast
improvement over fixed step-size methods.  The hardest part is the
line minimization, though, and you will have to do a little bit of
looking to find an appropriate algorithm.
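
For the line minimization itself, one simple and widely known choice is
a golden-section search on a bracketing interval.  The sketch below
assumes the caller supplies phi(t) = f(x + t*d) and a bracket [a, b]
that contains the minimum; the bracketing-plus-Brent routines in
Numerical Recipes are faster, so take this only as an illustration of
the idea.

#include <math.h>

/* Golden-section search: repeatedly shrink [a, b] around the minimizer
   of phi, keeping the interval ratio constant.  Assumes phi is unimodal
   on [a, b].  (A production version would reuse one of the two interior
   evaluations each pass instead of recomputing both.) */
static double golden_line_search(double (*phi)(double),
                                 double a, double b, double tol)
{
    const double r = 0.6180339887498949;      /* (sqrt(5) - 1) / 2 */
    while (fabs(b - a) > tol) {
        double c = b - r*(b - a);
        double d = a + r*(b - a);
        if (phi(c) < phi(d))
            b = d;                            /* minimum lies in [a, d] */
        else
            a = c;                            /* minimum lies in [c, b] */
    }
    return 0.5*(a + b);                       /* best step length t */
}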

-Thomas Edwards
--------------------------------------------------------------------
From: suter@latcs1.oz (David Suter)

Try the Numerical Recipes book - not only does it give an educated layman's
overview of all the algorithms, it also gives code.
d.s.


--------------------------------------------------------------------
From: wwen@trlamct.trl.oz (Wilson Wen)

I believe that the best way to get the basic idea is to have a look
at the original paper by R. Fletcher & C. M. Reeves, entitled

Function minimization by conjugate gradients (Computer J. 1964,
7:149-154)

Although it is only a few pages in length, it tells everything about the
CG method and even includes a short Algol program for you to play
with.

I used it to solve Minimum Cross Entropy problems with linear constraints
and reported the results in one of my technical reports:

W.X. Wen, "Analytical and numerical methods for minimum cross entropy
problems", Tech. Rep. 88/26, Computer Science, The University of
Melbourne.

which also includes a short program in C to solve the CG problem.  If you
have any problem getting the report from the Computer Science Dept, just
email me and I'll send a copy to you.

Cheers.


Wilson Xun Wen

=====================================
Senior Scientist
Artificial Intelligence Systems
Telecom Research Laboratories
P. O. Box 279                   Phone No. :(613/03) 541-6273 (Office)
770 Blackburn Road                         (613/03) 872-3921 (Home)
Clayton, Victoria 3168          Fax No.   :(613/03) 543-8863
Australia                       Email addr:(ACSnet) w.wen@.trl.oz.au
===============================================================================
	Alvaro Hui		|ACSnet		akkh@mullian.oz
    4th Year B.E.\ B.Sc.	|Internet &	akkh@mullian.ee.mu.OZ.AU
   University of Melbourne	|Arpanet	rcoahk@koel.co.rmit.OZ.AU