mwang@watmath.UUCP (mwang) (02/23/84)
DEPARTMENT OF COMPUTER SCIENCE
UNIVERSITY OF WATERLOO
SEMINAR ACTIVITIES
NUMERICAL ANALYSIS SEMINAR - Friday, March 2, 1984.
Dr. M.L. Overton of the Courant Institute, New York University, will speak on ``Projected Hessian Updating Algorithms for Nonlinearly Constrained Optimization.''
TIME: 2:30 PM (Please Note)
ROOM: MC 6091A
ABSTRACT
Consider the nonlinearly constrained optimization problem:

    min f(x)   subject to   c_i(x) = 0,   i = 1,...,m,

where the minimization is over x in R^n and where f(x) and the {c_i(x)} are smooth functions.
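As a concrete toy instance of this problem class (the particular f, c, and the use of Python/NumPy below are illustrative assumptions, not taken from the talk), take n = 2 and m = 1:

    # Toy instance of  min f(x)  s.t.  c_i(x) = 0, i = 1,...,m,
    # with n = 2 variables and m = 1 constraint (illustrative only).
    import numpy as np

    def f(x):                 # smooth objective
        return x[0]**2 + 2.0 * x[1]**2

    def c(x):                 # constraint vector c(x) in R^m
        return np.array([x[0] + x[1] - 1.0])

    def lagrangian(x, lam):   # L(x, lam) = f(x) - lam . c(x); lam is a length-m
        return f(x) - lam @ c(x)   # multiplier vector (one common sign convention)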
We first discuss several Newton-type methods which have been proposed for solving this problem. We then consider quasi-Newton methods, which avoid forming Hessian matrices but update an appropriate approximating matrix. The standard method updates an approximation to the full Hessian of the Lagrangian function.
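A minimal sketch of such an update, assuming a plain BFGS formula applied to changes in the Lagrangian gradient (a common choice, not necessarily the exact update discussed in the talk):

    # Update an n x n approximation B to the Hessian of the Lagrangian.
    # s = x_new - x_old;  y = grad_x L(x_new, lam) - grad_x L(x_old, lam).
    import numpy as np

    def bfgs_update(B, s, y):
        Bs = B @ s
        if y @ s <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
            return B          # skip the update when the curvature condition fails
        return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)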
We propose a new Broyden-type method, suggested by the work of Goodman, which updates an n x (n-m) matrix approximating a ``one-sided projected Hessian'' of a Lagrangian function. The method is easily shown to have a local one-step Q-superlinear convergence property. We also give a new proof of the Boggs-Tolle-Wang necessary and sufficient condition for Q-superlinear convergence of a class of quasi-Newton methods for solving this problem. Finally, we describe an algorithm which updates an approximation to a ``two-sided projected Hessian,'' a symmetric matrix of order n-m which can be expected to be positive definite near a solution. This idea was suggested in 1978 by Murray and Wright, and a particular variant has recently been analysed by Coleman and Conn. We present several new variants of this algorithm and show that under certain conditions they all have a local two-step Q-superlinear convergence property, even though only one set of gradients is evaluated per iteration.
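To make the object concrete: the ``two-sided projected Hessian'' is Z'WZ, where W is the Hessian of the Lagrangian and the columns of Z span the null space of the m x n constraint Jacobian A(x). The sketch below forms it explicitly for the toy problem above, which is exactly the expense the quasi-Newton variants avoid; the QR-based choice of Z is an assumption made only for illustration.

    import numpy as np

    def null_space_basis(A):
        # Orthonormal n x (n-m) basis Z of the null space of A, taken from
        # the last n-m columns of a full QR factorization of A'.
        m, n = A.shape
        Q, _ = np.linalg.qr(A.T, mode='complete')
        return Q[:, m:]

    def two_sided_projected_hessian(W, A):
        Z = null_space_basis(A)
        return Z.T @ W @ Z    # symmetric, order n-m; positive definite near a solution

    # For the toy problem: A = [1 1] and W = diag(2, 4) (c is linear, so
    # W reduces to the Hessian of f).
    A = np.array([[1.0, 1.0]])
    W = np.diag([2.0, 4.0])
    print(two_sided_projected_hessian(W, A))   # -> [[3.]]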
February 23, 1984