Analysis of reduced Hessian methods for constrained optimization

Richard H. Byrd*, Jorge Nocedal

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

44 Scopus citations

Abstract

We study the convergence properties of reduced Hessian successive quadratic programming for equality constrained optimization. The method uses a backtracking line search, and updates an approximation to the reduced Hessian of the Lagrangian by means of the BFGS formula. Two merit functions are considered for the line search: the l1 function and the Fletcher exact penalty function. We give conditions under which local and superlinear convergence is obtained, and also prove a global convergence result. The analysis allows the initial reduced Hessian approximation to be any positive definite matrix, and does not assume that the iterates converge, or that the matrices are bounded. The effects of a second order correction step, a watchdog procedure and the choice of null space basis are considered. This work can be seen as an extension to reduced Hessian methods of the well-known results of Powell (1976) for unconstrained optimization.
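The abstract describes a reduced Hessian SQP iteration: split the step into a range-space component that restores feasibility and a null-space component computed from a BFGS approximation to the reduced Hessian, then accept the step via a backtracking line search on the l1 merit function. The following minimal Python sketch illustrates that structure on a toy equality-constrained quadratic; it is not the paper's implementation, and the toy problem, penalty parameter, and the simplified Armijo decrease model are illustrative assumptions.

```python
import numpy as np

# Toy equality-constrained problem (illustrative, not from the paper):
# minimize x1^2 + 2*x2^2  subject to  x1 + x2 - 1 = 0.
def f(x):      return x[0]**2 + 2.0 * x[1]**2
def grad_f(x): return np.array([2.0 * x[0], 4.0 * x[1]])
def c(x):      return np.array([x[0] + x[1] - 1.0])

A = np.array([[1.0, 1.0]])                  # constraint Jacobian (constant here)
Z = np.array([[1.0], [-1.0]]) / np.sqrt(2)  # null-space basis: A @ Z = 0
Y = A.T @ np.linalg.inv(A @ A.T)            # range-space basis: A @ Y = I

def merit(x, nu=10.0):
    """l1 merit function; nu must dominate the Lagrange multipliers."""
    return f(x) + nu * np.abs(c(x)).sum()

x = np.array([3.0, -1.0])
B = np.eye(1)        # reduced Hessian approximation: any SPD initial matrix
for _ in range(100):
    gz  = Z.T @ grad_f(x)          # reduced gradient
    p_y = Y @ (-c(x))              # range-space step: restores feasibility
    p_z = np.linalg.solve(B, -gz)  # null-space step: reduced quasi-Newton step
    p   = p_y + Z @ p_z

    # Backtracking line search on the l1 merit function; the decrease model
    # gz @ p_z = -gz' B^{-1} gz < 0 is a simplification of the full
    # directional derivative of the merit function.
    alpha, D = 1.0, float(gz @ p_z)
    while merit(x + alpha * p) > merit(x) + 1e-4 * alpha * D:
        alpha *= 0.5
        if alpha < 1e-12:
            break

    x_new = x + alpha * p
    # BFGS update of the reduced Hessian approximation, skipped unless
    # s'y > 0 so that B stays positive definite.
    s = Z.T @ (x_new - x)
    y = Z.T @ (grad_f(x_new) - grad_f(x))
    if float(s @ y) > 1e-12:
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / float(s @ Bs) + np.outer(y, y) / float(s @ y)
    x = x_new

print(np.round(x, 4))  # approaches the solution (2/3, 1/3)
```

Only the small reduced matrix B (here 1×1, in general (n-m)×(n-m)) is updated, which is the practical appeal of reduced Hessian methods when the number of degrees of freedom n-m is small.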

Original language: English (US)
Pages (from-to): 285-323
Number of pages: 39
Journal: Mathematical Programming, Series B
Volume: 49
Issue number: 3
State: Published - Jan 1991

ASJC Scopus subject areas

  • Software
  • General Mathematics
