Analysis of a self-scaling quasi-Newton method

Jorge Nocedal*, Ya-xiang Yuan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

48 Scopus citations

Abstract

We study the self-scaling BFGS method of Oren and Luenberger (1974) for solving unconstrained optimization problems. For general convex functions, we prove that the method is globally convergent with inexact line searches. We also show that the directions generated by the self-scaling BFGS method approach Newton's direction asymptotically. This would ensure superlinear convergence if, in addition, the search directions were well-scaled, but we show that this is not always the case. We find that the method has a major drawback: to achieve superlinear convergence it may be necessary to evaluate the function twice per iteration, even very near the solution. An example is constructed to show that the step-sizes required to achieve a superlinear rate converge to 2 and 0.5 alternately.
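
The abstract does not reproduce the update formula, so the following is a minimal sketch of one self-scaling BFGS step in Python, assuming the Oren-Luenberger scaling factor tau_k = (s_k^T y_k) / (y_k^T H_k y_k) applied to the inverse Hessian approximation H_k before the standard BFGS update; the function name and the curvature-skip rule are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def self_scaling_bfgs_update(H, s, y):
    """One inverse-Hessian update of a self-scaling BFGS iteration.

    A minimal sketch assuming the Oren-Luenberger scaling factor
    tau = (s^T y) / (y^T H y) applied to H before the usual BFGS
    update; the skip rule below is an illustrative safeguard, not a
    detail taken from the paper.
    """
    Hy = H @ y
    sy = float(s @ y)        # curvature; positive under Wolfe line searches
    yHy = float(y @ Hy)
    if sy <= 0.0 or yHy <= 0.0:
        return H             # skip the update when curvature fails
    tau = sy / yHy           # self-scaling factor (tau = 1 gives plain BFGS)
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # Scale the previous approximation by tau, then apply the BFGS formula.
    return V @ (tau * H) @ V.T + rho * np.outer(s, s)


# Usage: the search direction at x_k would be d_k = -H_k @ grad_f(x_k),
# followed by an inexact (Wolfe) line search to obtain the step size.
```

Setting tau_k = 1 throughout recovers the standard (unscaled) BFGS update.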

Original language: English (US)
Pages (from-to): 19-37
Number of pages: 19
Journal: Mathematical Programming
Volume: 61
Issue number: 1-3
DOIs
State: Published - Aug 1993

Keywords

  • BFGS method
  • Self-scaling
  • Optimization
  • quasi-Newton method

ASJC Scopus subject areas

  • Software
  • General Mathematics
