Abstract
We study the self-scaling BFGS method of Oren and Luenberger (1974) for solving unconstrained optimization problems. For general convex functions, we prove that the method is globally convergent with inexact line searches. We also show that the directions generated by the self-scaling BFGS method approach Newton's direction asymptotically. This would ensure superlinear convergence if, in addition, the search directions were well-scaled, but we show that this is not always the case. We find that the method has a major drawback: to achieve superlinear convergence it may be necessary to evaluate the function twice per iteration, even very near the solution. An example is constructed to show that the step-sizes required to achieve a superlinear rate converge to 2 and 0.5 alternately.
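The self-scaling BFGS method scales the inverse-Hessian approximation by the Oren-Luenberger factor τₖ = sₖᵀyₖ / (yₖᵀHₖyₖ) before applying the standard BFGS update. A minimal numerical sketch is below; the Armijo backtracking search, tolerances, and quadratic test problem are illustrative assumptions, not the inexact line search analyzed in the paper:

```python
import numpy as np

def self_scaling_bfgs(f, grad, x0, iters=100, tol=1e-10):
    """Minimize f with a self-scaling BFGS update (Oren-Luenberger scaling).

    Illustrative sketch: uses a simple Armijo backtracking line search,
    not the Wolfe-type inexact search studied in the paper.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                       # inverse-Hessian approximation
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                      # quasi-Newton search direction
        fx, gTp = f(x), g @ p
        alpha = 1.0
        for _ in range(30):             # bounded Armijo backtracking
            if f(x + alpha * p) <= fx + 1e-4 * alpha * gTp:
                break
            alpha *= 0.5
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                  # curvature condition holds
            tau = sy / (y @ (H @ y))    # Oren-Luenberger scaling factor
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            # scale H by tau, then apply the usual BFGS inverse update
            H = tau * (V @ H @ V.T) + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Usage on a strictly convex quadratic f(x) = x'Ax/2 - b'x (minimizer solves Ax = b)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = self_scaling_bfgs(lambda x: 0.5 * x @ A @ x - b @ x,
                           lambda x: A @ x - b,
                           x0=np.zeros(2))
```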
Field | Value
---|---
Original language | English (US)
Pages (from-to) | 19-37
Number of pages | 19
Journal | Mathematical Programming
Volume | 61
Issue number | 1-3
State | Published - Aug 1993
Keywords
- BFGS method
- Self-scaling
- Optimization
- Quasi-Newton method
ASJC Scopus subject areas
- Software
- General Mathematics