On the limited memory BFGS method for large scale optimization

Dong C. Liu*, Jorge Nocedal

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

4047 Scopus citations

Abstract

We study the numerical performance of a limited memory quasi-Newton method for large scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir (1985), which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir, and is better able to use additional storage to accelerate convergence. We show that the L-BFGS method can be greatly accelerated by means of a simple scaling. We then compare the L-BFGS method with the partitioned quasi-Newton method of Griewank and Toint (1982a). The results show that, for some problems, the partitioned quasi-Newton method is clearly superior to the L-BFGS method. However, we find that for other problems the L-BFGS method is very competitive due to its low iteration cost. We also study the convergence properties of the L-BFGS method, and prove global convergence on uniformly convex problems.

Original language: English (US)
Pages (from-to): 503-528
Number of pages: 26
Journal: Mathematical Programming
Volume: 45
Issue number: 1-3
State: Published - Aug 1989

Keywords

  • large scale nonlinear optimization
  • conjugate gradient method
  • limited memory methods
  • partitioned quasi-Newton method

ASJC Scopus subject areas

  • Software
  • Mathematics (all)
