On the limited memory BFGS method for large scale optimization

Dong C. Liu*, Jorge Nocedal

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

We study the numerical performance of a limited memory quasi-Newton method for large scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir (1985), which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir, and is better able to use additional storage to accelerate convergence. We show that the L-BFGS method can be greatly accelerated by means of a simple scaling. We then compare the L-BFGS method with the partitioned quasi-Newton method of Griewank and Toint (1982a). The results show that, for some problems, the partitioned quasi-Newton method is clearly superior to the L-BFGS method. However, we find that for other problems the L-BFGS method is very competitive due to its low iteration cost. We also study the convergence properties of the L-BFGS method, and prove global convergence on uniformly convex problems.
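The core of the method the abstract describes is the limited memory update: rather than storing a dense approximate inverse Hessian, L-BFGS keeps only the m most recent correction pairs (s_i, y_i) and reconstructs the search direction with the two-loop recursion. The "simple scaling" studied in the paper corresponds to scaling the initial Hessian approximation by γ_k = sᵀy / yᵀy. As a hedged sketch (not the authors' original implementation; the function name and structure here are illustrative assumptions), the direction computation might look like:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion sketch: compute d = -H_k @ grad from the m
    most recent correction pairs (s_i, y_i), oldest first in the lists.
    Illustrative helper, not the paper's original Fortran code."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair back to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q = q - alpha * y
    # Simple scaling of the initial Hessian approximation:
    # gamma_k = s^T y / y^T y, the scaling the paper shows accelerates L-BFGS
    if s_list:
        s, y = s_list[-1], y_list[-1]
        gamma = np.dot(s, y) / np.dot(y, y)
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair forward to newest
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r = r + (alpha - beta) * s
    return -r
```

With no stored pairs this reduces to steepest descent (d = -grad); each stored pair costs only O(n) work per iteration, which is the "low iteration cost" the abstract credits for L-BFGS's competitiveness.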

Original language: English (US)
Pages (from-to): 503-528
Number of pages: 26
Journal: Mathematical Programming, Series B
Volume: 45
Issue number: 3
State: Published - Dec 1 1989

ASJC Scopus subject areas

  • Software
  • General Mathematics
