Enriched methods for large-scale unconstrained optimization

José Luis Morales*, Jorge Nocedal

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

32 Scopus citations


This paper describes a class of optimization methods that interlace iterations of the limited memory BFGS method (L-BFGS) and a Hessian-free Newton method (HFN) in such a way that the information collected by one type of iteration improves the performance of the other. Curvature information about the objective function is stored in the form of a limited memory matrix, which plays the dual role of preconditioning the inner conjugate gradient iteration in the HFN method and of providing an initial matrix for the L-BFGS iterations. The lengths of the L-BFGS and HFN cycles are adjusted dynamically during the course of the optimization. Numerical experiments indicate that the new algorithms are effective and insensitive to the choice of parameters.
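The interlacing scheme described in the abstract can be illustrated on a convex quadratic, where Hessian-vector products and line searches are exact. The sketch below is only a minimal illustration of the idea, not the paper's algorithm: it assumes a quadratic objective f(x) = ½xᵀAx − bᵀx, uses exact line searches instead of Wolfe line searches, and fixes the cycle lengths rather than adjusting them dynamically. All function names (`enriched_minimize`, `two_loop`, `pcg`) and parameter choices are illustrative. The limited-memory pairs collected by both kinds of steps play the dual role the abstract describes: they define the L-BFGS search direction and precondition the inner CG iteration of the Newton (HFN) cycle.

```python
import numpy as np

def two_loop(v, pairs, gamma):
    """Apply the L-BFGS inverse-Hessian approximation H to a vector v
    (standard two-loop recursion; H0 = gamma * I)."""
    q = v.copy()
    alphas = []
    for s, y in reversed(pairs):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    r = gamma * q
    for (s, y), a in zip(pairs, reversed(alphas)):
        rho = 1.0 / (y @ s)
        r += (a - rho * (y @ r)) * s
    return r

def pcg(A, b, precond, tol=1e-10, maxit=200):
    """Preconditioned conjugate gradient for A x = b; the preconditioner is
    given as a function v -> M^{-1} v."""
    x = np.zeros_like(b)
    r = b.copy()
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

def enriched_minimize(A, b, x0, l_cycle=3, n_cycle=2, m=5, outer=10):
    """Interlace l_cycle L-BFGS steps with n_cycle Newton-CG (HFN) steps on
    the quadratic f(x) = 0.5 x^T A x - b^T x, sharing one limited-memory
    matrix between the two kinds of iteration (fixed cycle lengths here;
    the paper adjusts them dynamically)."""
    x = x0.copy()
    pairs = []      # stored (s, y) curvature pairs
    gamma = 1.0     # scaling of the initial matrix H0 = gamma * I

    def update(x, d, g):
        nonlocal pairs, gamma
        alpha = -(g @ d) / (d @ A @ d)   # exact line search on the quadratic
        s = alpha * d
        x_new = x + s
        y = (A @ x_new - b) - g
        if y @ s > 1e-12:                # keep only pairs with positive curvature
            pairs = (pairs + [(s, y)])[-m:]
            gamma = (s @ y) / (y @ y)
        return x_new

    for _ in range(outer):
        for _ in range(l_cycle):         # L-BFGS cycle
            g = A @ x - b
            if np.linalg.norm(g) < 1e-12:
                return x
            d = -two_loop(g, pairs, gamma)
            x = update(x, d, g)
        for _ in range(n_cycle):         # HFN cycle: CG preconditioned by the
            g = A @ x - b                # same limited-memory matrix
            if np.linalg.norm(g) < 1e-12:
                return x
            d = pcg(A, -g, lambda v: two_loop(v, pairs, gamma))
            x = update(x, d, g)
    return x
```

On the quadratic, the preconditioned CG solve returns the exact Newton step, so the HFN cycle converges rapidly; the interesting behavior in the paper arises for general nonlinear objectives, where the shared curvature pairs let each cycle warm-start the other.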

Original language: English (US)
Pages (from-to): 143-154
Number of pages: 12
Journal: Computational Optimization and Applications
Issue number: 2
State: Published - Feb 2002


Keywords

  • Conjugate gradient method
  • Hessian-free Newton method
  • L-BFGS
  • Limited memory method
  • Quasi-Newton preconditioning
  • Truncated Newton method

ASJC Scopus subject areas

  • Control and Optimization
  • Computational Mathematics
  • Applied Mathematics

