Conjugate direction methods with variable storage

Larry Nazareth*, Jorge Nocedal

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


In this paper we study conjugate gradient algorithms for large optimization problems. These methods accelerate (or precondition) the conjugate gradient method by means of quasi-Newton matrices, and are designed to use a variable amount of storage, depending on how much information is retained in the quasi-Newton matrices. We are concerned with the behavior of such methods on the underlying quadratic model, and in particular with their finite termination properties.
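The class of methods the abstract describes applies a quasi-Newton matrix as a preconditioner inside the conjugate gradient iteration. The sketch below is not the authors' algorithm; it is a minimal illustration, under the simplifying assumption of a *fixed* preconditioner, of the general idea: an L-BFGS-style two-loop recursion applies an implicit inverse-Hessian approximation (built from however many stored pairs the available storage allows) to precondition linear CG on the quadratic model. All function names and the choice of two-loop recursion are mine.

```python
import numpy as np

def apply_qn_preconditioner(q, pairs, gamma=1.0):
    # Two-loop recursion: applies an implicit inverse-Hessian
    # approximation H (built from stored (s, y) pairs) to the vector q.
    # The number of pairs kept is the "variable storage" parameter.
    alphas = []
    for s, y in reversed(pairs):
        rho = 1.0 / (y @ s)          # requires y @ s > 0 (curvature condition)
        a = rho * (s @ q)
        q = q - a * y
        alphas.append((a, rho, s, y))
    r = gamma * q                     # initial scaling H0 = gamma * I
    for a, rho, s, y in reversed(alphas):
        beta = rho * (y @ r)
        r = r + (a - beta) * s
    return r

def preconditioned_cg(A, b, x0, pairs, tol=1e-10, max_iter=None):
    # Preconditioned CG on the quadratic model 0.5 x'Ax - b'x.
    # With a fixed SPD preconditioner, it terminates in at most n
    # steps on the quadratic (in exact arithmetic).
    n = len(b)
    max_iter = n if max_iter is None else max_iter
    x = x0.copy()
    r = b - A @ x                     # residual = negative gradient
    z = apply_qn_preconditioner(r, pairs)
    p = z.copy()
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ z) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        z_new = apply_qn_preconditioner(r_new, pairs)
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return x
```

On a quadratic with Hessian A, pairs generated by the model itself satisfy y = A s, so the curvature condition y'.s = s'As > 0 holds automatically whenever A is positive definite.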

Original language: English (US)
Pages (from-to): 326-340
Number of pages: 15
Journal: Mathematical Programming
Issue number: 1
State: Published - Dec 1982


Keywords

  • Conjugate Gradient
  • Optimization
  • Quasi-Newton

ASJC Scopus subject areas

  • Software
  • General Mathematics

