Global Convergence Properties of Conjugate Gradient Methods for Optimization

Jean Charles Gilbert, Jorge Nocedal

Research output: Contribution to journal › Article › peer-review


This paper explores the convergence of nonlinear conjugate gradient methods without restarts, and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions. Some properties of the Fletcher–Reeves method play an important role in the first family, whereas the second family shares an important property with the Polak–Ribière method. Numerical experiments are presented.
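To make the two families concrete, here is a minimal sketch (not the paper's exact algorithms) of a nonlinear conjugate gradient iteration showing the Fletcher–Reeves and Polak–Ribière update rules named in the abstract; the backtracking line search and all parameter values are illustrative assumptions standing in for the practical Wolfe-type line searches analyzed in the paper.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule="PR", tol=1e-8, max_iter=500):
    """Nonlinear conjugate gradient method without restarts (illustrative sketch).

    beta_rule selects the CG parameter update:
      "FR" -- Fletcher-Reeves:  beta = ||g_new||^2 / ||g||^2
      "PR" -- Polak-Ribiere:    beta = g_new.(g_new - g) / ||g||^2,
              truncated at zero (the PR+ variant).
    A simple backtracking (Armijo) line search is used here for brevity.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search along d.
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        if beta_rule == "FR":
            beta = g_new.dot(g_new) / g.dot(g)
        else:
            # Truncating beta at zero restarts the method with steepest
            # descent when beta^PR turns negative.
            beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

For example, on a strictly convex quadratic f(x) = ½xᵀAx − bᵀx with gradient Ax − b, both rules drive the gradient norm below the tolerance.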
Original language: English
Pages (from-to): 21-42
Journal: SIAM Journal on Optimization
State: Published - 1992
