Global Convergence Properties of Conjugate Gradient Methods for Optimization

Jean Charles Gilbert, Jorge Nocedal

Research output: Contribution to journal › Article › peer-review

Abstract

This paper explores the convergence of nonlinear conjugate gradient methods without restarts, and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions. Some properties of the Fletcher–Reeves method play an important role in the first family, whereas the second family shares an important property with the Polak–Ribière method. Numerical experiments are presented.
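For readers who want the update rules the abstract refers to: nonlinear CG builds search directions d_k = -g_k + β_k d_{k-1}, where Fletcher–Reeves takes β_k = ‖g_k‖²/‖g_{k-1}‖² and Polak–Ribière takes β_k = g_kᵀ(g_k - g_{k-1})/‖g_{k-1}‖². The sketch below is a standard textbook formulation, not the paper's exact algorithm: the backtracking Armijo line search and the steepest-descent safeguard are simplifying assumptions, since the paper's analysis relies on stronger Wolfe-type line-search conditions. The truncation max{β_k, 0} of the Polak–Ribière formula is one of the globally convergent modifications the paper studies.

import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule="FR", tol=1e-8, max_iter=2000):
    """Nonlinear conjugate gradient without restarts.

    beta_rule="FR" uses the Fletcher-Reeves update; beta_rule="PR+" uses
    the Polak-Ribiere formula truncated at zero. The backtracking Armijo
    line search is illustrative only; the paper's convergence analysis
    assumes Wolfe-type line-search conditions.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (a simplification of the
        # "practical line searches" the abstract mentions).
        alpha, c, rho = 1.0, 1e-4, 0.5
        while alpha > 1e-16 and f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        if beta_rule == "FR":
            beta = g_new.dot(g_new) / g.dot(g)                # Fletcher-Reeves
        else:
            beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)  # PR+ truncation
        d = -g_new + beta * d
        # Safeguard needed because of the weak line search above: fall back
        # to steepest descent if d is not a descent direction. With a Wolfe
        # line search this fallback is not required by the paper's theory.
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from the usual start point.
rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
rosen_grad = lambda z: np.array([
    -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
    200 * (z[1] - z[0]**2),
])
print(nonlinear_cg(rosen, rosen_grad, np.array([-1.2, 1.0]), beta_rule="PR+"))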
Original language: English
Pages (from-to): 21–42
Journal: SIAM Journal on Optimization
Volume: 2
State: Published - 1992
