The conjugate gradient method is an iterative technique for function minimization. At each iteration, a new search direction is computed from the current gradient of the function and the previous search direction. Several formulas have been developed over the years for the scalar weighting factor beta applied to the previous search direction, all of which use the gradient of the function evaluated at the current iterate and the previous one. A generalization that uses the gradient evaluated at any point along the search direction is investigated. For quadratic function minimization, this new formula for beta produces a set of conjugate search directions, as required.
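To make the standard scheme concrete, the following is a minimal sketch of conjugate gradient minimization of a quadratic f(x) = ½xᵀAx − bᵀx, using the Fletcher-Reeves formula for beta, one of the classical choices the abstract alludes to. The paper's generalized beta, which evaluates the gradient at an arbitrary point along the search direction, is not reproduced here; the matrix `A` and vector `b` are illustrative values, not from the paper.

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=50):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A."""
    x = x0.astype(float)
    g = A @ x - b            # gradient of the quadratic at x
    d = -g                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # exact line search along d (closed form for a quadratic)
        alpha = (g @ g) / (d @ A @ d)
        x = x + alpha * d
        g_new = A @ x - b
        # Fletcher-Reeves weighting of the previous search direction
        beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d
        g = g_new
    return x

# Illustrative 2x2 problem: the minimizer satisfies A x = b
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b, np.zeros(2))
```

For an n-dimensional quadratic with exact line searches, the directions produced this way are A-conjugate, so the method terminates in at most n iterations in exact arithmetic; the paper's contribution is showing that conjugacy is preserved under its more general beta.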
Original language: English (US)
Number of pages: 2
Journal: Proceedings - Annual Allerton Conference on Communication, Control, and Computing
State: Published - Dec 1 1985