A NOISE-TOLERANT QUASI-NEWTON ALGORITHM FOR UNCONSTRAINED OPTIMIZATION

Hao Jun M. Shi, Yuchen Xie, Richard Byrd, Jorge Nocedal

Research output: Contribution to journal › Article › peer-review


Abstract

This paper describes an extension of the BFGS and L-BFGS methods for the minimization of a nonlinear function subject to errors. This work is motivated by applications that contain computational noise, employ low-precision arithmetic, or are subject to statistical noise. The classical BFGS and L-BFGS methods can fail in such circumstances because the updating procedure can be corrupted and the line search can behave erratically. The proposed method addresses these difficulties and ensures that the BFGS update is stable by employing a lengthening procedure that spaces out the points at which gradient differences are collected. A new line search, designed to tolerate errors, guarantees that the Armijo-Wolfe conditions are satisfied under most reasonable conditions, and works in conjunction with the lengthening procedure. The proposed methods are shown to enjoy convergence guarantees for strongly convex functions. Detailed implementations of the methods are presented, together with encouraging numerical results.
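The central ideas in the abstract — a curvature pair collected over a lengthened interval so noise cannot corrupt the BFGS update, and a line search whose sufficient-decrease test is relaxed by the noise level — can be illustrated with a minimal sketch. This is not the authors' implementation: the test problem, the lengthening threshold, the backtracking rule, and all parameter values below are illustrative assumptions, with noise injected only into the gradient and a known noise level `eps_g`.

```python
import numpy as np

# Illustrative strongly convex quadratic f(x) = 0.5 x^T A x;
# its gradient A x is observed with additive noise of level eps_g.
A = np.diag([1.0, 10.0])

def f(x):
    return 0.5 * x @ A @ x

def noisy_grad(x, eps_g, rng):
    return A @ x + eps_g * rng.uniform(-1.0, 1.0, size=x.shape)

def bfgs_lengthened(x0, eps_g, lengthening, max_iter=50, seed=0):
    """Sketch of a noise-tolerant BFGS iteration (illustrative, not the
    paper's algorithm): when the accepted step is shorter than a lengthening
    threshold, the pair (s, y) is differenced over a lengthened interval so
    that gradient noise cannot dominate the curvature estimate."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    H = np.eye(len(x))                      # inverse Hessian approximation
    g = noisy_grad(x, eps_g, rng)
    for _ in range(max_iter):
        d = -H @ g
        # Backtracking Armijo-style search, relaxed by the assumed noise level
        # so that noise in the decrease test cannot stall the search.
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d) + 2.0 * eps_g and alpha > 1e-8:
            alpha *= 0.5
        s = alpha * d
        x_new = x + s
        sn = np.linalg.norm(s)
        if 0.0 < sn < lengthening:
            # Lengthening: collect the gradient difference over an interval of
            # length `lengthening` along the step direction, not over s itself.
            s = (lengthening / sn) * s
            y = noisy_grad(x + s, eps_g, rng) - g
        else:
            y = noisy_grad(x_new, eps_g, rng) - g
        if s @ y > 1e-10:                   # curvature condition guards the update
            rho = 1.0 / (s @ y)
            I = np.eye(len(x))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x = x_new
        g = noisy_grad(x, eps_g, rng)
    return x
```

On this toy problem the iterates settle into a neighborhood of the solution whose size is governed by the noise level, which is the qualitative behavior the convergence guarantees in the paper describe for strongly convex functions.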

Original language: English (US)
Pages (from-to): 29-55
Number of pages: 27
Journal: SIAM Journal on Optimization
Volume: 32
Issue number: 1
DOIs
State: Published - 2022
Externally published: Yes

Keywords

  • derivative-free optimization
  • noisy optimization
  • nonlinear optimization
  • quasi-Newton method
  • stochastic optimization
  • unconstrained optimization

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science

