Derivative-free optimization of noisy functions via quasi-Newton methods

Albert S. Berahas, Richard H. Byrd, Jorge Nocedal

Research output: Contribution to journal › Article › peer-review

12 Scopus citations


This paper presents a finite-difference quasi-Newton method for the minimization of noisy functions. The method takes advantage of the scalability and power of BFGS updating, and employs an adaptive procedure for choosing the differencing interval h based on the noise estimation techniques of Hamming [Introduction to Applied Numerical Analysis, Courier Corporation, North Chelmsford, MA, 2012] and Moré and Wild [SIAM J. Sci. Comput., 33 (2011), pp. 1292-1314]. This noise estimation procedure and the selection of h are inexpensive but not always accurate, so to prevent failures the algorithm incorporates a recovery mechanism that takes appropriate action when the line-search procedure is unable to produce an acceptable point. A novel convergence analysis is presented that accounts for the effect of noise on the line-search procedure. Numerical experiments comparing the method to a function-interpolating trust-region method are presented.
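The ingredients described above — estimating the noise level, choosing the differencing interval h, and relaxing the line-search test — can be sketched as follows. This is an illustrative sketch only, not the authors' code: the function names, the default constants, and the 2·eps_f relaxation in the Armijo test are assumptions of this sketch.

```python
import math

import numpy as np


def estimate_noise(fvals, k=4):
    """Estimate the noise level eps_f of a function from its values at
    equispaced points, using k-th order finite differences in the spirit
    of the More-Wild ECnoise procedure (simplified sketch, not the
    authors' implementation)."""
    d = np.diff(np.asarray(fvals, dtype=float), n=k)  # k-th differences
    # For smooth f plus i.i.d. noise, (k!)^2/(2k)! * E[(D^k f)^2] ~ eps_f^2,
    # since the smooth part is nearly annihilated by high-order differences.
    gamma = math.factorial(k) ** 2 / math.factorial(2 * k)
    return math.sqrt(gamma * float(np.mean(d ** 2)))


def fd_gradient(f, x, eps_f, mu2=1.0):
    """Forward-difference gradient with a differencing interval h chosen to
    balance truncation error (~ mu2*h/2) against noise error (~ 2*eps_f/h);
    minimizing their sum gives h = 2*sqrt(eps_f/mu2).  Here mu2 is an
    assumed user-supplied bound on the second derivative."""
    h = 2.0 * math.sqrt(eps_f / mu2)
    fx = f(x)
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g


def armijo_noisy(f, x, d, g, eps_f, c1=1e-4, alpha=1.0, tau=0.5, max_iter=30):
    """Backtracking line search whose Armijo condition is relaxed by the
    noise level, so that noise alone cannot make sufficient decrease
    unattainable.  The exact relaxation constant is an assumption here."""
    fx = f(x)
    gd = float(np.dot(g, d))
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c1 * alpha * gd + 2.0 * eps_f:
            return alpha
        alpha *= tau
    return None  # failure: this is where the paper's recovery mechanism acts
```

In a full method, these pieces would drive a standard BFGS iteration: compute g with `fd_gradient`, take a step along the quasi-Newton direction accepted by `armijo_noisy`, and update the BFGS approximation from the resulting (s, y) pair, re-estimating eps_f and h when the line search fails.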

Original language: English (US)
Pages (from-to): 965-993
Number of pages: 29
Journal: SIAM Journal on Optimization
Issue number: 2
State: Published - 2019


Keywords

  • Derivative-free optimization
  • Nonlinear optimization
  • Stochastic optimization

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science


