Derivative-free optimization of noisy functions via quasi-Newton methods

Albert S. Berahas, Richard H. Byrd, Jorge Nocedal

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents a finite-difference quasi-Newton method for the minimization of noisy functions. The method takes advantage of the scalability and power of BFGS updating, and employs an adaptive procedure for choosing the differencing interval h based on the noise estimation techniques of Hamming [18] and Moré and Wild [34]. This noise estimation procedure and the selection of h are inexpensive but not always accurate; to prevent failures, the algorithm incorporates a recovery mechanism that takes appropriate action when the line search procedure is unable to produce an acceptable point. A novel convergence analysis is presented that accounts for the effect of noise on the line search procedure. Numerical experiments comparing the method to a function-interpolating trust-region method are presented.
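The abstract names the two main ingredients of the method: a noise-aware choice of the differencing interval h and BFGS updating. As a rough illustration only, the following minimal Python sketch shows these two pieces. It assumes the noise level eps_f and a curvature bound mu2 are already known (the paper instead estimates the noise with the Hamming / Moré–Wild procedure), and it omits the line search and the recovery mechanism entirely; the names fd_gradient and bfgs_update are hypothetical, not from the paper.

    import numpy as np

    def fd_gradient(f, x, eps_f, mu2=1.0):
        # Forward-difference gradient with a noise-aware interval.
        # eps_f: assumed-known estimate of the noise level in f.
        # mu2:   assumed rough bound on the second derivative.
        # h balances the truncation error (grows with h) against the
        # noise error (grows as eps_f / h).
        h = 8.0 ** 0.25 * np.sqrt(eps_f / mu2)
        g = np.zeros_like(x)
        fx = f(x)
        for i in range(len(x)):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - fx) / h
        return g

    def bfgs_update(H, s, y):
        # Standard BFGS update of the inverse Hessian approximation H,
        # given the step s = x_new - x_old and the (noisy) gradient
        # difference y = g_new - g_old.
        rho = 1.0 / (y @ s)
        I = np.eye(len(s))
        V = I - rho * np.outer(s, y)
        return V @ H @ V.T + rho * np.outer(s, s)

In an outer loop one would compute g = fd_gradient(...), take a step along -H @ g chosen by a (noise-tolerant) line search, and then refresh H with bfgs_update; this is a sketch under the stated assumptions, not the paper's algorithm.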

Original language: English (US)
Journal: Unknown Journal
State: Published - Mar 27, 2018

Keywords

  • Derivative-free optimization
  • Nonlinear optimization
  • Stochastic optimization

ASJC Scopus subject areas

  • General

