ADAPTIVE FINITE-DIFFERENCE INTERVAL ESTIMATION FOR NOISY DERIVATIVE-FREE OPTIMIZATION

Hao Jun Michael Shi, Yuchen Xie, Melody Qiming Xuan, Jorge Nocedal

Research output: Contribution to journal › Article › peer-review


Abstract

A common approach for minimizing a smooth nonlinear function is to employ finite-difference approximations to the gradient. While this is straightforward when the function evaluations are exact, choosing the optimal difference interval for a noisy function requires information about the noise level and higher-order derivatives of the function, which is often unavailable. Given the noise level of the function, we propose a bisection search for finding a finite-difference interval for any finite-difference scheme that balances the truncation error, which arises from the error in the Taylor series approximation, and the measurement error, which results from noise in the function evaluation. Our procedure produces reliable estimates of the finite-difference interval at low cost without explicitly approximating higher-order derivatives. We show its numerical reliability and accuracy on a set of test problems. When combined with limited memory BFGS, we obtain a robust method for minimizing noisy black-box functions, as illustrated on a subset of unconstrained CUTEst problems with synthetically added noise.
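The tradeoff the abstract describes is classical: for a forward difference with noise level eps_f and |f''| bounded by M, the total error behaves like eps_f/h + (M/2)h, minimized near h* = sqrt(2*eps_f/M) — but M is usually unknown. The sketch below illustrates the bisection idea in that spirit: it probes a noise-normalized second-order difference and enlarges h when noise dominates or shrinks h when truncation dominates. The test quantity, acceptance band `[r_low, r_high]`, and starting guess are illustrative assumptions, not the exact thresholds or test statistic of the paper's procedure.

```python
import math

def adaptive_fd_interval(f, x, eps_f, r_low=10.0, r_high=100.0, max_iter=50):
    """Bisection-style search for a forward-difference interval h (sketch).

    Probes the second-order difference f(x+2h) - 2 f(x+h) + f(x), whose
    magnitude is roughly |f''| h^2 when truncation dominates and at most
    about 4*eps_f when noise dominates.  A small noise-normalized ratio
    means h is too small (enlarge it); a large ratio means truncation
    dominates (shrink it).  The band [r_low, r_high] is an illustrative
    choice, not the paper's.
    """
    h = math.sqrt(eps_f)           # heuristic starting guess
    lo, hi = 0.0, math.inf         # bracket on h
    for _ in range(max_iter):
        r = abs(f(x + 2 * h) - 2 * f(x + h) + f(x)) / eps_f
        if r < r_low:              # difference drowned in noise: enlarge h
            lo = h
            h = 2 * h if math.isinf(hi) else math.sqrt(lo * hi)
        elif r > r_high:           # truncation error dominates: shrink h
            hi = h
            h = h / 2 if lo == 0.0 else math.sqrt(lo * hi)
        else:                      # the two error sources are balanced
            break
    return h

# Synthetic example: exp(x) plus bounded deterministic "noise" of size eps_f.
eps_f = 1e-6
f = lambda x: math.exp(x) + eps_f * math.sin(1e6 * x)
h = adaptive_fd_interval(f, 1.0, eps_f)
grad = (f(1.0 + h) - f(1.0)) / h   # forward-difference gradient estimate
```

On this example the chosen h lands orders of magnitude above machine-epsilon-style defaults, keeping the noise contribution 2*eps_f/h from swamping the derivative estimate.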

Original language: English (US)
Pages (from-to): A2302-A2321
Journal: SIAM Journal on Scientific Computing
Volume: 44
Issue number: 4
DOI: https://doi.org/10.1137/21M1452470
State: Published - 2022

Funding

Submitted to the journal's Methods and Algorithms for Scientific Computing section October 13, 2021; accepted for publication (in revised form) May 4, 2022; published electronically August 4, 2022. https://doi.org/10.1137/21M1452470

The work of the first and second authors was supported by the Office of Naval Research grant N00014-14-1-0313 P00003. The work of the third author was supported by National Science Foundation grant DMS-1620022. The work of the fourth author was supported by AFOSR grant FA95502110084 and by National Science Foundation grant DMS-1620022.

Keywords

  • derivative-free optimization
  • finite differences
  • noisy optimization
  • nonlinear optimization
  • zeroth-order optimization

ASJC Scopus subject areas

  • Computational Mathematics
  • Applied Mathematics
