Abstract
The goal of this paper is to investigate an approach to derivative-free optimization that has not received sufficient attention in the literature, yet is one of the simplest to implement and parallelize. In its simplest form, it consists of employing derivative-based methods for unconstrained or constrained optimization and replacing the gradient of the objective (and constraints) with finite-difference approximations. This approach is applicable to problems with or without noise in the functions. The differencing interval is determined by a bound on the second (or third) derivative and by the noise level, which is assumed to be known or to be accessible through difference tables or sampling. The use of finite-difference gradient approximations has been largely dismissed in the derivative-free optimization literature as too expensive in terms of function evaluations or as impractical in the presence of noise. However, the test results presented in this paper suggest that it has much to recommend it. The experiments compare NEWUOA, DFO-LS and COBYLA against finite-difference versions of L-BFGS, LMDER and KNITRO on three classes of problems: general unconstrained problems, nonlinear least-squares problems, and nonlinear programs with inequality constraints.
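To make the approach concrete, the sketch below shows a forward-difference gradient with a noise-aware differencing interval. The interval h = 2·sqrt(ε_f / M), where ε_f is the noise level and M bounds the second derivative, is the classical choice that balances truncation error against noise error; it is used here only as an illustrative stand-in for the paper's exact rule, and all names and defaults are assumptions, not code from the paper.

```python
import numpy as np

def fd_gradient(f, x, noise_level=1e-8, second_deriv_bound=1.0):
    """Forward-difference gradient approximation of f at x.

    The interval h = 2*sqrt(noise_level / second_deriv_bound) roughly
    balances truncation error (~ M*h/2) against noise error (~ 2*eps_f/h).
    This is the textbook choice, offered here as an illustration; the
    paper's own interval selection may differ in detail.
    """
    h = 2.0 * np.sqrt(noise_level / second_deriv_bound)
    f0 = f(x)
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        # One extra function evaluation per coordinate: n+1 total.
        g[i] = (f(x + e) - f0) / h
    return g
```

Such a routine can be passed as the `jac` argument of a derivative-based solver (e.g. `scipy.optimize.minimize` with `method='L-BFGS-B'`), which is the spirit of the finite-difference L-BFGS variant tested in the paper.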
| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 289-311 |
| Number of pages | 23 |
| Journal | Optimization Methods and Software |
| Volume | 38 |
| Issue number | 2 |
| DOIs | |
| State | Published - 2023 |
Funding
The work of Hao-Jun Michael Shi was supported by the Office of Naval Research grant N00014-14-1-0313 P00003. The work of Melody Qiming Xuan was supported by the National Science Foundation grant DMS-1620022. The work of Jorge Nocedal was supported by the Office of Naval Research grant N00014-14-1-0313 P00003, and by National Science Foundation grant DMS-1620022. We are grateful to Richard Byrd, Oliver Zhuoran Liu and Yuchen Xie for their feedback on this work. We also thank Philip Gill, Arnold Neumaier, Michael Saunders, Katya Scheinberg, Luis Nunes Vicente and Stefan Wild for their correspondence, which helped guide the design of our experiments. We also acknowledge a number of valuable suggestions by the referees and associate editor.
Keywords
- Derivative-free optimization
- finite differences
- noisy optimization
- nonlinear optimization
- zeroth-order optimization
ASJC Scopus subject areas
- Software
- Control and Optimization
- Applied Mathematics