Derivative-Free Optimization (DFO)
DFO solvers are aimed at optimizing black-box models and can handle either calibration (nonlinear least squares) problems (DFLS) or problems with a generic objective function (DFNO). A minimal usage sketch follows the list below.
- Calibration: DFLS (Derivative-Free Least Squares) [ handle_solve_dfls | e04fff | e04ffc ]
- DFNO (Derivative-Free Nonlinear Optimization) [ handle_solve_dfno | e04jdf | e04jdc ]
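To make the interfaces concrete, here is a minimal sketch of a calibration run through the NAG Library for Python (naginterfaces package). The toy data and model are invented for illustration, and the exact callback and result signatures of handle_solve_dfls can differ between Library releases; treat them as assumptions and check the e04ff documentation for your version.

```python
import numpy as np
from naginterfaces.library import opt

# Toy calibration data (illustrative only): fit y ~ t**b1 + b2.
t = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 4.2, 8.9, 16.1, 25.2])
nvar, nres = 2, len(t)

def objfun(x, nres):
    # Residuals r_i(x) = model(t_i; x) - y_i; no derivatives are supplied.
    # (Callback signature assumed -- check the e04ff docs for your release.)
    return t ** x[0] + x[1] - y

handle = opt.handle_init(nvar)                 # new NOMS problem handle (e04ra)
opt.handle_set_nlnls(handle, nres)             # declare a least-squares objective (e04rm)
opt.handle_opt_set(handle, 'Print Level = 1')  # optional solver settings (e04zm)
sol = opt.handle_solve_dfls(handle, objfun, [1.0, 1.0], nres)
print(sol.x)                                   # calibrated parameters (field name assumed)
opt.handle_free(handle)
```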
Optimizing complex numerical models is one of the most common problems in industry (finance, multi-physics simulation, engineering, etc.). Solving these problems with a standard optimization algorithm, such as Gauss–Newton for problems with a nonlinear least squares structure or CG for an unstructured nonlinear objective, requires good estimates of the model's derivatives. If exact derivatives are easy to compute, then derivative-based methods are preferable. However, writing the derivatives explicitly or applying automatic differentiation (AD) may be impossible when the model is a black box. The alternative, estimating derivatives via finite differences, can quickly become impractical or too computationally expensive, and the estimates are easily corrupted by noise in the model. Under these circumstances, an attractive choice is a model-based DFO solver, which does not require the user to provide any derivatives.
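The cost and noise-sensitivity of finite differences is easy to see in a few lines of plain NumPy (no NAG calls, just an illustration of the argument above): a forward-difference gradient costs n + 1 objective evaluations, and evaluation noise of size eps inflates to an error of order eps/h in every component.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_noisy(x, eps=1e-6):
    """Smooth objective plus simulated black-box evaluation noise."""
    return np.sum(x**2) + eps * rng.standard_normal()

def fd_gradient(f, x, h=1e-7):
    """Forward-difference gradient estimate: n + 1 evaluations of f."""
    f0 = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f0) / h
    return g

x = np.ones(5)
print(fd_gradient(f_noisy, x))  # exact gradient is [2, 2, 2, 2, 2], but the
                                # noise term of order eps/h ~ 10 swamps it
```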
NAG's model-based DFO solvers for DFLS and DFNO present a number of attractive features:
- Proven resilient to noise,
- The least-squares solver can start making progress with as few as two objective evaluations,
- Integrated into the NAG Optimization Modeling Suite (NOMS), with simple interfaces for the solvers and related routines,
- Optional reverse communication interface (see the sketch after this list).
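For readers unfamiliar with the last bullet, the toy below sketches the reverse-communication pattern in general, written as a Python generator performing a simple coordinate search. This is not the NAG interface (the Library provides dedicated reverse-communication variants of these solvers); it only illustrates why the pattern is useful: the caller keeps full control of every objective evaluation, so evaluations can be dispatched to a simulator, a separate process, or a pool of workers.

```python
import numpy as np

def rc_solver(x0, iters=40, h=0.5):
    """Toy reverse-communication optimizer (simple coordinate search).

    The generator *yields* each point it wants evaluated and is *resumed*
    with the objective value -- it never calls the objective itself.
    """
    x = np.asarray(x0, dtype=float).copy()
    fx = yield x                      # request f(x0) from the caller
    for _ in range(iters):
        for i in range(x.size):
            for step in (h, -h):      # try both directions on coordinate i
                trial = x.copy()
                trial[i] += step
                ft = yield trial      # hand control back to the caller
                if ft < fx:
                    x, fx = trial, ft
                    break
        h *= 0.8                      # shrink the step length
    return

def f(x):
    """Black-box objective: minimum at x = (1.2, 1.2)."""
    return float(np.sum((x - 1.2) ** 2))

# Driver loop: the caller owns every evaluation of f.
gen = rc_solver(np.zeros(2))
x, best = next(gen), (np.inf, None)
try:
    while True:
        fx = f(x)
        if fx < best[0]:
            best = (fx, x.copy())
        x = gen.send(fx)              # resume the solver with f(x)
except StopIteration:
    print('best point:', best[1], 'objective:', best[0])
```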
Figure 1. Animation showing two iterations of the model-based DFO algorithm in handle_solve_dfls.
A 2019 poster discussing NAG's DFO/DFLS functionality is available on the NAG website.
The Jupyter notebook showcases the optimization of noisy problems, where the objective function is not deterministic. The example discusses and illustrates the advantages of using a DFO solver instead of a derivative-based solver that relies on finite-difference gradient estimates.
- Blog post from The OptCorner: The price of derivatives - Derivative-free Optimization
- Examples [ Python example | C example | Fortran 90 example ]
- C. Cartis, J. Fiala, B. Marteau and L. Roberts (2019) Improving the flexibility and robustness of model-based derivative-free optimization solvers. ACM Transactions on Mathematical Software.
- C. Cartis and L. Roberts (2017) A derivative-free Gauss–Newton method. Mathematical Programming Computation.
- M. J. D. Powell (2009) The BOBYQA algorithm for bound constrained optimization without derivatives. Report DAMTP 2009/NA06, University of Cambridge.
- Instructions on how to install the NAG Library for Python
- Instructions on how to run the Jupyter notebooks in the repository