Derivative-free methods do not compute the gradient of a function and so are often used to minimize problems with nondifferentiable objective functions. Some derivative-free methods approximate the gradient using finite differences; another class constructs a linear or quadratic model of the objective function and uses a trust-region approach to find the next iterate. Another widely used derivative-free method is the Nelder-Mead simplex method. [Nocedal]
To select all minimizers in fitbenchmarking that use derivative-free methods, use the algorithm type deriv_free.
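For example, this could be set in a fitbenchmarking options file as follows (the section name is an assumption here; only the algorithm type deriv_free is taken from the text above, so check the options documentation for your installation):

```ini
# Hypothetical options-file fragment: select only derivative-free minimizers.
# The [FITTING] section name is assumed; algorithm_type is described above.
[FITTING]
algorithm_type: deriv_free
```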
Nelder-Mead is a simplex-based algorithm, where a simplex $S$ in $\mathbb{R}^n$ is defined by $n+1$ vertices $x_1, \dots, x_{n+1}$.
In each iteration, the idea is to remove the vertex with the worst function value and replace it with a point with a better value. An iteration consists of the following steps:
- Order the vertices of $S$ so that $f(x_1) \leq f(x_2) \leq \dots \leq f(x_{n+1})$
- Calculate the centroid $\bar{x}$ of the best $n$ points:
  $\bar{x} = \frac{1}{n} \sum_{i=1}^n x_i$
- Carry out a transformation to compute the new simplex. First try to replace only the worst vertex $x_{n+1}$ with a better point, which then becomes the new vertex. If this fails, the simplex is shrunk towards the best vertex $x_1$ and $n$ new vertices are computed. The algorithm terminates when the simplex $S$ is sufficiently small. [Singer]
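The iteration above can be sketched in Python. This is a minimal illustration using only the reflection and shrink transformations (a full Nelder-Mead implementation also tries expansion and contraction steps), not fitbenchmarking's own implementation:

```python
import numpy as np

def nelder_mead(f, simplex, tol=1e-8, max_iter=1000):
    """Minimal Nelder-Mead sketch: reflect the worst vertex through the
    centroid of the rest; if that fails, shrink towards the best vertex."""
    simplex = np.asarray(simplex, dtype=float)  # (n+1) x n array of vertices
    for _ in range(max_iter):
        # Order vertices so that f(x1) <= f(x2) <= ... <= f(x_{n+1})
        simplex = simplex[np.argsort([f(x) for x in simplex])]
        best, worst = simplex[0], simplex[-1]
        # Centroid of the best n points
        centroid = simplex[:-1].mean(axis=0)
        # Try to replace the worst vertex by reflecting it through the centroid
        reflected = 2.0 * centroid - worst
        if f(reflected) < f(worst):
            simplex[-1] = reflected
        else:
            # Reflection failed: shrink all vertices halfway towards x1
            simplex = best + 0.5 * (simplex - best)
        # Terminate when the simplex is sufficiently small
        if np.max(np.abs(simplex - simplex[0])) < tol:
            break
    return simplex[0]
```

For a smooth convex function such as $f(x, y) = (x-1)^2 + (y+2)^2$, this sketch converges to the minimizer from a small starting simplex, although (as noted below) the simplified transformation set makes stagnation more likely on harder problems.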
- Advantages:
- The method requires only one or two function evaluations per iteration.
- It gives significant improvements in the first few iterations, so it is quick to produce satisfactory results.
- Disadvantages:
- Stagnation can occur at non-optimal points, with large numbers of iterations yielding negligible improvement even when far from a minimum.
- If the numerical computation of the function's derivatives can be trusted, then gradient-based algorithms are more robust.
- Nocedal
Jorge Nocedal, Stephen J. Wright (2006), Numerical Optimization.
- Singer
Saša Singer, John Nelder (2009), Nelder-Mead algorithm. Scholarpedia, 4(7):2928.