#67 introduces a first iteration of the NLopt wrappers in pagmo. Random dump of possible improvements:

- in addition to being able to set up the algo's parameters after construction, we should probably provide a kwargs constructor in Python, so that the parameters can also be set upon construction (for consistency with the other algos)
- implement missing algorithm configuration options, such as `nlopt_set_xtol_abs()` (absolute tolerance on the parameters, specified as a vector of per-component tolerances rather than a single value valid for all components), `nlopt_set_initial_step()` (initial step size for derivative-free algorithms) and `nlopt_set_vector_storage()` (vector storage for limited-memory quasi-Newton algorithms); see the sketch right after this list
- add support for the global optimisation algorithms (this essentially just requires adding a handful of config options which are not yet exposed, because they are meaningful only for the global opt algos); a sketch of two such options follows the reference link below
- add support for Hessian preconditioning (still experimental in NLopt)
- implement a cache to avoid repeated calls to `problem::fitness()`. pagmo computes the objfun and the constraints in a single call to `problem::fitness()`, but NLopt (and, presumably, other local optimisation libraries) separates the computation of the objfun and of the constraints into different functions. This means that our local optimisation wrappers might end up calling `fitness()` repeatedly with the same decision vector. The idea is then to code a cache that remembers the results of the last N calls to `fitness()` (and maybe `gradient()` as well?), in order to avoid wasting CPU cycles; see the sketch at the end of this issue
- add support for `NLOPT_AUGLAG` and `NLOPT_AUGLAG_EQ` (Nlopt auglag + missing serialization #75)
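
A minimal sketch of the three calls in question, against NLopt's C API directly rather than pagmo's wrapper; the algorithms, dimension and numeric values here are arbitrary, chosen only for illustration:

```c++
#include <vector>

#include <nlopt.h>

int main()
{
    const unsigned dim = 2u; // hypothetical problem dimension

    // Derivative-free algorithm: per-component absolute tolerances on the
    // decision vector, plus an explicit initial step size.
    nlopt_opt cobyla = nlopt_create(NLOPT_LN_COBYLA, dim);
    const std::vector<double> xtol = {1e-8, 1e-6};
    nlopt_set_xtol_abs(cobyla, xtol.data());
    const std::vector<double> dx = {0.5, 0.1};
    nlopt_set_initial_step(cobyla, dx.data());
    nlopt_destroy(cobyla);

    // Limited-memory quasi-Newton algorithm: number of stored updates
    // used to build the Hessian approximation.
    nlopt_opt lbfgs = nlopt_create(NLOPT_LD_LBFGS, dim);
    nlopt_set_vector_storage(lbfgs, 20u);
    nlopt_destroy(lbfgs);
}
```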

See http://ab-initio.mit.edu/wiki/index.php/NLopt_Reference
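
And one possible shape for the fitness cache, as a self-contained C++ sketch; the class name and interface are made up for illustration (a real implementation would live inside pagmo's NLopt wrapper and would need some thought about thread safety and serialization):

```c++
#include <algorithm>
#include <cstddef>
#include <deque>
#include <utility>
#include <vector>

using vector_double = std::vector<double>; // same alias pagmo uses

// Remembers the fitness vectors of the last N decision vectors.
// Linear search is fine for the small N expected here.
class fitness_cache {
public:
    explicit fitness_cache(std::size_t max_size) : m_max_size(max_size) {}

    // Returns a pointer to the cached fitness for dv, or nullptr on a miss.
    const vector_double *lookup(const vector_double &dv) const
    {
        auto it = std::find_if(m_entries.begin(), m_entries.end(),
                               [&dv](const auto &e) { return e.first == dv; });
        return it == m_entries.end() ? nullptr : &it->second;
    }

    // Stores a (decision vector, fitness) pair, evicting the oldest entry
    // once the cache is full.
    void insert(vector_double dv, vector_double fit)
    {
        if (m_entries.size() == m_max_size) {
            m_entries.pop_front();
        }
        m_entries.emplace_back(std::move(dv), std::move(fit));
    }

private:
    std::size_t m_max_size;
    std::deque<std::pair<vector_double, vector_double>> m_entries;
};
```

Each NLopt objective/constraint callback would then call lookup() first and fall back to `problem::fitness()` only on a miss, with the same pattern replicated for `gradient()`.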