
Tutorial `~adaptive.SKOptLearner`
=================================

Note

Because this documentation consists of static html, the ``live_plot`` and ``live_info`` widgets are not live. Download the notebook in order to see the real behaviour.

The complete source code of this tutorial can be found in :jupyter-download:notebook:`tutorial.SKOptLearner`.

import adaptive

adaptive.notebook_extension(_inline_js=False)

import holoviews as hv
import numpy as np

We have wrapped the ``Optimizer`` class from scikit-optimize to show how existing libraries can be integrated with ``adaptive``.
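Roughly speaking, this integration is possible because scikit-optimize already exposes an ask/tell interface, which is close to what ``adaptive`` expects from a learner. The following is only an illustrative sketch of that underlying loop, using ``skopt.Optimizer`` directly with a toy objective (it is not how you would normally use ``adaptive``):

import numpy as np
from skopt import Optimizer

def f(x, noise_level=0.1):
    # Toy 1D objective: a smooth signal plus Gaussian noise.
    return (np.sin(5 * x) * (1 - np.tanh(x ** 2))
            + np.random.randn() * noise_level)

opt = Optimizer(dimensions=[(-2.0, 2.0)], base_estimator="GP")

for _ in range(10):
    x = opt.ask()         # the optimizer suggests the next point to evaluate
    opt.tell(x, f(x[0]))  # the observation is fed back into the surrogate model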

The SKOptLearner attempts to “optimize” the given function g (i.e. find the global minimum of g in the window of interest).

Here we use the same example as in the scikit-optimize tutorial. Although SKOptLearner can optimize functions of arbitrary dimensionality, we can only plot the learner if a 1D function is being learned.

def F(x, noise_level=0.1):
    return (np.sin(5 * x) * (1 - np.tanh(x ** 2))
            + np.random.randn() * noise_level)

learner = adaptive.SKOptLearner(
    F,
    dimensions=[(-2., 2.)],
    base_estimator="GP",
    acq_func="gp_hedge",
    acq_optimizer="lbfgs",
)

runner = adaptive.Runner(learner, ntasks=1, goal=lambda l: l.npoints > 40)

await runner.task # This is not needed in a notebook environment!
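In a plain Python script, where the notebook widgets are not available, the same run can be done with a blocking call instead. A minimal sketch, assuming ``adaptive.BlockingRunner`` and the same ``goal`` as above:

# Sketch for a standalone script: construct the learner as above, then block
# until the goal is reached (no event loop or widgets involved).
blocking_runner = adaptive.BlockingRunner(learner, goal=lambda l: l.npoints > 40)
print(learner.npoints, "points evaluated")

The rest of this tutorial keeps using the asynchronous ``Runner``, so that ``live_info`` and ``live_plot`` can update while it runs.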

runner.live_info()

%%opts Overlay [legend_position='top']
xs = np.linspace(*learner.space.bounds[0])
to_learn = hv.Curve((xs, [F(x, 0) for x in xs]), label='to learn')

runner.live_plot().relabel('prediction', depth=2) * to_learn
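Once the runner has finished, you can also ask the learner for its current best estimate. A hedged sketch, assuming ``SKOptLearner`` inherits scikit-optimize's ``Optimizer`` interface (in particular ``get_result``):

# Sketch: inspect the best point observed so far.
result = learner.get_result()  # assumed to be inherited from skopt.Optimizer
print("best x:", result.x)
print("best F(x):", result.fun)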