BayesianOptimization

Bayesian optimization for Julia.

Usage

using BayesianOptimization, GaussianProcesses, Distributions

f(x) = sum((x .- 1).^2) + randn()                # noisy function to minimize

# Choose an elastic GP with 2 input dimensions as the model.
# The GP is called elastic because data can be appended efficiently.
model = ElasticGPE(2,                            # 2 input dimensions
                   mean = MeanConst(0.),         
                   kernel = SEArd([0., 0.], 5.),
                   logNoise = 0.,
                   capacity = 3000)              # the initial capacity of the GP is 3000 samples.
set_priors!(model.mean, [Normal(1, 2)])

# Optimize the hyperparameters of the GP using maximum a posteriori (MAP) estimates every 50 steps
modeloptimizer = MAPGPOptimizer(every = 50, noisebounds = [-4, 3],       # bounds of the logNoise
                                kernbounds = [[-1, -1, 0], [4, 4, 10]],  # bounds of the 3 kernel parameters (see GaussianProcesses.get_param_names(model.kernel))
                                maxeval = 40)
opt = BOpt(f,
           model,
           UpperConfidenceBound(),                # type of acquisition
           modeloptimizer,                        
           [-5., -5.], [5., 5.],                  # lowerbounds, upperbounds         
           repetitions = 5,                       # evaluate the function for each input 5 times
           maxiterations = 100,                   # evaluate at 100 input positions
           sense = Min,                           # minimize the function
           verbosity = Progress)

result = boptimize!(opt)

To continue the optimization, one can call boptimize!(opt) multiple times.

result = boptimize!(opt) # first time (includes initialization)
result = boptimize!(opt) # restart
opt.maxiterations = 50   # set maxiterations for the next call
result = boptimize!(opt) # restart again
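The returned result can then be inspected. A minimal sketch, assuming the return value is a named tuple with the fields shown below (the field names are assumptions, not confirmed by this README; check ?boptimize! in the REPL):

```julia
result = boptimize!(opt)
# Field names below are assumptions; verify with ?boptimize!
result.observed_optimum    # best (noisy) function value observed so far
result.observed_optimizer  # input position of the best observed value
result.model_optimum       # optimum predicted by the fitted GP model
result.model_optimizer     # input position of the model optimum
```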

This package exports

  • BOpt, boptimize!
  • acquisition types: ExpectedImprovement, ProbabilityOfImprovement, UpperConfidenceBound, ThompsonSamplingSimple, MutualInformation
  • scaling of standard deviation in UpperConfidenceBound: BrochuBetaScaling, NoBetaScaling
  • GP hyperparameter optimizer: MAPGPOptimizer, NoModelOptimizer
  • Initializer: ScaledSobolIterator, ScaledLHSIterator
  • optimization sense: Min, Max
  • verbosity levels: Silent, Timings, Progress
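The exported initializers can be used to choose the first evaluation points, e.g. from a scaled Sobol sequence. A minimal sketch, assuming BOpt accepts `initializer` and `initializer_iterations` keyword arguments (these keyword names are assumptions; check ?BOpt and ?ScaledSobolIterator):

```julia
# Keyword names here are assumptions; see ?BOpt and ?ScaledSobolIterator.
lowerbounds, upperbounds = [-5., -5.], [5., 5.]
init = ScaledSobolIterator(lowerbounds, upperbounds, 10)  # 10 Sobol points scaled to the box
opt = BOpt(f, model,
           UpperConfidenceBound(),
           modeloptimizer,
           lowerbounds, upperbounds,
           initializer = init,            # where the first samples come from
           initializer_iterations = 10,   # how many initialization samples
           sense = Min)
```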

Use the REPL help, e.g. ?BOpt, to get more information.

Review papers on Bayesian optimization

Similar Projects

BayesOpt is a Julia wrapper of the established BayesOpt toolbox written in C++.

Dragonfly is a feature-rich package for scalable Bayesian optimization written in Python. Use it in Julia with PyCall.
