mlrMBO

Model-based optimization with mlr.

Installation

We recommend installing the official release version:

install.packages("mlrMBO")

For experimental use, you can install the latest development version from GitHub:

devtools::install_github("mlr-org/mlrMBO")

Introduction

[Animation: MBO demo]

mlrMBO is a highly configurable R toolbox for model-based / Bayesian optimization of black-box functions; a minimal usage sketch follows the feature list below.

Features:

  • EGO-type algorithms (Kriging with expected improvement) on purely numerical search spaces, see Jones et al. (1998)
  • Mixed search spaces with numerical, integer, categorical, and subordinate (hierarchical) parameters
  • Arbitrary parameter transformations, e.g., to optimize on a log scale
  • Optimization of noisy objective functions
  • Multi-criteria optimization with approximated Pareto fronts
  • Parallelization through multi-point batch proposals
  • Parallelization on many parallel back-ends and clusters through batchtools and parallelMap
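
A minimal single-objective run might look as follows. This is a sketch, assuming current CRAN versions; attaching mlrMBO also attaches its dependencies smoof and ParamHelpers, which provide makeSingleObjectiveFunction and makeNumericParamSet:

library(mlrMBO)

# A cheap 2-d sphere function as a stand-in for an expensive black box.
obj.fun = makeSingleObjectiveFunction(
  name = "Sphere",
  fn = function(x) sum(x^2),
  par.set = makeNumericParamSet("x", len = 2, lower = -5, upper = 5)
)

# Default control object: on a purely numeric space this yields Kriging
# with expected improvement. Stop after 10 sequential iterations.
ctrl = makeMBOControl()
ctrl = setMBOControlTermination(ctrl, iters = 10)

res = mbo(obj.fun, control = ctrl)
res$x  # best point found
res$y  # corresponding objective value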

For the surrogate, mlrMBO supports any regression learner from mlr (see the sketch after the list), including:

  • Kriging, a.k.a. Gaussian processes (e.g., via DiceKriging)
  • Random forests (e.g., via randomForest)
  • and many more...
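
For example, a random forest surrogate (a common choice for mixed search spaces, where standard Kriging does not apply) can be passed as an mlr learner. A sketch, reusing obj.fun and ctrl from the example above:

library(mlr)

# predict.type = "se" is required so that EI-type infill criteria
# receive uncertainty estimates from the surrogate.
surrogate = makeLearner("regr.randomForest", predict.type = "se")
res = mbo(obj.fun, learner = surrogate, control = ctrl)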

Various infill criteria (a.k.a. acquisition functions) are available (see the sketch after the list):

  • Expected improvement (EI)
  • Upper/lower confidence bound (LCB, i.e., a statistical lower or upper bound on the prediction)
  • Augmented expected improvement (AEI)
  • Expected quantile improvement (EQI)
  • API for custom infill criteria
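
The criterion is selected on the control object; each criterion has a makeMBOInfillCrit* constructor. A sketch that switches from the default to the confidence bound:

# Use the confidence bound mean(x) - lambda * se(x) instead of the default.
ctrl = setMBOControlInfill(ctrl, crit = makeMBOInfillCritCB(cb.lambda = 2))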

Objective functions are created with package smoof, which also offers many test functions for example runs or benchmarks.
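
For instance, instead of writing your own objective as above, you can pick one of the test functions shipped with smoof, such as Branin:

library(smoof)
obj.fun = makeBraninFunction()  # classic 2-d test function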

Parameter spaces and initial designs are created with package ParamHelpers.
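
A sketch of a mixed, hierarchical search space with a space-filling initial design (the parameter names here are purely illustrative):

library(ParamHelpers)

par.set = makeParamSet(
  # numeric parameter optimized on a log scale via a transformation
  makeNumericParam("C", lower = -5, upper = 5, trafo = function(x) 2^x),
  makeDiscreteParam("kernel", values = c("rbf", "linear")),
  # subordinate parameter: only active when kernel == "rbf"
  makeNumericParam("sigma", lower = -5, upper = 5,
    requires = quote(kernel == "rbf"))
)

# Latin hypercube initial design with 16 points; it can be passed to
# mbo() via its design argument.
design = generateDesign(n = 16, par.set = par.set)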

mlrMBO - How to Cite and Citing Publications

Please cite our arXiv preprint. You can get citation info via citation("mlrMBO") or copy the following BibTeX entry:

@article{mlrMBO,
  title = {{{mlrMBO}}: {{A Modular Framework}} for {{Model}}-{{Based Optimization}} of {{Expensive Black}}-{{Box Functions}}},
  url = {http://arxiv.org/abs/1703.03373},
  shorttitle = {{{mlrMBO}}},
  archivePrefix = {arXiv},
  eprinttype = {arxiv},
  eprint = {1703.03373},
  primaryClass = {stat},
  author = {Bischl, Bernd and Richter, Jakob and Bossek, Jakob and Horn, Daniel and Thomas, Janek and Lang, Michel},
  date = {2017-03-09},
}

Some parts of the package were created as part of other publications. If you use these parts, please cite the relevant work appropriately.
