mlrHyperopt

Easy Hyper Parameter Optimization with mlr and mlrMBO.

Installation

devtools::install_github("berndbischl/ParamHelpers") # version >= 1.11 needed.
devtools::install_github("jakob-r/mlrHyperopt", dependencies = TRUE)

Purpose

mlrHyperopt aims to make hyperparameter optimization of machine learning methods as simple as possible. It offers tuning in one line:

library(mlrHyperopt)
res = hyperopt(iris.task, learner = "classif.svm")
res
## Tune result:
## Op. pars: cost=12.6; gamma=0.0159
## mmce.test.mean=0.02

It mainly relies on the learners implemented in mlr and uses the tuning methods also available in mlr. Unfortunately, mlr lacks well-defined search spaces for its learners, which makes hyperparameter tuning cumbersome.

mlrHyperopt includes default search spaces for the most common machine learning methods, such as random forests, SVMs, and boosting.
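For illustration, a bundled default search space can be inspected and passed on explicitly instead of relying on the one-line defaults. The function `getDefaultParConfig()` and the `par.config` argument of `hyperopt()` are assumptions based on the package documentation; check `?hyperopt` for the exact interface:

```r
library(mlrHyperopt)

# Look up the bundled default search space for an mlr learner.
# getDefaultParConfig() and the par.config argument are assumed here;
# consult the package docs for the exact interface.
par.config = getDefaultParConfig(learner = "classif.svm")
print(par.config)

# Pass the (possibly modified) search space explicitly.
res = hyperopt(iris.task, learner = "classif.svm", par.config = par.config)
res
```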

As the developer cannot be an expert on all machine learning methods available for R and mlr, mlrHyperopt also offers a web service to share, upload, and download improved search spaces.

Development Status

Web Server

ParConfigs are up- and downloaded via JSON and stored in a database on the server. The server is a very basic Ruby on Rails CRUD app generated via scaffolding, with minor modifications: https://github.com/jakob-r/mlrHyperoptServer.

ToDo:

* Voting system
* Upload/download counts
* Improve the API
* Return the existing ID when a duplicate is uploaded (instead of an error)
* Allow a combined search (instead of a single key-value pair)

R package

The basic functionality works reliably. I may improve the optimization heuristics in the future. The package still needs more default search spaces for popular learners!

Reproducibility

This package is still under construction and its inner workings might change without a version-number update. I therefore do not recommend using it for reproducible research until it is on CRAN. For reproducible research you may want to stick to the lengthier but more precise mlr tuning workflow. You can still use the parameter sets recommended by mlrHyperopt; just make sure to write them down explicitly in your source code.
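For reference, the more verbose but fully explicit mlr workflow looks roughly like this. The search space is written out by hand so it stays fixed in your source code; the parameter bounds below are illustrative placeholders, not the mlrHyperopt defaults:

```r
library(mlr)
library(ParamHelpers)

# Write the search space down explicitly so the analysis stays reproducible.
# These bounds are illustrative; substitute the ParConfig values you used.
ps = makeParamSet(
  makeNumericParam("cost", lower = -15, upper = 15, trafo = function(x) 2^x),
  makeNumericParam("gamma", lower = -15, upper = 15, trafo = function(x) 2^x)
)

# Random search with 3-fold cross-validation.
ctrl = makeTuneControlRandom(maxit = 50L)
rdesc = makeResampleDesc("CV", iters = 3L)

res = tuneParams("classif.svm", task = iris.task, resampling = rdesc,
                 par.set = ps, control = ctrl)
res
```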

Collaboration

Is encouraged! 👍