# mlr3tuning

Package website: release | dev


This package provides hyperparameter tuning for mlr3. It offers various tuning methods, e.g. grid search, random search, and generalized simulated annealing, and allows different termination criteria to be set and combined. `AutoTuner` provides a convenient way to perform nested resampling in combination with mlr3. The package is built on bbotk, which provides a common framework for optimization.

## Installation

### CRAN version

```r
install.packages("mlr3tuning")
```

### Development version

```r
remotes::install_github("mlr-org/mlr3tuning")
```

## Example

library("mlr3")
library("mlr3tuning")
library("paradox")

task = tsk("pima")
learner = lrn("classif.rpart")
resampling = rsmp("holdout")
measure = msr("classif.ce")

# Create the search space with lower and upper bounds
search_space = ParamSet$new(list(
  ParamDbl$new("cp", lower = 0.001, upper = 0.1),
  ParamInt$new("minsplit", lower = 1, upper = 10)
))

# Define termination criterion
terminator = trm("evals", n_evals = 20)

# Create tuning instance
instance = TuningInstanceSingleCrit$new(
  task = task,
  learner = learner,
  resampling = resampling,
  measure = measure,
  search_space = search_space,
  terminator = terminator
)

# Load tuner
tuner = tnr("grid_search", resolution = 5)

# Trigger optimization
tuner$optimize(instance)

# View results
instance$result
```
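
After tuning, the selected hyperparameters can be applied to the learner and a final model can be fitted on the full task. The following is a minimal sketch that reuses the objects from the example above and assumes the instance exposes the best configuration via `result_learner_param_vals`:

```r
# Take the hyperparameters of the best configuration found during tuning
# (result_learner_param_vals is assumed to hold these values)
learner$param_set$values = instance$result_learner_param_vals

# Fit the final model on the complete task
learner$train(task)
```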
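
The `AutoTuner` mentioned above wraps a learner together with a tuning setup, so tuning is triggered automatically whenever the wrapped learner is trained; combining it with an outer resampling yields nested resampling. The sketch below reuses the objects from the example and passes the constructor arguments by name; it is an illustration only, and the exact constructor interface may differ between package versions:

```r
# Wrap the learner and the tuning setup in an AutoTuner
at = AutoTuner$new(
  learner = learner,
  resampling = resampling,
  measure = measure,
  search_space = search_space,
  terminator = terminator,
  tuner = tuner
)

# Nested resampling: the outer 3-fold CV evaluates the tuned learner,
# while tuning itself uses the inner holdout resampling defined above
rr = resample(task, at, rsmp("cv", folds = 3), store_models = TRUE)
rr$aggregate(measure)
```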

## Resources

Further documentation can be found in the mlr3book and the mlr3tuning cheatsheet. Tutorials are available in the mlr3gallery.
