Note: this repository was archived by the owner on Jan 20, 2022 and is now read-only.


mlr3learners.lgbpy (!!!under development!!!)


mlr3learners.lgbpy brings the LightGBM gradient booster to the mlr3 framework by using the lightgbm.py R implementation.

Features

  • integrated native cross-validation (CV) step before the actual model training to find the optimal num_boost_round for the given training data and parameter set
  • GPU support

Installation

Install the mlr3learners.lgbpy R package:

install.packages("devtools")
devtools::install_github("kapsner/mlr3learners.lgbpy")

To use the mlr3learners.lgbpy R package, please make sure that the reticulate R package (version >= 1.14) is configured properly on your system and points to a Python environment. If it is not, you can, for example, install Miniconda:

reticulate::install_miniconda(
  path = reticulate::miniconda_path(),
  update = TRUE,
  force = FALSE
)
reticulate::py_config()
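
To confirm that the setup meets the requirements above, you can run two quick checks (a minimal sketch using standard base R and reticulate functions):

# reticulate version requirement from above
stopifnot(utils::packageVersion("reticulate") >= "1.14")
# TRUE once reticulate is bound to a Python environment
reticulate::py_available(initialize = TRUE)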

Use the function lightgbm.py::install_py_lightgbm to install the lightgbm Python module. This function first checks whether the reticulate package is configured properly and whether the Python module lightgbm is already present. If it is not, it is installed automatically.

lightgbm.py::install_py_lightgbm()
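
Afterwards you can verify that the module is visible to reticulate (a minimal sketch; py_module_available is a standard reticulate function):

# should return TRUE once the lightgbm Python module is installed
reticulate::py_module_available("lightgbm")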

Example

library(mlr3)
library(mlr3learners.lgbpy)

task = mlr3::tsk("iris")
learner = mlr3::lrn("classif.lightgbm")

# early stopping and the maximum number of boosting rounds
# (used by the integrated CV step to find the optimal num_boost_round)
learner$early_stopping_rounds <- 1000
learner$num_boost_round <- 5000

learner$param_set$values <- list(
  "objective" = "multiclass",
  "learning_rate" = 0.01,
  "seed" = 17L
)

# train on the first 120 rows and predict on the remaining 30
learner$train(task, row_ids = 1:120)
predictions <- learner$predict(task, row_ids = 121:150)
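
The resulting prediction object can be evaluated with the usual mlr3 tools; a minimal sketch using the built-in accuracy measure:

# confusion matrix and accuracy on the held-out rows
predictions$confusion
predictions$score(mlr3::msr("classif.acc"))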

For further information and examples, please see the mlr3learners.lgbpy package vignettes, the mlr3book, and the vignettes of the lightgbm.py R package.

GPU acceleration

The mlr3learners.lgbpy package can also be used with LightGBM's GPU-compiled version.

To install the lightgbm Python package with GPU support, execute the following command (see the LightGBM manual):

pip install lightgbm --install-option=--gpu

To use GPU acceleration, the parameter device_type = "gpu" (default: "cpu") needs to be set. According to the LightGBM parameter manual, 'it is recommended to use the smaller max_bin (e.g. 63) to get the better speed up'.

learner$param_set$values <- list(
  "objective" = "multiclass",
  "learning_rate" = 0.01,
  "seed" = 17L,
  "device_type" = "gpu",
  "max_bin" = 63L
)

All other steps are similar to the workflow without GPU support.
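
For illustration, a minimal sketch of the remaining steps, assuming the GPU parameter set above has been assigned to the learner and a GPU-enabled lightgbm build is installed:

# identical to the CPU example above
learner$train(task, row_ids = 1:120)
predictions <- learner$predict(task, row_ids = 121:150)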

The GPU support has been tested in a Docker container running on a Linux 19.10 host with an Intel i7 CPU, 16 GB of RAM, an NVIDIA(R) RTX 2060, CUDA(R) 10.2, and nvidia-docker.

