LightGBM

LightGBM - the high-performance machine learning library - for Ruby

🔥 Uses the C API for blazing performance


Installation

Add this line to your application’s Gemfile:

gem 'lightgbm'

On Mac, also install OpenMP:

brew install libomp

Getting Started

This library follows the Python API. A few differences are:

  • The get_ and set_ prefixes are removed from methods
  • The default verbosity is -1
  • With the cv method, stratified is set to false

Some methods and options are also missing at the moment. PRs welcome!
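For example, where the Python API uses get_ and set_ accessors, the Ruby methods drop the prefixes. A minimal sketch (feature_importance appears below; the Dataset label accessors are assumed to mirror Python's get_label and set_label):

booster.feature_importance   # Python: booster.feature_importance()
train_set.label              # Python: train_set.get_label() (assumed accessor)
train_set.label = y          # Python: train_set.set_label(y) (assumed accessor)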

Training API

Prep your data

x = [[1, 2], [3, 4], [5, 6], [7, 8]]
y = [1, 2, 3, 4]

Train a model

params = {objective: "regression"}
train_set = LightGBM::Dataset.new(x, label: y)
booster = LightGBM.train(params, train_set)

Predict

booster.predict(x)

Save the model to a file

booster.save_model("model.txt")

Load the model from a file

booster = LightGBM::Booster.new(model_file: "model.txt")

Get the importance of features

booster.feature_importance

Early stopping

test_set = LightGBM::Dataset.new(x_test, label: y_test)
LightGBM.train(params, train_set, valid_sets: [train_set, test_set], early_stopping_rounds: 5)

CV

LightGBM.cv(params, train_set, nfold: 5, verbose_eval: true)

Scikit-Learn API

Prep your data

x = [[1, 2], [3, 4], [5, 6], [7, 8]]
y = [1, 2, 3, 4]

Train a model

model = LightGBM::Regressor.new
model.fit(x, y)

For classification, use LightGBM::Classifier
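A minimal classification sketch, assuming integer class labels and the same fit interface shown above:

x = [[1, 2], [3, 4], [5, 6], [7, 8]]
y = [0, 1, 0, 1] # class labels instead of continuous targets

model = LightGBM::Classifier.new
model.fit(x, y)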

Predict

model.predict(x)

For classification, use predict_proba for probabilities
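For instance, with the classifier sketched above:

model.predict_proba(x) # per-row probabilities, one value per class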

Save the model to a file

model.save_model("model.txt")

Load the model from a file

model.load_model("model.txt")

Get the importance of features

model.feature_importances

Early stopping

model.fit(x, y, eval_set: [[x_test, y_test]], early_stopping_rounds: 5)

Data

Data can be an array of arrays

[[1, 2, 3], [4, 5, 6]]

Or a Daru data frame

Daru::DataFrame.from_csv("houses.csv")

Or a Numo NArray

Numo::DFloat.new(3, 2).seq
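Any of these formats can be passed wherever x appears above. A sketch, assuming labels can likewise be a plain Ruby array or a Numo array:

x = Numo::DFloat.new(4, 2).seq
y = [1, 2, 3, 4]

train_set = LightGBM::Dataset.new(x, label: y)
LightGBM.train({objective: "regression"}, train_set)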

Helpful Resources

Related Projects

  • Xgb - XGBoost for Ruby
  • Eps - Machine Learning for Ruby

Credits

Thanks to the xgboost gem for serving as an initial reference, and Selva Prabhakaran for the test datasets.

History

View the changelog

Contributing

Everyone is encouraged to help improve this project. Here are a few ways you can help:
