
leaves


Introduction

leaves is a library implementing prediction code for GBRT (Gradient Boosting Regression Trees) models in pure Go. The goal of the project is to make it possible to use models from popular GBRT frameworks in Go programs without C API bindings.

Features

  • General Features:
    • support parallel predictions for batches
  • Support LightGBM (repo) models:
    • read models from text format and from JSON format
    • support gbdt, rf (random forest) and dart models
    • support multiclass predictions
    • additional optimizations for categorical features (for example, one hot decision rule)
    • additional optimizations exploiting prediction-only usage
  • Support XGBoost (repo) models:
    • read models from binary format
    • support gbtree, gblinear, dart models
    • support multiclass predictions
    • support missing values (nan)
  • Support scikit-learn (repo) tree models (experimental):
    • read models from pickle format (protocol 0)
    • support sklearn.ensemble.GradientBoostingClassifier
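
Batch predictions can be parallelized across goroutines. The following sketch illustrates the general fan-out idea with a placeholder `predictSingle` function standing in for a loaded model's prediction method (the placeholder is an assumption for illustration, not part of the leaves API; see the godoc for the actual batch-prediction methods):

```go
package main

import (
	"fmt"
	"sync"
)

// predictSingle is a placeholder standing in for a loaded model's
// per-object prediction (NOT part of leaves itself).
func predictSingle(fvals []float64) float64 {
	s := 0.0
	for _, v := range fvals {
		s += v
	}
	return s
}

// predictBatch fans a batch of feature vectors out over nWorkers
// goroutines and collects predictions by index.
func predictBatch(batch [][]float64, nWorkers int) []float64 {
	preds := make([]float64, len(batch))
	jobs := make(chan int)
	var wg sync.WaitGroup
	for w := 0; w < nWorkers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for i := range jobs {
				preds[i] = predictSingle(batch[i])
			}
		}()
	}
	for i := range batch {
		jobs <- i
	}
	close(jobs)
	wg.Wait()
	return preds
}

func main() {
	batch := [][]float64{{1, 2, 3}, {4, 5, 6}, {7, 8, 9}}
	fmt.Println(predictBatch(batch, 2))
}
```

Because each prediction writes to its own slice index, no extra synchronization beyond the WaitGroup is needed.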

Usage examples

In order to start, go get this repository:

go get github.com/dmitryikh/leaves

Minimal example:

package main

import (
	"fmt"

	"github.com/dmitryikh/leaves"
)

func main() {
	// 1. Read model
	model, err := leaves.LGEnsembleFromFile("lightgbm_model.txt")
	if err != nil {
		panic(err)
	}

	// 2. Do predictions!
	fvals := []float64{1.0, 2.0, 3.0}
	p := model.PredictSingle(fvals, 0)
	fmt.Printf("Prediction for %v: %f\n", fvals, p)
}

In order to use an XGBoost model, just change leaves.LGEnsembleFromFile to leaves.XGEnsembleFromFile.

Documentation

Documentation is hosted on godoc (link). It contains extended usage examples and a full API reference. Additional usage examples can be found in leaves_test.go.

Benchmark

Below are comparisons of prediction speed on batches (~1000 objects per API call). Hardware: MacBook Pro (15-inch, 2017), 2.9 GHz Intel Core i7, 16 GB 2133 MHz LPDDR3. The C API implementations were called from Python bindings, but with a large batch size the overhead of the Python bindings should be negligible. The leaves benchmarks were run with the standard Go test framework: go test -bench. See benchmark for more details on the measurements, and testdata/README.md for the data preparation pipelines.

Single thread:

| Test Case | Features | Trees | Batch size | C API | leaves |
|---|---|---|---|---|---|
| LightGBM MS LTR | 137 | 500 | 1000 | 49ms | 51ms |
| LightGBM Higgs | 28 | 500 | 1000 | 50ms | 50ms |
| LightGBM KDD Cup 99* | 41 | 1200 | 1000 | 70ms | 85ms |
| XGBoost Higgs | 28 | 500 | 1000 | 44ms | 50ms |

4 threads:

| Test Case | Features | Trees | Batch size | C API | leaves |
|---|---|---|---|---|---|
| LightGBM MS LTR | 137 | 500 | 1000 | 14ms | 14ms |
| LightGBM Higgs | 28 | 500 | 1000 | 14ms | 14ms |
| LightGBM KDD Cup 99* | 41 | 1200 | 1000 | 19ms | 24ms |
| XGBoost Higgs | 28 | 500 | 1000 | ? | 14ms |

(?) - currently I'm unable to utilize multithreading for XGBoost predictions by means of the Python bindings

(*) - KDD Cup 99 problem involves continuous and categorical features simultaneously

Limitations

  • LightGBM models:
    • transformation functions (sigmoid, lambdarank, etc.) are not supported; output scores are raw scores
  • XGBoost models:
    • transformation functions are not supported; output scores are raw scores
    • predictions may diverge slightly from the C API because of floating point conversions and comparison tolerances
  • scikit-learn tree models:
    • transformation functions are not supported; output scores are raw scores (as from GradientBoostingClassifier.decision_function)
    • only pickle protocol 0 is supported
    • predictions may diverge slightly from sklearn predictions because of floating point conversions and comparison tolerances
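
Since transformation functions are not applied, converting a raw score into a probability is up to the caller. For a binary classifier trained with logistic loss, the usual conversion is the sigmoid; a minimal sketch:

```go
package main

import (
	"fmt"
	"math"
)

// sigmoid converts a raw margin score into a probability. leaves
// returns raw scores, so applying this (when the model was trained
// with a logistic objective) is the caller's responsibility.
func sigmoid(rawScore float64) float64 {
	return 1.0 / (1.0 + math.Exp(-rawScore))
}

func main() {
	fmt.Println(sigmoid(0.0)) // a raw score of 0 maps to probability 0.5
}
```

For other objectives (lambdarank, multiclass softmax, etc.) the appropriate transformation differs; consult the training framework's documentation for the objective you used.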

Contacts

If you are interested in the project or have questions, please contact me by email: khdmitryi at gmail.com