Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.
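Both libraries above implement gradient boosted decision trees at scale. As a rough illustration of the underlying idea (not either library's actual API), here is a minimal pure-Python sketch, assuming squared-error loss and single-feature decision stumps; real implementations use full trees, regularization, and histogram-based split finding:

```python
# Toy gradient boosting machine (GBM): repeatedly fit a weak learner
# (here, a one-feature decision stump) to the residuals of the current
# ensemble, then add it with a learning-rate shrinkage factor.
# This is an illustrative sketch, not XGBoost's or LightGBM's algorithm.

def fit_stump(x, residuals):
    """Find the threshold on x whose two leaf means best fit the residuals."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue  # skip splits that leave one side empty
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda xi: lmean if xi <= t else rmean

def gbm_fit(x, y, n_rounds=20, learning_rate=0.3):
    """Boost stumps on squared-error residuals (the negative gradient)."""
    base = sum(y) / len(y)  # initial prediction: the target mean
    stumps = []
    preds = [base] * len(y)
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, preds)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        preds = [pi + learning_rate * stump(xi) for xi, pi in zip(x, preds)]
    return lambda xi: base + learning_rate * sum(s(xi) for s in stumps)

# Toy regression: y is a step function of x, which boosted stumps recover.
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [1.0, 1.0, 1.0, 1.0, 5.0, 5.0, 5.0, 5.0]
model = gbm_fit(x, y)
```

After 20 rounds the ensemble's predictions converge close to the step function (the per-round residual shrinks by the factor `1 - learning_rate`).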
[ICML 2019, 20 min long talk] Robust Decision Trees Against Adversarial Examples
[NeurIPS 2019] H. Chen*, H. Zhang*, S. Si, Y. Li, D. Boning and C.-J. Hsieh, Robustness Verification of Tree-based Models (*equal contribution)
Train Gradient Boosting models that are both high-performance *and* Fair!
A powerful tree-based uplift modeling system.
(i) Identify and extract mean-reversion (swing-point) data points from non-stationary data, (ii) generate interpretable rules to predict such points, and (iii) apply supervised machine-learning classification models in R, such as GBM and RF.
Raspberry Pi as a NewTek NDI monitor