Features of Major Gradient Boosting Implementations

A selection of features of the major gradient boosting implementations XGBoost, LightGBM, and CatBoost, for Python and R.

Legend: ✔️ = available; ❤️ = very good; 💕 = outstanding

| Aspect | XGBoost | LightGBM | CatBoost |
|---|---|---|---|
| Speed: CPU |  | 💕 |  |
| Speed: GPU | ❤️ |  | 💕 |
| Standard losses | ✔️ | ✔️ | ✔️ |
| Special loss: Poisson/Gamma/Tweedie | ✔️ | ✔️ | ✔️ |
| Special loss: Survival | ✔️ |  |  |
| Special loss: Robust |  | ✔️ | ✔️ |
| Special loss: Quantile |  | ✔️ | ✔️ |
| Tree size regularization | ✔️ | ✔️ | ✔️ |
| Categorical input handling |  | ❤️ | 💕 |
| Constraints: monotonic | ✔️ | ✔️ | ✔️ |
| Constraints: interaction | ✔️ | ✔️ |  |
| Case weights | ✔️ | ✔️ | ✔️ |
| Missing values | ✔️ | ✔️ | ✔️ |
| Interpretation: Importance | ✔️ | ✔️ | ✔️ |
| Interpretation: SHAP | ✔️ | ✔️ | ✔️ |
| Cross-validation | ✔️ | ✔️ | Python only |
| Special mode: Random Forest | ✔️ | ✔️ |  |
| Special mode: Linear booster | ✔️ | ✔️ |  |
| Easy installation | ✔️ | ✔️ | ✔️ |
| Initial public release | 2014 | 2016 | 2017 |

This compilation, as of Oct 10, 2022, is neither complete nor guaranteed to be correct.

  • Update as of July 5, 2020: LightGBM has implemented interaction constraints, hurray :-).
  • Update as of Sept 25, 2020: LightGBM is on CRAN, hurray again!
  • Fixed the initial release year of LightGBM.
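
To make the table more concrete, here is a minimal sketch of one feature that all three libraries share: monotonic constraints. It assumes recent versions of the scikit-learn-style Python APIs and uses a toy dataset with two numeric features; the constraint vectors request an increasing effect for the first feature and a decreasing one for the second.

```python
# Minimal sketch: monotonic constraints in all three libraries
# (scikit-learn-style APIs; the toy data is illustrative only).
import numpy as np
import xgboost as xgb
import lightgbm as lgb
import catboost as cb

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
y = 2 * X[:, 0] - X[:, 1] + rng.normal(size=1000)

# +1 = monotonically increasing, -1 = monotonically decreasing
models = [
    xgb.XGBRegressor(monotone_constraints=(1, -1)),
    lgb.LGBMRegressor(monotone_constraints=[1, -1]),
    cb.CatBoostRegressor(monotone_constraints=[1, -1], verbose=0),
]
for model in models:
    model.fit(X, y)
```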
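
Categorical input handling is where the libraries differ most (see the ❤️/💕 row above). A sketch, assuming a pandas DataFrame with one categorical column: LightGBM picks up the pandas categorical dtype automatically, while CatBoost takes an explicit cat_features argument. As of the compilation date, XGBoost's native categorical support was still experimental, so categoricals were typically encoded manually there.

```python
# Minimal sketch: feeding raw categorical data to LightGBM and CatBoost.
import numpy as np
import pandas as pd
import lightgbm as lgb
import catboost as cb

rng = np.random.default_rng(0)
X = pd.DataFrame(
    {
        "color": pd.Categorical(rng.choice(["red", "green", "blue"], size=500)),
        "size": rng.normal(size=500),
    }
)
y = (X["color"] == "red").astype(float) + X["size"] + rng.normal(size=500)

# LightGBM detects pandas categorical columns by default
lgb.LGBMRegressor().fit(X, y)

# CatBoost wants categorical columns named (or indexed) explicitly
cb.CatBoostRegressor(verbose=0).fit(X, y, cat_features=["color"])
```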
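
All three libraries also ship built-in SHAP values for interpretation. A sketch of the respective calls on toy data; each returns an array with one column per feature plus a final column holding the expected value.

```python
# Minimal sketch: built-in SHAP values in all three libraries.
import numpy as np
import xgboost as xgb
import lightgbm as lgb
import catboost as cb

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -1.0, 0.5]) + rng.normal(size=200)

xgb_model = xgb.XGBRegressor(n_estimators=50).fit(X, y)
shap_xgb = xgb_model.get_booster().predict(xgb.DMatrix(X), pred_contribs=True)

lgb_model = lgb.LGBMRegressor(n_estimators=50).fit(X, y)
shap_lgb = lgb_model.predict(X, pred_contrib=True)

cb_model = cb.CatBoostRegressor(iterations=50, verbose=0).fit(X, y)
shap_cb = cb_model.get_feature_importance(cb.Pool(X, y), type="ShapValues")
```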
