A selection of features of the major gradient boosting implementations XGBoost, LightGBM, and CatBoost, for Python and R.
Aspect | XGBoost | LightGBM | CatBoost
---|---|---|---
Speed: CPU | 💕 | | |
Speed: GPU | ❤️ | 💕 | |
Standard losses | ✔️ | ✔️ | ✔️ |
Special loss: Poisson/Gamma/Tweedie | ✔️ | ✔️ | ✔️ |
Special loss: Survival | ✔️ | | |
Special loss: Robust | ✔️ | ✔️ | |
Special loss: Quantile | ✔️ | ✔️ | |
Tree size regularization | ✔️ | ✔️ | ✔️ |
Categorical input handling | ❤️ | 💕 | |
Constraints: monotonic | ✔️ | ✔️ | ✔️ |
Constraints: interaction | ✔️ | ✔️ | |
Case weights | ✔️ | ✔️ | ✔️ |
Missing values | ✔️ | ✔️ | ✔️ |
Interpretation: Importance | ✔️ | ✔️ | ✔️ |
Interpretation: SHAP | ✔️ | ✔️ | ✔️ |
Cross-validation | ✔️ | ✔️ | Python only |
Special mode: Random Forest | ✔️ | ✔️ | |
Special mode: Linear booster | ✔️ | ✔️ | |
Installation easy | ✔️ | ✔️ | ✔️ |
Initial public release | 2014 | 2016 | 2017 |
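As a small illustration of the "Special loss: Quantile" row, the loss in question is the pinball (quantile) loss: minimizing it makes the model predict a chosen quantile instead of the mean. A minimal sketch in plain Python (the function name is illustrative, not the API of any of the three libraries):

```python
def pinball_loss(y_true, y_pred, alpha=0.5):
    """Average pinball loss at quantile level alpha (0 < alpha < 1)."""
    total = 0.0
    for y, f in zip(y_true, y_pred):
        diff = y - f
        # Under-predictions are weighted by alpha, over-predictions by 1 - alpha,
        # so the optimal constant prediction is the alpha-quantile of y.
        total += alpha * diff if diff >= 0 else (alpha - 1.0) * diff
    return total / len(y_true)

pinball_loss([1, 2, 3], [2, 2, 2], alpha=0.5)  # ≈ 0.3333, half the MAE
```

At `alpha = 0.5` the pinball loss is half the mean absolute error, so the fitted quantity is the median.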
This compilation (as of Oct 10, 2022) is neither complete nor claims to be correct.
- Update as of July 5, 2020: LightGBM has implemented interaction constraints, hurray :-).
- Update as of Sept. 25, 2020: LightGBM is on CRAN, hurray again!
- Fixed the initial release year of LightGBM.
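The interaction constraints mentioned in the first update restrict which features may be used together along a single root-to-leaf split path of a tree. A toy sketch of that rule in plain Python (the helper name and the list-of-groups format are illustrative, not the LightGBM API):

```python
def path_respects_constraints(split_features, constraint_groups):
    """True if every feature on one split path lies in a single allowed group."""
    used = set(split_features)
    # The path is admissible if at least one interaction group covers
    # all features that split along it.
    return any(used <= set(group) for group in constraint_groups)

# With groups [[0, 1], [2, 3]], features 0 and 1 may interact on a path,
# while features 0 and 2 may not:
path_respects_constraints([0, 1, 1], [[0, 1], [2, 3]])  # True
path_respects_constraints([0, 2], [[0, 1], [2, 3]])     # False
```

During training, the booster simply refuses any split that would make the current path violate this check, which caps the interaction order of the fitted model.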