[Feature request] Early stopping doesn't work with XGBoost, LightGBM or CatBoost #58
Comments
Yeah, that's a good point. I think we'll probably want to special-case a training call for xgboost and lightgbm, as they have a different way of doing early stopping.
@richardliaw I've updated the issue with more details. Btw how does …
Yeah... I guess that's the penalty we have to pay for adhering to the sklearn API. `max_iters` = number of "epochs". Does that make sense?
In #63, we're going to enable early stopping for XGBoost via incremental learning. We decided not to implement it for LightGBM because it is not yet at a stable version.
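For context on what "early stopping via incremental learning" means here: train a few more boosting rounds at a time, score on a validation set, and stop once the score stops improving for some number of rounds. A minimal, library-free sketch of that control flow (the `train_one_round` callback and toy scores are stand-ins; a real implementation would continue an XGBoost booster between calls rather than return canned scores):

```python
def early_stopping_loop(train_one_round, max_iters, patience):
    """Generic incremental-training loop with early stopping.

    train_one_round(i) -> validation score after round i (higher is better).
    Stops when the score has not improved for `patience` consecutive rounds.
    Returns (best_score, rounds_run).
    """
    best_score = float("-inf")
    rounds_since_improvement = 0
    rounds_run = 0
    for i in range(max_iters):
        score = train_one_round(i)
        rounds_run += 1
        if score > best_score:
            best_score = score
            rounds_since_improvement = 0
        else:
            rounds_since_improvement += 1
            if rounds_since_improvement >= patience:
                break  # early stop: no improvement for `patience` rounds
    return best_score, rounds_run


# Toy stand-in: validation score improves until round 4, then plateaus,
# so the loop should stop well before max_iters.
scores = [0.60, 0.70, 0.75, 0.78, 0.80, 0.80, 0.80, 0.80, 0.80, 0.80]
best, ran = early_stopping_loop(lambda i: scores[i], max_iters=10, patience=3)
# best == 0.80, ran == 8 (stopped after 3 non-improving rounds)
```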
Hmm, not sure how we're going to support CatBoost, but will open an issue to track lightgbm.
@richardliaw is early stopping enabled using cross-validation like I mentioned here? Because CatBoost has a …
I'm getting the following error while setting `early_stopping=True`.
These could be potential fixes:
- Call `fit()` multiple times on the same model and keep the previously fitted state, like the `partial_fit()` in some `sklearn` classifiers (microsoft/LightGBM#2718).
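The distinction the comment above leans on is that `fit()` conventionally resets an estimator while `partial_fit()` keeps its state across calls. A toy illustration of that contract (the `RunningMean` class is hypothetical, just to show the two behaviors side by side; it is not part of sklearn or tune-sklearn):

```python
class RunningMean:
    """Toy estimator contrasting fit() (reset) with partial_fit() (accumulate),
    mirroring the sklearn convention referenced above."""

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def fit(self, xs):
        # fit() discards previous state, like most sklearn estimators.
        self.total, self.count = 0.0, 0
        return self.partial_fit(xs)

    def partial_fit(self, xs):
        # partial_fit() keeps previous state, enabling incremental training.
        self.total += sum(xs)
        self.count += len(xs)
        return self

    @property
    def mean_(self):
        return self.total / self.count


m = RunningMean()
m.partial_fit([1, 2, 3]).partial_fit([4, 5, 6])  # incremental: sees all six values
incremental_mean = m.mean_   # 3.5
m.fit([4, 5, 6])                                 # reset: sees only the last batch
reset_mean = m.mean_         # 5.0
```

An early-stopping wrapper built on this pattern could call `partial_fit()` once per "epoch" and check a validation score between calls, instead of retraining from scratch each time.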