
[Bug] LightGBM error while using early stopping #115

Closed
rohan-gt opened this issue Oct 6, 2020 · 2 comments

@rohan-gt

rohan-gt commented Oct 6, 2020

I'm getting the following error when setting early_stopping=True in TuneSearchCV with estimator = LGBMClassifier(early_stopping_rounds=50):

ValueError: For early stopping, at least one dataset and eval metric is required for evaluation
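For reference, a minimal sketch of the setup described above (the dataset and search space are placeholders I made up; only early_stopping, n_trials, max_iters, and the estimator come from this issue):

```python
# Hypothetical reproduction sketch for the error above.
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from tune_sklearn import TuneSearchCV

# Placeholder data just to make the example self-contained.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

estimator = LGBMClassifier(early_stopping_rounds=50)

# Illustrative search space; any valid LightGBM parameters would do.
param_dists = {
    "num_leaves": [15, 31, 63],
    "learning_rate": [0.01, 0.05, 0.1],
}

search = TuneSearchCV(
    estimator,
    param_distributions=param_dists,
    n_trials=10,
    early_stopping=True,  # raises the ValueError above with LGBMClassifier
    max_iters=10,
)
search.fit(X, y)
```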

The same works fine for XGBoost. A couple of related questions:

  1. Why isn't it mandatory for the user to set early_stopping_rounds on the estimator when setting early_stopping=True?
  2. I'm still not sure how max_iters comes into play. When I set n_trials=10 and max_iters=10, it seems to be running 100 trials anyway. How is the early stopping happening?
  3. Wouldn't it be better to merge early_stopping and early_stopping_rounds, as is done in LightGBMTunerCV?
  4. If another level of early stopping has to be implemented across trials (mentioned here), will the same ASHA scheduler work?
@richardliaw
Collaborator

cc @Yard1?

@Yard1
Member

Yard1 commented Oct 6, 2020

I don't think this is something we can influence, as it is early stopping inside LightGBM itself. Our early stopping is implemented on top of whatever early stopping the estimators do themselves. Are you using the release version or the master branch? IIRC, LightGBM early stopping support is not present in the former. I followed the logic @inventormc implemented for XGBoost, so they may have a better idea about the details, but the way I understand it: instead of refitting a model completely for each CV fold, the search essentially fits it on all the previous folds plus the current one. It therefore incrementally fits the estimator on a bigger and bigger portion of the dataset each time, stopping the trial if there is no improvement.
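A rough conceptual sketch of that incremental scheme (illustrative only, not the actual tune-sklearn code; the function and parameter names are made up for this example):

```python
# Conceptual sketch: fit on a growing set of CV folds and stop the trial
# when the validation score stops improving. The real implementation may
# warm-start the estimator instead of refitting from scratch each step.
import numpy as np
from lightgbm import LGBMClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import KFold

def incremental_trial(X, y, params, n_splits=5, patience=1):
    folds = list(KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X))
    best_score, bad_rounds = -np.inf, 0
    for i in range(1, n_splits):
        # Train on all folds seen so far, validate on the next one.
        train_idx = np.concatenate([folds[j][1] for j in range(i)])
        valid_idx = folds[i][1]
        model = LGBMClassifier(**params)
        model.fit(X[train_idx], y[train_idx])
        score = accuracy_score(y[valid_idx], model.predict(X[valid_idx]))
        if score > best_score:
            best_score, bad_rounds = score, 0
        else:
            bad_rounds += 1
            if bad_rounds >= patience:
                break  # no improvement: stop this trial early
    return best_score
```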

rohan-gt closed this as completed Nov 9, 2020