Question: is it possible to use early stopping of LightGBM, CatBoost? #251
Comments
Hi @benitocm,
Hi, when I have done this from scratch, I have used TimeSeriesSplit from scikit-learn to enforce the time constraints. In the case of HistGradientBoostingRegressor, I pass a TimeSeriesSplit as the CV to cross_val_score, expecting that the time constraints hold. Do you think those approaches are not correct? If so, I would appreciate your input. In the case of the darts library, a validation series can be provided to the fit method. Would something like that be the only way to use early stopping? Thanks very much for your time. Something related to this: unit8co/darts#1154
If the time constraints hold, I think there is no problem using it. Could you add a small example so we can double-check that the approach is correct?
Hi, in the case of HistGradientBoostingRegressor, I am using the `early_stopping` parameter.
In the case of CatBoost, I am using the `early_stopping_rounds` parameter.
Hi again, maybe I did not make myself clear. My goal in using early stopping is to take advantage of it to estimate a reasonable number of trees. In the case of CatBoost (according to this), when you set the number of trees, the algorithm itself selects a learning rate that is very close to the optimal one. That makes the tuning easier (which is not the case with the other GBT algorithms). Thanks very much
Hi,
I am using GBT algorithms as the base regressor for the forecaster. I am interested in using the early stopping feature of those kinds of algorithms. Is it possible?
In the case of HistGradientBoostingRegressor, I think it is easier because early stopping is configured differently.
Thank you in advance