User defined callbacks are not supported for GPU #1792
Comments
Ran into this same issue with catboost v1.0. This makes writing code that works on both CPU and GPU very difficult, particularly when I am trying to guarantee that catboost returns within a given time limit (via callbacks).
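The time-limit callback described above can be sketched like this for CPU training (the class name and time budget are illustrative, not part of catboost; the callback protocol is an object whose `after_iteration(info)` method returns `False` to stop training):

```python
import time


class TimeLimitCallback:
    """Stop CatBoost training once a wall-clock budget is exhausted.

    CatBoost calls after_iteration(info) once per boosting round;
    returning False halts training early. Works for CPU training only --
    with task_type="GPU" passing callbacks raises CatBoostError.
    """

    def __init__(self, max_seconds):
        self.max_seconds = max_seconds
        self.start = time.monotonic()

    def after_iteration(self, info):
        # Continue only while we are within the time budget.
        return (time.monotonic() - self.start) < self.max_seconds


# Sketch of usage (CPU only):
# from catboost import CatBoostClassifier
# clf = CatBoostClassifier(iterations=10_000)  # task_type defaults to CPU
# clf.fit(X, y, callbacks=[TimeLimitCallback(60)])
```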
Hello ✋ Do you have any idea whether callbacks will be supported in catboost in the future? Thanks!
@gaceladri ❤️
@Evgueni-Petrov-aka-espetrov That is amazing to hear!!
Any updates on this? It would be great to have pruning callbacks while training on GPU.
This is paused currently.
Would be really nice to see this feature in the next versions of catboost |
Any chance this is ever going to happen? What is the workaround? |
A BS student in SE graduating this summer has this as his thesis topic. Formally speaking, the workaround is using CPU.
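The CPU workaround above can be wrapped so the same code runs against either backend: pass user callbacks only when the model is not configured for GPU. A minimal sketch, assuming `fit_with_optional_callbacks` is a hypothetical helper (not a catboost API) and that `get_params()` reports `task_type` when it was set:

```python
def fit_with_optional_callbacks(model, X, y, callbacks=None):
    """Fit a CatBoost-style model, passing callbacks only for CPU training.

    User-defined callbacks raise CatBoostError under task_type="GPU",
    so they are dropped for GPU-configured models.
    """
    # get_params() typically includes task_type only if it was set;
    # assume CPU (the catboost default) when absent.
    task_type = model.get_params().get("task_type", "CPU")
    if callbacks and task_type != "GPU":
        model.fit(X, y, callbacks=callbacks)
    else:
        model.fit(X, y)
    return model
```

This keeps a single training path: on CPU you still get the time-limit or pruning callbacks, and on GPU training proceeds without them instead of raising.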
Problem:
I trained a model on GPU for a dataset. However, when I use clf.fit(X, y, callbacks=[mycallback1]), I get the error below:
<_catboost.CatBoostError: User defined loss functions, metrics and callbacks are not supported for GPU>
So how can I use my own callbacks when I train on GPU? Looking forward to your reply. catboost version: 0.26