
grid search with per_float_feature_quantization exports parameters wrong #1833

Closed
acrofales opened this issue Sep 2, 2021 · 0 comments
acrofales commented Sep 2, 2021

Problem:
When per_float_feature_quantization is one of the hyperparameters being optimized in grid_search, the parameter is exported incorrectly:
{'params': {'depth': 6, 'verbose': 200, 'l2_leaf_reg': 1, 'iterations': 500, 'learning_rate': 0.1, 'per_float_feature_quantization': [0.0]},
This breaks grid_search when run with refit=True, and it is equally problematic if you extract the params and try to refit manually (or do anything else with the optimal hyperparameters).

A workaround is to inspect the catboost_training.json file, find the run with the best performance, and copy the parameters from there, but this is quite tedious.

catboost version: 0.26
Operating System: Linux
CPU: Tried on a local laptop (Intel Core i5) as well as a Google Compute Engine n1-standard-4

Edit: I switched to sklearn's GridSearchCV instead, and there I get an error during fitting (probably suppressed in the CatBoost grid_search) saying that I cannot use None for per_float_feature_quantization. Perhaps this is related?
In that case, a related question is: how can I include None as a potential value for per_float_feature_quantization in a grid search? I tried using an empty string, but that isn't accepted either.
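One way around the None restriction, sketched with only the standard library: expand the grid manually and drop the key entirely from any combination where its value is None, so CatBoost falls back to its default quantization for those runs. The parameter values here are illustrative assumptions:

```python
# Sketch: build the parameter combinations by hand and omit the key when the
# candidate value is None, instead of passing None to CatBoost directly.
from itertools import product

grid = {
    'depth': [4, 6],
    'per_float_feature_quantization': [None, ['0:border_count=1024']],
}

keys = list(grid)
combos = [dict(zip(keys, vals)) for vals in product(*grid.values())]

# Combinations where the value was None simply lack the key, so CatBoost
# would use its default quantization for that run.
cleaned = [{k: v for k, v in c.items() if v is not None} for c in combos]
print(cleaned)
```

Each dict in `cleaned` can then be passed as params to a separate fit (or fed to a cross-validation loop), which sidesteps putting None into the grid itself.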
