This repository has been archived by the owner on Nov 14, 2023. It is now read-only.

[Bug] ConfigurationSpace throws error with BOHB #70

Closed
rohan-gt opened this issue Aug 19, 2020 · 4 comments · Fixed by #76

Comments


rohan-gt commented Aug 19, 2020

Passing a ConfigurationSpace after setting search_optimization='bohb' in TuneSearchCV throws the following error:

/usr/local/lib/python3.6/dist-packages/tune_sklearn/tune_basesearch.py in fit(self, X, y, groups, **fit_params)
    383                               "To show process output, set verbose=2.")
    384 
--> 385             result = self._fit(X, y, groups, **fit_params)
    386 
    387             if not ray_init and ray.is_initialized():

/usr/local/lib/python3.6/dist-packages/tune_sklearn/tune_basesearch.py in _fit(self, X, y, groups, **fit_params)
    331         config["n_jobs"] = self.sk_n_jobs
    332 
--> 333         self._fill_config_hyperparam(config)
    334         analysis = self._tune_run(config, resources_per_trial)
    335 

/usr/local/lib/python3.6/dist-packages/tune_sklearn/tune_search.py in _fill_config_hyperparam(self, config)
    336         samples = 1
    337         all_lists = True
--> 338         for key, distribution in self.param_distributions.items():
    339             if isinstance(distribution, list):
    340                 import random

AttributeError: 'ConfigurationSpace' object has no attribute 'items'
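The immediate cause is visible in the traceback: `_fill_config_hyperparam` iterates `self.param_distributions.items()`, but in this version a `ConfigurationSpace` object is not dict-like and provides no `items()` method. A minimal sketch of the mismatch, using a stand-in class rather than the real `ConfigSpace` library:

```python
class FakeConfigSpace:
    """Stand-in for CS.ConfigurationSpace (hypothetical): like the real
    class in the ConfigSpace version used here, it is not a dict and
    defines no .items() method."""
    pass

param_distributions = FakeConfigSpace()

try:
    # This mirrors what tune-sklearn's _fill_config_hyperparam does:
    for key, distribution in param_distributions.items():
        pass
except AttributeError as e:
    print(e)  # 'FakeConfigSpace' object has no attribute 'items'
```

Any non-dict object passed as `param_distributions` hits the same `AttributeError` before the search ever starts.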
rohan-gt (Author) commented

The `_get_bohb_config_space()` function also doesn't support integer spaces defined with `UniformIntegerHyperparameter`.

Yard1 (Member) commented Aug 26, 2020

@rohan-gt Pass a list of CS hyperparameters instead. I'll think about fixing this later.

arainboldt commented

Getting the same error with tune-sklearn version 0.2.1:


Param Space: 
Configuration space object:
  Hyperparameters:
    boosting_type, Type: Categorical, Choices: {gbdt, dart, rf}, Default: gbdt
    colsample_bytree, Type: UniformFloat, Range: [0.3, 0.7], Default: 0.5
    learning_rate, Type: UniformFloat, Range: [1e-10, 0.1], Default: 0.05
    max_depth, Type: UniformInteger, Range: [1, 10], Default: 6
    n_estimators, Type: UniformInteger, Range: [10, 200], Default: 105
    num_leaves, Type: UniformInteger, Range: [10, 100], Default: 55
    reg_alpha, Type: UniformFloat, Range: [0.0, 1.0], Default: 0.5
    reg_lambda, Type: UniformFloat, Range: [0.0, 1.0], Default: 0.5
    subsample, Type: UniformFloat, Range: [0.3, 0.8], Default: 0.55
    subsample_freq, Type: UniformInteger, Range: [3, 7], Default: 5

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-42-5be89034031c> in <module>
     35 base_est = LGBMClassifier(class_weight='balanced',random_state=42,n_jobs=-2,silent=True,objective='multiclass')
     36 
---> 37 model = param_search(base_est,gen_config_space(lgbm_config_dists),train_data,y)
     38 full_glb_preds = pd.DataFrame(model.predict_proba(pred_data),index=pred_data.index )\
     39                              .rename(codes_to_labels,axis=1)

<ipython-input-41-35f8c014e1bd> in param_search(estimator, param_space, X, y, scorer, n_iters, n_folds, search_method, model_name)
     93 
     94 
---> 95     param_model.fit(X,y)
     96 
     97     print(f'Best Model Score: {param_model.score(X,y)}')

~/lib/python3.8/site-packages/tune_sklearn/tune_basesearch.py in fit(self, X, y, groups, **fit_params)
    662                                     "To show process output, set verbose=2.")
    663 
--> 664             result = self._fit(X, y, groups, **fit_params)
    665 
    666             if not ray_init and ray.is_initialized():

~/lib/python3.8/site-packages/tune_sklearn/tune_basesearch.py in _fit(self, X, y, groups, **fit_params)
    563 
    564         self._fill_config_hyperparam(config)
--> 565         analysis = self._tune_run(config, resources_per_trial)
    566 
    567         self.cv_results_ = self._format_results(self.n_splits, analysis)

~/lib/python3.8/site-packages/tune_sklearn/tune_search.py in _tune_run(self, config, resources_per_trial)
    652             search_space = None
    653             override_search_space = True
--> 654             if self._is_param_distributions_all_tune_domains():
    655                 run_args["config"].update(self.param_distributions)
    656                 override_search_space = False

~/lib/python3.8/site-packages/tune_sklearn/tune_search.py in _is_param_distributions_all_tune_domains(self)
    440     def _is_param_distributions_all_tune_domains(self):
    441         return all(
--> 442             isinstance(v, Domain) for k, v in self.param_distributions.items())
    443 
    444     def _get_bohb_config_space(self):

AttributeError: 'ConfigurationSpace' object has no attribute 'items'


Following @Yard1's suggestion to pass a list of CS hyperparameters instead of the ConfigurationSpace object, I get the error below:

Param Space: [boosting_type, Type: Categorical, Choices: {gbdt, dart, rf}, Default: gbdt, num_leaves, Type: UniformInteger, Range: [10, 100], Default: 55, max_depth, Type: UniformInteger, Range: [1, 10], Default: 6, learning_rate, Type: UniformFloat, Range: [1e-10, 0.1], Default: 0.05, n_estimators, Type: UniformInteger, Range: [10, 200], Default: 105, subsample, Type: UniformFloat, Range: [0.3, 0.8], Default: 0.55, subsample_freq, Type: UniformInteger, Range: [3, 7], Default: 5, colsample_bytree, Type: UniformFloat, Range: [0.3, 0.7], Default: 0.5, reg_alpha, Type: UniformFloat, Range: [0.0, 1.0], Default: 0.5, reg_lambda, Type: UniformFloat, Range: [0.0, 1.0], Default: 0.5]

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-38-5be89034031c> in <module>
     35 base_est = LGBMClassifier(class_weight='balanced',random_state=42,n_jobs=-2,silent=True,objective='multiclass')
     36 
---> 37 model = param_search(base_est,gen_config_space(lgbm_config_dists),train_data,y)
     38 full_glb_preds = pd.DataFrame(model.predict_proba(pred_data),index=pred_data.index )\
     39                              .rename(codes_to_labels,axis=1)

<ipython-input-37-e819165f716b> in param_search(estimator, param_space, X, y, scorer, n_iters, n_folds, search_method, model_name)
     71 
     72 
---> 73     param_model = TuneSearchCV(estimator, 
     74                       param_space,
     75                       early_stopping=True,

~/lib/python3.8/site-packages/tune_sklearn/tune_search.py in __init__(self, estimator, param_distributions, early_stopping, n_trials, scoring, n_jobs, refit, cv, verbose, random_state, error_score, return_train_score, local_dir, max_iters, search_optimization, use_gpu, loggers, pipeline_auto_early_stop, stopper, time_budget_s, sk_n_jobs, **search_kwargs)
    323         if isinstance(param_distributions, list):
    324             if search_optimization != "random":
--> 325                 raise ValueError("list of dictionaries for parameters "
    326                                  "is not supported for non-random search")
    327 

ValueError: list of dictionaries for parameters is not supported for non-random search

Yard1 (Member) commented May 25, 2021

@arainboldt Thanks for the report, this will be fixed. In the meantime, can you pass a dictionary of Ray Tune hyperparameters? Also, please update to 0.3.0.
