Errors when there is only one option for a hyperparameter #16

Closed
mattwedge opened this issue Feb 24, 2021 · 1 comment
Labels
bug Something isn't working

Comments

@mattwedge

Describe the bug
Sometimes, while trying out different combinations of hyperparameters, I find it useful to fix a hyperparameter to a specific value (e.g. when I have determined that a specific value is better than all others). However, doing this by simply passing a list of length 1 for that parameter in search_space throws a ValueError. I could fix the values downstream instead (sketched just below), but I lose some flexibility by doing so.
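
By "fix the values downstream" I mean something like this (just an illustrative sketch, reusing the names from the reproduction example below):

# workaround: drop the fixed hyperparameter from the search space
# and hard-code it inside the objective function instead
search_space = {
    "param_2": [0.01, 0.02, 0.03, 0.04],
}

def my_func(optimizer):
    param_1 = 1  # fixed value, no longer part of the search space
    return optimizer["param_2"]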

This issue has been observed for several different optimizers, including TreeStructuredParzenEstimators, BayesianOptimizer and DecisionTreeOptimizer, but not for others (HillClimbingOptimizer, ParticleSwarmOptimizer).

Code to reproduce the behavior

from hyperactive import Hyperactive, TreeStructuredParzenEstimators

search_space = {
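    # param_1 is fixed to a single value; this is what triggers the error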
    "param_1": [1],
    "param_2": [0.01, 0.02, 0.03, 0.04],
}

def my_func(optimizer):
    return optimizer["param_2"]

hyper = Hyperactive()
optimizer = TreeStructuredParzenEstimators()
n_iter = 20

hyper.add_search(
    objective_function=my_func,
    search_space=search_space,
    optimizer=optimizer,
    n_iter=n_iter,
)

hyper.run()

Error message from command line
ValueError: Found array with 0 sample(s) (shape=(0, 2)) while a minimum of 1 is required.

System information:

  • OS Platform and Distribution
    WSL (Ubuntu 18.04) on Windows 10

  • Python version
    3.7

  • Hyperactive version
    3.0.4

Additional context
Full error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/hyperactive/hyperactive.py", line 199, in run
    self.results_list = run_search(self.process_infos, self.distribution)
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/hyperactive/run_search.py", line 42, in run_search
    results_list = single_process(_process_, process_infos)
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/hyperactive/distribution.py", line 10, in single_process
    results = [process_func(**search_processes_infos[0])]
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/hyperactive/process.py", line 34, in _process_
    nth_process=nth_process,
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/hyperactive/optimizers.py", line 160, in search
    nth_process,
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/gradient_free_optimizers/search.py", line 146, in search
    self._iteration(nth_iter)
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/gradient_free_optimizers/times_tracker.py", line 27, in wrapper
    res = func(self, *args, **kwargs)
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/gradient_free_optimizers/search.py", line 65, in _iteration
    pos_new = self.iterate()
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/gradient_free_optimizers/optimizers/base_optimizer.py", line 36, in wrapper
    pos = func(self, *args, **kwargs)
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/gradient_free_optimizers/optimizers/sequence_model/smbo.py", line 67, in wrapper
    pos = func(self, *args, **kwargs)
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/gradient_free_optimizers/optimizers/base_optimizer.py", line 47, in wrapper
    return func(self, *args, **kwargs)
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/gradient_free_optimizers/optimizers/sequence_model/tree_structured_parzen_estimators.py", line 82, in iterate
    return self.propose_location()
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/gradient_free_optimizers/optimizers/sequence_model/tree_structured_parzen_estimators.py", line 70, in propose_location
    exp_imp = self.expected_improvement()
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/gradient_free_optimizers/optimizers/sequence_model/tree_structured_parzen_estimators.py", line 45, in expected_improvement
    logprob_best = self.kd_best.score_samples(self.all_pos_comb)
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/sklearn/neighbors/_kde.py", line 190, in score_samples
    X = check_array(X, order='C', dtype=DTYPE)
  File "/home/wedge/anaconda3/lib/python3.7/site-packages/sklearn/utils/validation.py", line 586, in check_array
    context))
ValueError: Found array with 0 sample(s) (shape=(0, 2)) while a minimum of 1 is required.
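
The last frame shows KernelDensity.score_samples being called on an array of shape (0, 2), i.e. all_pos_comb appears to be empty by the time the TPE step runs. For what it's worth, the same ValueError can be reproduced with sklearn alone (a minimal sketch, not the actual code path in gradient_free_optimizers):

import numpy as np
from sklearn.neighbors import KernelDensity

# fit a KDE on a single observed position (values are arbitrary)
kde = KernelDensity().fit(np.array([[1.0, 0.01]]))

# scoring an empty (0, 2) array raises the same error as in the traceback:
# ValueError: Found array with 0 sample(s) (shape=(0, 2)) while a minimum of 1 is required.
kde.score_samples(np.empty((0, 2)))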
@mattwedge mattwedge added the bug Something isn't working label Feb 24, 2021
@SimonBlanke
Owner

Hello @mattwedge,

your detailed description of the bug helped me quickly find the problem in the SMBO optimizers. Thanks for taking the time :-)
However, this is a problem within the optimization backend, which lives in another repository. I will open an issue there, and then we can solve this problem.

Posting in the wrong repository happens a lot here. I will add a short guide to the bug issue template that shows how to tell whether an error belongs in the Gradient-Free-Optimizers repository.
