ValueError: The value argument must be within the support #2
This is a rather frequent issue when I only have one option for a specific hyperparameter. Sometimes I just want to fix the optimizer of a neural network to be "Adam", and it raises this exception.
@hujiaxin0 @zhenlingcn Can you provide me a minimal code to reproduce this issue?
Why not use the `fix_input` argument? It works for me.
@ywran Thanks! That works around my problem. However, it does not seem to address the root cause: judging from the exception stack, the issue appears to be related to PyTorch.
Hi @zhenlingcn, I wrote the code below trying to reproduce your issue, but it seems to work fine without crashing:

```python
import numpy as np
import pandas as pd

from hebo.design_space import DesignSpace
from hebo.optimizers.hebo import HEBO

def dummy_obj(para: pd.DataFrame) -> np.ndarray:
    return np.ones((para.shape[0], 1))

if __name__ == '__main__':
    space = DesignSpace().parse([{'name': 'optimizer', 'type': 'cat', 'categories': ['adam']}])
    opt = HEBO(space)
    for i in range(10):
        rec = opt.suggest()
        obs = dummy_obj(rec)
        opt.observe(rec, obs)
        print(obs)
```
@Alaya-in-Matrix Thanks for your response. I found that this problem occurs with HEBO v0.1 but disappears after installing the latest version of HEBO.
@zhenlingcn Nice! |
Changing `HEBO(space)` to `HEBO(space, model_name='gp')` can also solve this problem.
```
2021-11-01 08:02:44.622 ERROR Traceback (most recent call last):
  File "/home/ma-user/work/automl-1.8_EI/vega/core/pipeline/pipeline.py", line 79, in run
    pipestep.do()
  File "/home/ma-user/work/automl-1.8_EI/vega/core/pipeline/search_pipe_step.py", line 55, in do
    self._dispatch_trainer(res)
  File "/home/ma-user/work/automl-1.8_EI/vega/core/pipeline/search_pipe_step.py", line 73, in _dispatch_trainer
    self.master.run(trainer, evaluator)
  File "/home/ma-user/work/automl-1.8_EI/vega/core/scheduler/local_master.py", line 63, in run
    self._update(step_name, worker_id)
  File "/home/ma-user/work/automl-1.8_EI/vega/core/scheduler/local_master.py", line 71, in _update
    self.update_func(step_name, worker_id)
  File "/home/ma-user/work/automl-1.8_EI/vega/core/pipeline/generator.py", line 131, in update
    self.search_alg.update(record.serialize())
  File "/home/ma-user/work/automl-1.8_EI/vega/algorithms/hpo/hpo_base.py", line 84, in update
    self.hpo.add_score(config_id, int(rung_id), rewards)
  File "/home/ma-user/work/automl-1.8_EI/vega/algorithms/hpo/sha_base/boss.py", line 230, in add_score
    self._set_next_ssa()
  File "/home/ma-user/work/automl-1.8_EI/vega/algorithms/hpo/sha_base/boss.py", line 159, in _set_next_ssa
    configs = self.tuner.propose(self.iter_list[iter])
  File "/home/ma-user/work/automl-1.8_EI/vega/algorithms/hpo/sha_base/hebo_adaptor.py", line 70, in propose
    suggestions = self.hebo.suggest(n_suggestions=num)
  File "/home/ma-user/work/model_zoo/HEBO-master/HEBO/hebo/optimizers/hebo.py", line 126, in suggest
    rec = opt.optimize(initial_suggest = best_x, fix_input = fix_input).drop_duplicates()
  File "/home/ma-user/work/model_zoo/HEBO-master/HEBO/hebo/acq_optimizers/evolution_optimizer.py", line 122, in optimize
    print("optimize: ", prob, algo, self.iter)
  File "/home/ma-user/miniconda3/envs/MindSpore-python3.7-aarch64/lib/python3.7/site-packages/pymoo/model/problem.py", line 448, in __str__
    s += "# f(xl): %s\n" % self.evaluate(self.xl)[0]
  File "/home/ma-user/miniconda3/envs/MindSpore-python3.7-aarch64/lib/python3.7/site-packages/pymoo/model/problem.py", line 267, in evaluate
    out = self._evaluate_batch(X, calc_gradient, out, *args, **kwargs)
  File "/home/ma-user/miniconda3/envs/MindSpore-python3.7-aarch64/lib/python3.7/site-packages/pymoo/model/problem.py", line 335, in _evaluate_batch
    self._evaluate(X, out, *args, **kwargs)
  File "/home/ma-user/work/model_zoo/HEBO-master/HEBO/hebo/acq_optimizers/evolution_optimizer.py", line 50, in _evaluate
    acq_eval = self.acq(xcont, xenum).numpy().reshape(num_x, self.acq.num_obj + self.acq.num_constr)
  File "/home/ma-user/work/model_zoo/HEBO-master/HEBO/hebo/acquisitions/acq.py", line 39, in __call__
    return self.eval(x, xe)
  File "/home/ma-user/work/model_zoo/HEBO-master/HEBO/hebo/acquisitions/acq.py", line 157, in eval
    log_phi = dist.log_prob(normed)
  File "/home/ma-user/miniconda3/envs/MindSpore-python3.7-aarch64/lib/python3.7/site-packages/torch/distributions/normal.py", line 73, in log_prob
    self._validate_sample(value)
  File "/home/ma-user/miniconda3/envs/MindSpore-python3.7-aarch64/lib/python3.7/site-packages/torch/distributions/distribution.py", line 277, in _validate_sample
    raise ValueError('The value argument must be within the support')
ValueError: The value argument must be within the support
```
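For context, the final frames of this traceback are PyTorch's argument validation: `Normal.log_prob` calls `_validate_sample`, which rejects any value outside the distribution's support. Since a normal distribution's support is the whole real line, a NaN in the input is the typical trigger (that it comes from a degenerate surrogate model is my assumption, not confirmed from the trace). A minimal sketch reproducing the same exception:

```python
import torch
from torch.distributions import Normal

dist = Normal(torch.zeros(1), torch.ones(1), validate_args=True)

raised = False
try:
    # NaN fails the real-line support check inside _validate_sample,
    # raising the same ValueError seen at the bottom of the traceback.
    dist.log_prob(torch.tensor([float('nan')]))
except ValueError:
    raised = True
```

Newer PyTorch releases use a longer, more descriptive message, but the failure mode is the same.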