HPO converts all hyperparameters into strings #1238
Comments
@wxdrizzle Can you provide a simple code example? BTW, why did you go with [...]
Hi @ainoam, thanks a lot for your reply!

Code example

First, I created a file with the following content:

```python
from clearml import Task
import sys

cli = sys.argv[1:]
if '--manually' in cli:
    run_by_agent = False
else:
    run_by_agent = True

def read_yaml(run_by_agent):
    if run_by_agent:
        dict_params = {}
    else:
        dict_params = {
            'dataset/modalities': ['CT', 'MRI'],
            'model/name': 'u-net',
        }
    return dict_params

if not run_by_agent:
    task = Task.init(project_name='tmp_project', task_name='tmp_task', task_type="training")
    # the following two lines are because I want the agents to use my existing Python environment
    task.set_base_docker(docker_image='/home/xxx/software/anaconda3/envs/research')
    task.set_packages([])
    dict_params = read_yaml(run_by_agent)
    task.set_parameters(dict_params)
else:
    task = Task.init()
    dict_params = task.get_parameters(cast=True)

print('run by agent?', run_by_agent)
print('dict_params: ', dict_params)
print('type of dataset/modalities', type(dict_params['dataset/modalities']))
```

Then I executed this file manually, by passing `--manually` on the command line.
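For reference, the type loss itself can be reproduced without ClearML: once a value is serialized to text and read back without casting, the original type is gone. A minimal sketch (illustration only, not ClearML code):

```python
# Illustration only, not ClearML code: a list value that is stored as a
# string keeps the string type until something explicitly casts it back.
original = {'dataset/modalities': ['CT', 'MRI'], 'model/name': 'u-net'}

# Simulate how parameters come back from the HPO-created task:
stored = {k: str(v) for k, v in original.items()}

print(type(original['dataset/modalities']))  # <class 'list'>
print(type(stored['dataset/modalities']))    # <class 'str'>
```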
Note that because I passed `--manually`, `run_by_agent` was `False`. You can see the type of the hyperparameter "dataset/modalities" is "list", as expected.

Next, I found the ID of the generated task was "13f202cc8a014ba4b92f1e93e34352d1". Then, I created another file:

```python
from clearml.automation import UniformParameterRange, UniformIntegerParameterRange, ParameterSet, DiscreteParameterRange
from clearml.automation import HyperParameterOptimizer, GridSearch, Objective
from clearml.automation.optuna import OptimizerOptuna
from clearml import Task

task = Task.init(project_name='tmp_project', task_name='hyperparam_optim', task_type=Task.TaskTypes.optimizer,
                 reuse_last_task_id=False)

objective_metric = Objective('test', 'dice_mean')
optimizer = GridSearch(
    base_task_id='13f202cc8a014ba4b92f1e93e34352d1',
    hyper_parameters=[
        DiscreteParameterRange('model/name', ['resnet']),
    ],
    objective_metric=objective_metric,
    num_concurrent_workers=16,
    objective_metric_title='test',
    objective_metric_series='dice_mean',
    objective_metric_sign='max',
    execution_queue='one_gpu_work',
    max_iteration_per_job=50000,
)
optimizer.start()
optimizer.wait()
optimizer.stop()
```

Note that the `base_task_id` is the ID of the task generated above. Then I ran this file. You can see the type of the hyperparameter "dataset/modalities" changed from "list" to "str". As a current workaround, I have to manually go through all hyperparameters and convert the string values back to their original types. Regarding why I didn't use [...]
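The manual workaround can be sketched as follows; `cast_params` is my own helper, not a ClearML API, and `ast.literal_eval` is one safe way to turn stringified values back into Python objects:

```python
import ast

def cast_params(params):
    """Best-effort: convert stringified hyperparameter values back to
    their original Python types, leaving plain strings untouched."""
    out = {}
    for key, value in params.items():
        if isinstance(value, str):
            try:
                out[key] = ast.literal_eval(value)
            except (ValueError, SyntaxError):
                out[key] = value  # not a Python literal, keep the string
        else:
            out[key] = value
    return out

# Example: what the HPO-created task hands back vs. what training expects
params = {'dataset/modalities': "['CT', 'MRI']", 'model/name': 'u-net'}
fixed = cast_params(params)
print(type(fixed['dataset/modalities']))  # <class 'list'>
print(fixed['model/name'])                # u-net
```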
Describe the bug
Hi, I'm using the Hyperparameter Optimizer (HPO), but I found that the generated tasks failed because all the hyperparameters became strings rather than keeping their original types.
To reproduce
Initially, I have a completed task whose hyperparameters are managed by my code. Specifically, I first run a task manually, in which my code reads values from a YAML file and generates the hyperparameter dict, then calls `task.set_parameters(dict_params)`.

Then I started to use HPO based on this task. My code detects whether a task is run by an agent, and if so, it uses `task.get_parameters(cast=True)` to get the hyperparameters for training. This works well if I manually clone the initial task and send it to a queue.

However, when I used the HPO, the new tasks created by HPO just failed. I found that the hyperparameter values returned by `task.get_parameters(cast=True)` are all of type `str`, except for the hyperparameters I specified to optimize. Is there any way to solve this issue? Thank you very much!

Expected behaviour
When a task is run by the HPO, the hyperparameter values returned by `task.get_parameters(cast=True)` should have the same types as the values from the task whose ID is passed as `base_task_id` to `HyperParameterOptimizer`.

Environment