
Error when running by config with a list of batch sizes #38

Closed

tripl3a opened this issue Aug 3, 2019 · 1 comment
tripl3a commented Aug 3, 2019

The config was like this:

"batch_size":                   {"value": [16, 32], "default": 48,  "desc": "Total batch size for training for single GPU v100. If using multiGPU, the total batch size will be automatically adjusted."},

This led to the following, somewhat odd, error:

Traceback (most recent call last):
  File "run_nohate_experiments.py", line 16, in <module>
    main()
  File "run_nohate_experiments.py", line 12, in main
    run_experiment(experiment)
  File "/workspace/FARM/farm/experiment.py", line 79, in run_experiment
    processor=processor, batch_size=args.batch_size, distributed=distributed
  File "/workspace/FARM/farm/data_handler/data_silo.py", line 39, in __init__
    self._load_data()
  File "/workspace/FARM/farm/data_handler/data_silo.py", line 66, in _load_data
    self._initialize_data_loaders()
  File "/workspace/FARM/farm/data_handler/data_silo.py", line 79, in _initialize_data_loaders
    tensor_names=self.tensor_names,
  File "/workspace/FARM/farm/data_handler/dataloader.py", line 49, in __init__
    collate_fn=collate_fn,
  File "/opt/conda/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 179, in __init__
    batch_sampler = BatchSampler(sampler, batch_size, drop_last)
  File "/opt/conda/lib/python3.6/site-packages/torch/utils/data/sampler.py", line 162, in __init__
    "but got batch_size={}".format(batch_size))
ValueError: batch_size should be a positive integer value, but got batch_size=16
tholor added the bug label on Aug 5, 2019
tanaysoni added a commit that referenced this issue Aug 5, 2019
The lists in configs are converted to numpy arrays. The batch_size
argument is now type-cast to the Python `int` type.
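
The confusing part of the traceback is that batch_size=16 looks like a perfectly valid value; it is the value's type that fails PyTorch's check. Below is a minimal sketch of the failure mode, not FARM's actual code, with the validation logic paraphrased from torch/utils/data/sampler.py, showing how a numpy.int64 slips through and why casting with int() fixes it:

```python
import numpy as np

# Config lists end up as numpy arrays, so indexing yields numpy.int64,
# not a built-in int.
batch_sizes = np.array([16, 32])
batch_size = batch_sizes[0]

print(type(batch_size))             # <class 'numpy.int64'>
print(isinstance(batch_size, int))  # False on Python 3

# PyTorch's BatchSampler validates batch_size roughly like this, hence
# the misleading message: the value 16 is fine, its type is not.
if not isinstance(batch_size, int) or batch_size <= 0:
    print("ValueError: batch_size should be a positive integer value, "
          "but got batch_size={}".format(batch_size))

# The fix from the referenced commit: cast to a plain Python int
# before handing the value to the DataLoader.
batch_size = int(batch_sizes[0])
print(isinstance(batch_size, int))  # True
```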
tanaysoni (Contributor) commented:

Hi @tripl3a, thank you for reporting the issue. It was a type error fixed by #41.
