The config was like this:

```json
"batch_size": {"value": [16, 32], "default": 48, "desc": "Total batch size for training on a single V100 GPU. If using multi-GPU, the total batch size will be adjusted automatically."}
```
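For context, here is a sketch of how such a hyperparameter entry would normally be resolved to a single integer before it reaches the `DataLoader`. The helper name `resolve_param` is hypothetical (not FARM's API); the point is that downstream code must receive a plain Python `int`, not the list of search values:

```python
def resolve_param(entry, chosen=None):
    """Pick one concrete value from a {"value": [...], "default": ...} entry.

    `chosen` selects one of the search values; otherwise fall back to the
    default. Cast to int so downstream code (e.g. torch's BatchSampler)
    receives a plain Python int rather than a list or string.
    """
    if chosen is not None:
        if chosen not in entry["value"]:
            raise ValueError(f"{chosen} not in search space {entry['value']}")
        return int(chosen)
    return int(entry["default"])


batch_size_cfg = {"value": [16, 32], "default": 48,
                  "desc": "Total batch size for training"}
print(resolve_param(batch_size_cfg))      # default -> 48
print(resolve_param(batch_size_cfg, 32))  # explicit search value -> 32
```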
This led to the following, somewhat odd, error:
```
Traceback (most recent call last):
  File "run_nohate_experiments.py", line 16, in <module>
    main()
  File "run_nohate_experiments.py", line 12, in main
    run_experiment(experiment)
  File "/workspace/FARM/farm/experiment.py", line 79, in run_experiment
    processor=processor, batch_size=args.batch_size, distributed=distributed
  File "/workspace/FARM/farm/data_handler/data_silo.py", line 39, in __init__
    self._load_data()
  File "/workspace/FARM/farm/data_handler/data_silo.py", line 66, in _load_data
    self._initialize_data_loaders()
  File "/workspace/FARM/farm/data_handler/data_silo.py", line 79, in _initialize_data_loaders
    tensor_names=self.tensor_names,
  File "/workspace/FARM/farm/data_handler/dataloader.py", line 49, in __init__
    collate_fn=collate_fn,
  File "/opt/conda/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 179, in __init__
    batch_sampler = BatchSampler(sampler, batch_size, drop_last)
  File "/opt/conda/lib/python3.6/site-packages/torch/utils/data/sampler.py", line 162, in __init__
    "but got batch_size={}".format(batch_size))
ValueError: batch_size should be a positive integer value, but got batch_size=16
```
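The error looks odd because `16` is a perfectly valid batch size. One plausible explanation (an assumption, not confirmed from the traceback alone): PyTorch's `BatchSampler` checks that `batch_size` is a plain Python `int`, and `str.format()` prints a string value without quotes, so a `batch_size` of `"16"` (e.g. parsed from a config without casting) produces exactly this message. A minimal reproduction:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.zeros(100, 3))

# A plain Python int works fine.
DataLoader(dataset, batch_size=16)

# A string fails the isinstance(batch_size, int) check in BatchSampler,
# yet format() renders it without quotes, so the error reads
# "got batch_size=16" even though the value is the string "16".
try:
    DataLoader(dataset, batch_size="16")
except ValueError as e:
    print(e)
```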