i) Multiple workers: Start Auto-PyTorch in multiple threads that all share the same run_id but have different task_id values. The worker with task_id=1 acts as the master. This is shown in the ensemble example.
ii) For image data, when a thread has access to more than one GPU, Auto-PyTorch automatically wraps the model in PyTorch's nn.DataParallel.
iii) Setting the number of threads for data loading in the autonet config sets the corresponding num_workers value in torch.utils.data.DataLoader.
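To make points ii) and iii) concrete, here is a minimal standalone sketch of what those two mechanisms look like in plain PyTorch. The toy model and dataset are hypothetical, purely for illustration; Auto-PyTorch applies the equivalent steps internally.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical toy model standing in for the network Auto-PyTorch builds.
model = nn.Linear(8, 2)

# (ii) With more than one visible GPU, nn.DataParallel splits each batch
# across devices; on a single GPU or CPU it simply passes calls through.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)

# (iii) num_workers controls how many subprocesses load data in parallel --
# this is the DataLoader value the autonet config option maps onto.
dataset = TensorDataset(torch.randn(32, 8), torch.randint(0, 2, (32,)))
loader = DataLoader(dataset, batch_size=8, num_workers=2)

for x, y in loader:
    out = model(x)  # per-batch output of shape (8, 2)
```

This is not Auto-PyTorch's exact internal code, just the standard PyTorch pattern it relies on.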
Thank you for the promising package.
I wonder how I can use multiple GPUs. I tinkered with num_workers and torch_num_threads, but nothing seemed to change. (I wish there were documentation about the configuration.) So is there any way to use multiple GPUs, or to set the default GPU to one other than 0?
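On the last point, selecting a GPU other than 0 is usually done at the process level rather than through Auto-PyTorch itself. A common sketch, using the standard CUDA_VISIBLE_DEVICES mechanism (plain CUDA/PyTorch behavior, not an Auto-PyTorch-specific option):

```python
import os

# Restrict which devices the process can see BEFORE importing torch:
# with this set, physical GPU 1 appears as cuda:0 inside the process.
os.environ["CUDA_VISIBLE_DEVICES"] = "1"

# import torch  # torch (and anything built on it) now only sees GPU 1
```

Whether Auto-PyTorch exposes a dedicated config option for this, I'm not sure; the environment-variable approach works regardless.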