
How to use multi-gpus? #54

Closed
maxmarketit opened this issue Jun 9, 2020 · 2 comments

Comments

@maxmarketit

Thank you for the promising package.

I wonder how I can use multiple GPUs.

I tinkered with num_workers and torch_num_threads, but nothing seemed to change.

(I wish there were documentation about the configuration options.)

So is there any way to use multiple GPUs, or to set a default GPU other than 0?

@LMZimmer
Contributor

There are a few ways of using parallelism:

i) Multiple workers: Start Auto-PyTorch in multiple threads, where all share the same run_id and each has a different task_id. The worker with task_id=1 will be the master. This is done in the ensemble example.

ii) For image data, when a thread has more than one GPU, it will automatically apply PyTorch's nn.DataParallel.

iii) Setting the number of threads for data loading in the autonet config sets the corresponding value in torch.utils.data.DataLoader.
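Items ii) and iii) correspond to standard PyTorch mechanisms, so they can be sketched outside of Auto-PyTorch. Below is a minimal illustration; the model and data are toy stand-ins, not Auto-PyTorch API, and the CUDA_VISIBLE_DEVICES note is the usual PyTorch-level way to pick a non-default GPU:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# To target a GPU other than 0, restrict visibility before launching, e.g.:
#   CUDA_VISIBLE_DEVICES=1 python train.py

# Toy model and data standing in for the networks Auto-PyTorch trains.
model = nn.Linear(8, 2)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

# ii) With more than one visible GPU, the model gets wrapped in
#     nn.DataParallel, which splits each batch across devices:
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)

# iii) The data-loading thread setting maps onto DataLoader's num_workers.
dataset = TensorDataset(torch.randn(32, 8), torch.randint(0, 2, (32,)))
loader = DataLoader(dataset, batch_size=8, num_workers=2)

for x, y in loader:
    out = model(x.to(device))  # forward pass is split across GPUs if wrapped
```

Note that nn.DataParallel replicates the model on every visible device and scatters each batch along dimension 0, so the batch size should be at least the number of GPUs.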

@astro-kevin

The Ensemble example link is broken. Is there an equivalent in the current examples?
