
[Question/Documentation] Is it possible to run an experiment with trials distributed across GPUs #66

Closed
gatapia opened this issue May 2, 2019 · 4 comments
Labels: enhancement (New feature or request), fixready (Fix has landed on master.), question (Further information is requested)
gatapia commented May 2, 2019

Tool looks great! But I'm wondering how you would go about running an experiment with trials distributed across GPUs (on a single machine). I am looking at the Service API / Developer API pages but cannot see how a client/server or queue structure would work (I have not dug through the code yet).

I'm after something like Ray to do the optimisation.

I think distributed experiments are a pretty important feature, so I'm assuming support has to be there somewhere; a tutorial would be great. I'm interested in a single-host / multi-GPU environment, but I'm sure multi-host would also be of value to people.

kkashin (Contributor) commented May 2, 2019

Thanks for raising this issue - we definitely agree that distributed experimentation is a really important feature, and there's actually an integration with Ray in flight. See ray-project/ray#4731 for the initial PR. There are still a couple of tweaks needed to make the parallelism work, but this should be available soon (hopefully by the end of next week).

After this is available, we'll be sure to update the documentation to explicitly mention how to run experiments in a distributed manner.

kkashin added the enhancement and question labels on May 2, 2019
richardliaw (Contributor) commented
BTW @gatapia, check out this tutorial notebook for using Ax and Tune. You'll have to add a ray.init line to connect to your Ray cluster. We'll also push this tutorial to the Ray docs.
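For reference, a minimal sketch of what that looks like, assuming Ray Tune's AxSearch integration: the experiment definition, train function, and metric below are illustrative stand-ins, and exact module paths and constructor arguments vary across Ray versions (e.g. ray.tune.suggest.ax on older releases vs. ray.tune.search.ax on newer ones).

```python
# Illustrative sketch (not the official tutorial): one GPU per trial, so Tune
# runs as many trials concurrently as there are GPUs on this machine.
import ray
from ray import tune
from ray.tune.suggest.ax import AxSearch  # ray.tune.search.ax on newer Ray
from ax.service.ax_client import AxClient

ray.init()  # connect to (or start) the Ray cluster on this machine

# Allow Ax to suggest several points at once so trials can run in parallel.
ax_client = AxClient(enforce_sequential_optimization=False)
ax_client.create_experiment(
    name="gpu_hpo_example",  # hypothetical experiment
    parameters=[
        {"name": "lr", "type": "range", "bounds": [1e-4, 1e-1], "log_scale": True},
    ],
    objective_name="accuracy",
)

def train(config):
    # Stand-in for a real training loop; report the metric Ax optimizes.
    accuracy = 1.0 - (config["lr"] - 0.01) ** 2
    tune.report(accuracy=accuracy)

tune.run(
    train,
    num_samples=8,
    search_alg=AxSearch(ax_client=ax_client),
    resources_per_trial={"gpu": 1},  # distributes trials across available GPUs
)
```

With resources_per_trial={"gpu": 1}, Tune schedules trials concurrently up to the number of GPUs the cluster reports, which is exactly the single-host / multi-GPU setup asked about above.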

lena-kashtelyan (Contributor) commented
The Ray Tune tutorial is now ready and checked in on master.

lena-kashtelyan added the fixready label on Jul 3, 2019
gatapia (Author) commented Jul 4, 2019

Great tutorial, feel free to close.
