Distributed training examples? #4

Closed
miriaford opened this issue Jun 2, 2020 · 4 comments

@miriaford

It looks like all of the scripts in the examples are single-process. Are there any examples of distributed training?
The paper mentions "Launchpad", which manages distributed processes. When do you plan to open-source that?

@bshahr
Collaborator

bshahr commented Jun 3, 2020

Hi miriaford!
We have only open-sourced our single-process agents. While the code that runs in our distributed agents is identical to the open-sourced agents, the former are indeed tied to Launchpad and other DeepMind infrastructure, and we don't currently have a timetable for releasing those components. We'll update the docs accordingly. Thanks for your question and interest in Acme!

@tomhennigan
Contributor

FYI, Launchpad has been released (https://github.com/deepmind/launchpad, https://deepmind.com/research/open-source/launchpad) and distributed Acme agents using Launchpad were added in 3bc0426.
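
For anyone landing here, below is a minimal sketch of how a Launchpad program is put together, closely following the producer/consumer example in Launchpad's README. The `Producer`/`Consumer` classes and the node layout are illustrative placeholders, not Acme's actual actor/learner/replay topology; for that, see the distributed agents added in 3bc0426.

```python
# Minimal Launchpad sketch (adapted from the example in Launchpad's README):
# one Consumer node fans work out to a group of Producer nodes over Courier
# RPC. Producer/Consumer are illustrative placeholders, not Acme classes.
import launchpad as lp


class Producer:

  def work(self, context):
    # Stand-in for real work, e.g. environment interaction in an actor.
    return context


class Consumer:

  def __init__(self, producers):
    # `producers` are Courier client handles to the producer nodes.
    self._producers = producers

  def run(self):
    # Issue asynchronous calls to every producer and block on the results.
    futures = [producer.futures.work(i)
               for i, producer in enumerate(self._producers)]
    results = [future.result() for future in futures]
    print('Results:', results)
    lp.stop()  # Stop the whole program once the work is done.


def make_program(num_producers):
  program = lp.Program('consumer_producers')
  with program.group('producer'):
    producers = [
        program.add_node(lp.CourierNode(Producer))
        for _ in range(num_producers)
    ]
  program.add_node(lp.CourierNode(Consumer, producers=producers),
                   label='consumer')
  return program


if __name__ == '__main__':
  lp.launch(make_program(num_producers=2))
```

The distributed Acme agents added in that commit assemble an `lp.Program` in roughly the same way, with replay, learner and actor nodes in place of these toy classes.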

@davidireland3

@tomhennigan is there an example anywhere of using Acme with distributed training, now that Launchpad has been released? I'm currently using Ray and it feels slower than it should (most of the overhead comes from passing the data from the server it's stored on to the learner).

@tomhennigan
Contributor

Hi @davidireland3, I'm not aware of any (though I haven't worked much on Acme). @bshahr, are you aware of any examples?
