
Any plan to decouple TF-PS and distributed graph engine? #60

Closed
fff-2013 opened this issue Jun 11, 2020 · 1 comment

Comments

@fff-2013

In the current implementation, the client and server of the graph engine are co-located with the TF worker and TF parameter server.

When I want to use one TF worker for training and multiple workers for sampling data simultaneously (for GPU training), the current architecture imposes some restrictions. So, is there any plan to decouple the TF-PS and the distributed graph engine to make the architecture more flexible?
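
For example, I would like a deployment shaped roughly like the sketch below, written with TF 1.x cluster primitives. The `graph_server` job name and host addresses are placeholders of my own, not graph-learn's actual API; the point is that the graph engine runs as its own job instead of sharing processes with the workers and PS:

```python
import tensorflow as tf

# Hypothetical layout: one GPU trainer, one TF parameter server, and two
# standalone graph-engine/sampling servers running as their own job.
# Host names and the "graph_server" job name are placeholders.
cluster = tf.train.ClusterSpec({
    "worker": ["trainer-0:2222"],                      # single training worker (GPU)
    "ps": ["ps-0:2222"],                               # TF parameter server
    "graph_server": ["graph-0:2222", "graph-1:2222"],  # decoupled graph engine
})

def start(job_name, task_index):
    server = tf.train.Server(cluster, job_name=job_name, task_index=task_index)
    if job_name in ("ps", "graph_server"):
        # Serving-only tasks: parameters / graph sampling, no training loop.
        server.join()
    else:
        # The single trainer pins variables to the PS and consumes samples
        # produced by the graph_server tasks.
        with tf.device(tf.train.replica_device_setter(
                worker_device="/job:worker/task:%d" % task_index,
                cluster=cluster)):
            pass  # build the model and training loop here
```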

@jackonan
Collaborator

Good suggestion, and a solution is coming.

baoleai closed this as completed Oct 14, 2022