This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →


pytorch: multi workers #85

Closed
tp-nan opened this issue Nov 17, 2021 · 1 comment


tp-nan commented Nov 17, 2021

class Inference(Worker):

Hi, for multi-instance usage, PyTorch uses the default CUDA stream by default, so the instances cannot run concurrently. How do you solve this problem? Multi-process?

@lkevinzc
Member

Hi @ShiyangZhang, yes, we use multi-process. All concurrent requests are queued, and multiple worker processes consume the queue.
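
The queue-and-consume scheme described above can be sketched with Python's standard `multiprocessing` module. This is only an illustration of the pattern, not mosec's actual implementation; the `task * 2` line is a stand-in for a real `model.forward(task)` call, and each process would load its own model copy (and thus get its own CUDA context) at startup:

```python
import multiprocessing as mp


def worker(task_queue, result_queue):
    # Each process owns its own model instance (and, on GPU, its own CUDA
    # context), so inference in different processes runs concurrently even
    # though each one uses the default CUDA stream.
    for task in iter(task_queue.get, None):  # None is the stop sentinel
        result_queue.put(task * 2)  # stand-in for model.forward(task)


def serve(requests, num_workers=2):
    task_queue = mp.Queue()
    result_queue = mp.Queue()
    procs = [
        mp.Process(target=worker, args=(task_queue, result_queue))
        for _ in range(num_workers)
    ]
    for p in procs:
        p.start()
    # All incoming requests go into one shared queue...
    for r in requests:
        task_queue.put(r)
    # ...followed by one sentinel per worker to shut them down.
    for _ in procs:
        task_queue.put(None)
    results = [result_queue.get() for _ in requests]
    for p in procs:
        p.join()
    return results


if __name__ == "__main__":
    print(sorted(serve([1, 2, 3, 4])))  # → [2, 4, 6, 8]
```

In mosec itself this wiring is handled for you: you subclass `Worker` (as in the `class Inference(Worker)` line quoted in the question) and ask the server to spawn several copies of it.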

@mosecorg mosecorg locked and limited conversation to collaborators Nov 20, 2021
@kemingy kemingy closed this as completed Nov 20, 2021

