
Calling model.predict() from multiple processes #1037

Closed
Rampagy opened this issue Apr 17, 2018 · 3 comments

Rampagy commented Apr 17, 2018

Maybe this is an issue with my implementation, with tflearn, or with tensorflow; I'm honestly not sure. The code hangs when I call model.predict() from multiple processes, even though the exact same code works when evaluated in a single process. See my reproduction steps here.

I know I have seen lots of issues with tensorflow hanging when trying to predict within multiple processes, so I am not sure if this is a tflearn issue or a tensorflow issue.


ckyleda commented Apr 17, 2018

Can confirm that this is a significant issue that has existed for literally as long as TensorFlow has; I don't understand how it still hasn't been fixed.

There seems to be no way to run inference from multiple processes in TensorFlow, at least as far as I have found.


Rampagy commented Apr 17, 2018

Dang. Thanks for looking into it.

@Rampagy Rampagy closed this as completed Apr 17, 2018

Lancerchiang commented Apr 23, 2018

@Rampagy Hi, please refer to the solution I wrote for this issue: tensorflow/tensorflow#8220. Multiprocessing works fine with TensorFlow.
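For context, the pattern discussed in threads like tensorflow/tensorflow#8220 is that a TensorFlow session created in the parent process does not survive a fork, so predict() hangs in the children; the workaround is to build the model inside each worker process (and, with Python's multiprocessing, the "spawn" start method guarantees a fresh interpreter with no inherited TensorFlow state). Below is a minimal, hedged sketch of that structure: `load_model` here is a hypothetical placeholder standing in for graph/session construction, not real tflearn or TensorFlow code.

```python
import multiprocessing as mp

def load_model():
    # Placeholder for model construction. In a real program, this is
    # where the tflearn/TensorFlow graph and session would be built --
    # crucially, *inside* the worker process, never in the parent
    # before forking.
    return lambda x: x * 2  # stand-in "predict" function

def worker(batch):
    # Each worker builds its own model instance. Nothing TensorFlow-
    # related is inherited from the parent, so predict() cannot hang
    # on state broken by fork().
    model = load_model()
    return [model(x) for x in batch]

if __name__ == "__main__":
    # "spawn" starts each child with a fresh Python interpreter, which
    # avoids inheriting any framework state from the parent process.
    ctx = mp.get_context("spawn")
    with ctx.Pool(processes=2) as pool:
        results = pool.map(worker, [[1, 2], [3, 4]])
    print(results)  # [[2, 4], [6, 8]]
```

The same idea applies if the parent must coordinate work: keep the parent free of any model/session objects, pass only plain data (arrays, file paths) to the workers, and let each worker own its model for its whole lifetime.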
