Exception when loading a Keras deep learning model with Celery #4172
Comments
@trangtv57 Can you post some logs or explain what the error is, if there is one? Load the model in your main Celery app and use it everywhere else it is required. Each worker will have a separate object.
Sorry if my question is confusing. I have an object `graph`, which is `tensorflow.get_default_graph()`. This object needs to be shared separately across all threads. When I run Celery in debug mode, no error is shown; it just gets stuck. (Note: I have the same problem, with no error shown and the process just hanging, when I load my model with multiprocessing, so I can't understand why Celery shows nothing and just gets stuck.)
A workaround is to make sure that all imports of TensorFlow (including Keras) occur only in the child (spawned) processes. This has allowed me to effectively load and use serialized TensorFlow models across child processes with Keras. See tensorflow/tensorflow#5448.
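The workaround above can be sketched with plain `multiprocessing` (which is what tensorflow/tensorflow#5448 discusses, and Celery's default prefork pool behaves similarly): nothing TensorFlow-related is imported in the parent process, and all heavy imports and model loading happen inside the child. `fake_load_model` and `child_job` are hypothetical names, and the loader is a stand-in so the sketch runs without TensorFlow installed.

```python
# Sketch of the workaround: keep TensorFlow/Keras imports out of the parent
# entirely; import and load only inside each child process.
import multiprocessing as mp

def fake_load_model():
    # In real code this would be:
    #     from keras.models import load_model   # imported HERE, in the child
    #     return load_model('path/to/model.h5')
    # A plain callable stands in so the sketch has no TensorFlow dependency.
    return lambda x: x * 2

def child_job(x, queue):
    model = fake_load_model()   # load inside the child process only
    queue.put(model(x))

if __name__ == '__main__':
    ctx = mp.get_context('fork')    # fork start method, as on Linux workers
    q = ctx.Queue()
    p = ctx.Process(target=child_job, args=(21, q))
    p.start()
    p.join()
    print(q.get())                  # the child's result: 42
```

Because the parent never touches TensorFlow, each child gets its own fresh session and graph, which avoids the hang described in the linked issue.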
Closing as not a bug; if any documentation or code fixes are suggested, please send a PR.
Checklist

- [ ] I have included the output of `celery -A proj report` in the issue. (If you are not able to do this, then at least specify the Celery version affected.)
- [ ] I have verified that the issue exists against the `master` branch of Celery.

Steps to reproduce
I want to load a model I previously trained in Keras from a Celery task. But TensorFlow (the Keras backend) only accepts loading one model per session (which is to say, per thread). So the problem is: how can I separate the threads in Celery, i.e. make each thread have its own Keras model object, so that it can run?
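One way to give each thread its own model object, as asked above, is `threading.local`, which stores a separate value per thread. This is a hedged sketch, not Celery's or Keras's API: `get_model` and `make_model` are hypothetical names, and in real code `make_model` would call `keras.models.load_model`; here it is left as a plain callable so the sketch runs without Keras installed.

```python
# Sketch: one model object per thread via threading.local, so threads never
# share a TensorFlow session/graph. `make_model` is a hypothetical factory.
import threading

_tls = threading.local()

def get_model(make_model):
    # The first call in each thread pays the load cost; later calls in the
    # same thread reuse that thread's own instance.
    if not hasattr(_tls, 'model'):
        _tls.model = make_model()
    return _tls.model
```

Each Celery worker thread calling `get_model` would then load and keep its own model instance, while repeated calls from the same thread return the same object.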
Expected behavior
Being able to run a deep learning (Keras) API with Celery.
Actual behavior