Can not load pb file in tensorflow serving #491
Comments
Hi @weipf8, it looks like your SavedModel may not have a graph corresponding to the serving tag. The exported SavedModel should have tag-sets corresponding to each graph; you would have specified these when you exported or saved the SavedModel. To inspect the available tag-sets in a SavedModel, you can use the SavedModel CLI: https://www.tensorflow.org/programmers_guide/saved_model_cli
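As a sketch of that inspection step (the directory path below is an example taken from this thread; point --dir at your own numbered version directory):

```shell
# List every tag-set the SavedModel on disk actually contains.
saved_model_cli show --dir /home/deep/serving/tmp/mnist_model/1

# Show the signatures available under a specific tag-set, e.g. "serve",
# which is the tag tensorflow_model_server looks for by default.
saved_model_cli show --dir /home/deep/serving/tmp/mnist_model/1 \
    --tag_set serve --signature_def serving_default
```

If the first command prints no tag-set containing "serve", the model was exported without the serving tag and the server cannot load it.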
(Resolving this issue, since it should be fixed once you supply the right tag at export time; please feel free to reopen if required.)
@sukritiramesh can you explain more? I get the following error:
Can anyone explain what this SavedModel file is in the following error log? I was trying to use my trained model (saved in .pb format using a frozen graph) for prediction in Python. Traceback (most recent call last):
@yashwantptl7 Hi Yash, there is another issue (#22480) related to this. Please refer to it and post your observations there. Thank you!
Using the Docker version works for me, although I had to change the .deb package version for tensorflow_model_server to 1.15.0. I followed the tutorial described here: https://medium.com/@yuu.ishikawa/serving-pre-modeled-and-custom-tensorflow-estimator-with-tensorflow-serving-12833b4be421. I created the Dockerfile and ran it, and got the same error as described in this thread...
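For comparison, a minimal sketch of the plain Docker route without a custom Dockerfile (the 1.15.0 image tag matches the version mentioned above; the host path and model name are examples):

```shell
# Pull a pinned tensorflow/serving image and serve the model directory.
# The container expects /models/<MODEL_NAME>/<version>/saved_model.pb.
docker pull tensorflow/serving:1.15.0
docker run -p 8501:8501 \
  -v /home/deep/serving/tmp/mnist_model:/models/mnist \
  -e MODEL_NAME=mnist \
  tensorflow/serving:1.15.0
```

Note that the same "Could not find meta graph def matching supplied tags" error will occur inside the container too if the SavedModel was exported without the serve tag; the Docker image only fixes environment and version mismatches.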
Hi, I have used SavedModel (Inception_resnet_v2) to export the TensorFlow model files and TensorFlow Serving to load them. I directly replaced the official MNIST saved_model.pb with my own Inception_resnet_v2 saved_model.pb file, but I got an error.
deep@ubuntu:~/serving$ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/home/deep/serving/tmp/mnist_model
2017-06-18 10:39:41.963490: I tensorflow_serving/model_servers/main.cc:146] Building single TensorFlow model file config: model_name: mnist model_base_path: home/deep/serving/tmp/mnist_model model_version_policy: 0
2017-06-18 10:39:41.963752: I tensorflow_serving/model_servers/server_core.cc:375] Adding/updating models.
2017-06-18 10:39:41.963762: I tensorflow_serving/model_servers/server_core.cc:421] (Re-)adding model: mnist
2017-06-18 10:39:42.065556: I tensorflow_serving/core/basic_manager.cc:698] Successfully reserved resources to load servable {name: mnist version: 1}
2017-06-18 10:39:42.065610: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: mnist version: 1}
2017-06-18 10:39:42.065648: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: mnist version: 1}
2017-06-18 10:39:42.065896: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:360] Attempting to load native SavedModelBundle in bundle-shim from: /home/deep/serving/tmp/mnist_model/1
2017-06-18 10:39:42.066130: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:226] Loading SavedModel from: /home/deep/serving/tmp/mnist_model/1
2017-06-18 10:39:42.080775: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:274] Loading SavedModel: fail. Took 14816 microseconds.
2017-06-18 10:39:42.080822: E tensorflow_serving/util/retrier.cc:38] Loading servable: {name: mnist version: 1} failed: Not found: Could not find meta graph def matching supplied tags.
What should I do? Thanks!
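For reference, the "Could not find meta graph def matching supplied tags" failure above usually means the SavedModel was written without the serve tag. A minimal sketch of exporting with that tag, using the TF 1.x-style builder API via the tf.compat.v1 namespace (the tiny graph and variable names here are purely illustrative, not the Inception_resnet_v2 model from this thread):

```python
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# TensorFlow Serving expects <model_base_path>/<numeric version>/saved_model.pb,
# so we export into a "1" subdirectory (a temp dir here, for illustration).
export_dir = os.path.join(tempfile.mkdtemp(), "1")

# A trivial stand-in graph: a single matmul acting as the "model".
x = tf.placeholder(tf.float32, shape=[None, 784], name="x")
w = tf.get_variable("w", shape=[784, 10])
y = tf.identity(tf.matmul(x, w), name="y")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    # tag_constants.SERVING ("serve") is the tag tensorflow_model_server
    # searches for; exporting without it triggers the error in the log above.
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            "serving_default": tf.saved_model.signature_def_utils.predict_signature_def(
                inputs={"x": x}, outputs={"y": y}
            )
        },
    )
    builder.save()
```

After exporting, saved_model_cli can confirm that the "serve" tag-set is present before pointing tensorflow_model_server at the directory.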