Cannot load pb file in TensorFlow Serving #491

Closed
ghost opened this issue Jun 18, 2017 · 7 comments

@ghost commented Jun 18, 2017

Hi, I have used SavedModel (Inception_resnet_v2) to export the TensorFlow model files, and I use TensorFlow Serving to load them. I directly replaced the official MNIST saved_model.pb with my own Inception_resnet_v2 saved_model.pb file, but I got an error:
deep@ubuntu:~/serving$ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/home/deep/serving/tmp/mnist_model
2017-06-18 10:39:41.963490: I tensorflow_serving/model_servers/main.cc:146] Building single TensorFlow model file config: model_name: mnist model_base_path: home/deep/serving/tmp/mnist_model model_version_policy: 0
2017-06-18 10:39:41.963752: I tensorflow_serving/model_servers/server_core.cc:375] Adding/updating models.
2017-06-18 10:39:41.963762: I tensorflow_serving/model_servers/server_core.cc:421] (Re-)adding model: mnist
2017-06-18 10:39:42.065556: I tensorflow_serving/core/basic_manager.cc:698] Successfully reserved resources to load servable {name: mnist version: 1}
2017-06-18 10:39:42.065610: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: mnist version: 1}
2017-06-18 10:39:42.065648: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: mnist version: 1}
2017-06-18 10:39:42.065896: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:360] Attempting to load native SavedModelBundle in bundle-shim from: /home/deep/serving/tmp/mnist_model/1
2017-06-18 10:39:42.066130: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:226] Loading SavedModel from: /home/deep/serving/tmp/mnist_model/1
2017-06-18 10:39:42.080775: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:274] Loading SavedModel: fail. Took 14816 microseconds.
2017-06-18 10:39:42.080822: E tensorflow_serving/util/retrier.cc:38] Loading servable: {name: mnist version: 1} failed: Not found: Could not find meta graph def matching supplied tags.
What should I do? Thanks!

@sukritiramesh (Contributor)

Hi @weipf8, it looks like your SavedModel may not have a graph corresponding to the serving tag:
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/tag_constants.py#L26

The exported SavedModel should have tag-sets corresponding to each graph. You would have specified these when you exported or saved the SavedModel. To inspect the available tag-sets in a SavedModel, you can use the SavedModel CLI: https://www.tensorflow.org/programmers_guide/saved_model_cli
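
For reference, a minimal export sketch, assuming the TF 1.x SavedModelBuilder API that was current when this issue was filed; the export_dir value and the graph-construction step are placeholders:

import tensorflow as tf

# Each SavedModel version lives in its own numeric subdirectory.
export_dir = "/home/deep/serving/tmp/mnist_model/1"  # placeholder path

with tf.Session(graph=tf.Graph()) as sess:
    # ... build or restore the Inception_resnet_v2 graph and variables here ...
    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],  # writes the "serve" tag-set
        signature_def_map=None)  # supply real SignatureDefs for serving
    builder.save()

You can then confirm the tag with the CLI, e.g. saved_model_cli show --dir /home/deep/serving/tmp/mnist_model/1, which lists the available tag-sets.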

@sukritiramesh (Contributor)

(Resolving this issue, since it should be fixed once you supply the right tag at export time; please feel free to reopen if required.)

@avielas commented May 11, 2018

@sukritiramesh can you explain more? I get the following error:
2018-05-11 20:14:35.819721: F tensorflow/contrib/lite/toco/toco_saved_model.cc:50]
Non-OK-status: tensorflow::LoadSavedModel(tensorflow::SessionOptions(), tensorflow::RunOptions(), model_path, tags, bundle)
status: Not found: Could not find meta graph def matching supplied tags: { serve }.
To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: saved_model_cli
Failed to load exported model from /home/aviel/Downloads/drive-download-20180511T163836Z-001. Ensure the model contains the required tags 'serve'.
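
Besides the SavedModel CLI, one way to see which tag-sets the file actually contains is to parse the SavedModel proto directly. A minimal sketch, assuming a TF 1.x installation; the export path is a placeholder:

import os
from tensorflow.core.protobuf import saved_model_pb2

export_dir = "/home/aviel/Downloads/drive-download-20180511T163836Z-001"  # placeholder

saved_model = saved_model_pb2.SavedModel()
with open(os.path.join(export_dir, "saved_model.pb"), "rb") as f:
    saved_model.ParseFromString(f.read())

# Each MetaGraphDef records its tags in meta_info_def.tags; toco expects "serve".
for meta_graph in saved_model.meta_graphs:
    print(list(meta_graph.meta_info_def.tags))

If "serve" is missing, the model has to be re-exported with tf.saved_model.tag_constants.SERVING in its tag list, as described above.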

@yashwantptl7 commented

Can anyone explain what this SavedModel file is in the following error log? I was trying to use my trained model (saved in .pb format using a frozen graph) for prediction in Python.

Traceback (most recent call last):
  File "export.py", line 12, in <module>
    tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "trained_models/SavedModel/saved_model.pb")
  File "/home/yashwantpatel/ptlWS/lib/python3.6/site-packages/tensorflow/python/saved_model/loader_impl.py", line 203, in load
    saved_model = _parse_saved_model(export_dir)
  File "/home/yashwantpatel/ptlWS/lib/python3.6/site-packages/tensorflow/python/saved_model/loader_impl.py", line 79, in _parse_saved_model
    constants.SAVED_MODEL_FILENAME_PB))
OSError: SavedModel file does not exist at: trained_models/SavedModel/saved_model.pb/{saved_model.pbtxt|saved_model.pb}
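
Two separate problems show up in that traceback: tf.saved_model.loader.load expects the export directory rather than the saved_model.pb file inside it, and a .pb produced by freezing a graph is a bare GraphDef, not a SavedModel. A sketch of both load paths, assuming TF 1.x and placeholder paths:

import tensorflow as tf

# Case 1: a real SavedModel. Pass the directory that contains saved_model.pb.
with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        "trained_models/SavedModel")  # directory, not .../saved_model.pb

# Case 2: a frozen graph. It is a plain GraphDef, so import it instead.
graph_def = tf.GraphDef()
with tf.gfile.GFile("trained_models/frozen_graph.pb", "rb") as f:  # placeholder
    graph_def.ParseFromString(f.read())
with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")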

@Harshini-Gadige commented Sep 24, 2018

@yashwantptl7 Hi Yash, there is another issue (#22480) related to this. Please refer to it and post your observations there. Thank you!

@Harshini-Gadige commented

#22480

@p4rk3r commented Dec 6, 2019

Using the Docker version works for me, although I had to change the .deb package version for tensorflow_model_server to 1.15.0.

I followed the tutorial described here: https://medium.com/@yuu.ishikawa/serving-pre-modeled-and-custom-tensorflow-estimator-with-tensorflow-serving-12833b4be421

I created the Dockerfile and ran it, and got the same error as described in this thread. After changing the .deb file in the Dockerfile from 1.12 to 1.15, it now works:
RUN TEMP_DEB="$(mktemp)" \
 && wget -O "$TEMP_DEB" 'http://storage.googleapis.com/tensorflow-serving-apt/pool/tensorflow-model-server-1.15.0/t/tensorflow-model-server/tensorflow-model-server_1.15.0_all.deb' \
 && dpkg -i "$TEMP_DEB" \
 && rm -f "$TEMP_DEB"
