Description
I am new to TensorFlow Serving.
I have trained a model using Keras and saved it as a model.h5 file containing both the weights and the config.
Then I exported the model and obtained a .pb file plus the variables files, like this:
export/
    1/
        saved_model.pb
        variables/
            variables.data-00000-of-00001
            variables.index
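(As a side note, an export with this layout can be inspected with the saved_model_cli tool that ships with TensorFlow. A minimal sketch, assuming the export/1 directory above:

# Print the tag-sets and signature defs stored in the SavedModel
saved_model_cli show --dir export/1 --all

This should list the serve tag-set and the signatures exported for the model.)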
When I try to use tensorflow_model_server to serve the exported model:
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=1 --model_base_path=export/1/
I got this error:
tensorflow_serving/core/loader_harness.cc:104] Servable: {name: 1 version: 1} load failure: Not found: export/1/export.meta
It works well with https://www.tensorflow.org/serving/serving_basic but not in my case.
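For comparison, a minimal sketch of the invocation style used in serving_basic, where --model_base_path points at the directory that contains the numbered version folders (the model name and absolute path below are placeholders, not taken from my setup above):

bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
    --port=9000 \
    --model_name=keras_model \
    --model_base_path=/absolute/path/to/export

The server then picks up the highest numbered subdirectory (1/ here) as the servable version.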
Could someone give me some tips about this? I'd appreciate it.
Btw, the other thing I am confused about: can TensorFlow Serving speed up my prediction step, or is it just a lifecycle manager?
Here is my export.py code:
from keras import backend as K
import tensorflow as tf
from keras.models import load_model
from tensorflow.python.saved_model import builder as saved_model_builder
from tensorflow.python.saved_model import tag_constants, signature_constants
from tensorflow.python.saved_model.signature_def_utils_impl import build_signature_def, predict_signature_def
import shutil
import os

model_path = 'keras-serving/result/model.h5'
model = load_model(model_path)

export_path = 'export/1'  # where to save the exported graph
builder = saved_model_builder.SavedModelBuilder(export_path)

# Wrap the Keras model's input/output tensors in a prediction signature.
signature = predict_signature_def(inputs={"inputs": model.input},
                                  outputs={"outputs": model.output})

with K.get_session() as sess:
    builder.add_meta_graph_and_variables(sess=sess,
                                         tags=[tag_constants.SERVING],
                                         signature_def_map={'predict': signature})
    builder.save()
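For completeness, a minimal sketch of a gRPC client that would call the 'predict' signature exported above. This assumes the tensorflow-serving-api package is installed, the server is reachable on localhost:9000 with --model_name=1, and the input shape (1, 28, 28, 1) is a placeholder that would need to match model.input:

import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

# Placeholder batch; the shape and dtype must match what model.input expects.
batch = np.zeros((1, 28, 28, 1), dtype=np.float32)

channel = grpc.insecure_channel('localhost:9000')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = '1'                  # must match --model_name
request.model_spec.signature_name = 'predict'  # key used in signature_def_map above
request.inputs['inputs'].CopyFrom(tf.make_tensor_proto(batch))

response = stub.Predict(request, 10.0)  # 10 second timeout
print(response.outputs['outputs'])

Because the signature is registered under the key 'predict' rather than signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY, the client has to pass signature_name='predict' explicitly.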