Enable 'serve' tag-set in create_module_spec function call #19
Comments
+1
Are there any workarounds for this currently? I want to serve a hub module using TensorFlow Serving.
I have not tested this, but I suspect that you could load the module into an empty graph and then export with
For now one has to do it manually, for example:

```python
import tensorflow as tf
import tensorflow_hub as hub

with tf.Graph().as_default():
    module = hub.Module("http://tfhub.dev/google/universal-sentence-encoder/2")
    text = tf.placeholder(tf.string, [None])
    embedding = module(text)
    init_op = tf.group([tf.global_variables_initializer(), tf.tables_initializer()])

    with tf.Session() as session:
        session.run(init_op)
        tf.saved_model.simple_save(
            session,
            "/tmp/serving_saved_model",
            inputs={"text": text},
            outputs={"embedding": embedding},
        )
```

Each module differs slightly from the others in its input/output names, and each serving use case may have different requirements (e.g. raw features in vs. serialized tf.Example protos). Having users create the graph they want to serve (as done above) seems more flexible than requiring them to guess what Servo config to use and/or modify the client side each time they change the module being served.
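A SavedModel exported this way can then be pointed at by TensorFlow Serving. A minimal sketch of the server invocation (the model name and paths are illustrative, not from this thread; note that TensorFlow Serving expects the SavedModel inside a numeric version subdirectory):

```shell
# TensorFlow Serving looks for numbered version subdirectories, so move
# the export into one first (paths here are illustrative):
mkdir -p /tmp/use/1
cp -r /tmp/serving_saved_model/* /tmp/use/1/

# Launch the model server; "use" is an arbitrary model name.
tensorflow_model_server \
  --rest_api_port=8501 \
  --model_name=use \
  --model_base_path=/tmp/use
```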
Closing as this is now obsolete. In TF2, users should create reusable SavedModels with tf.saved_model.save().
@andresusanopinto quick question, does it mean the pre-trained model on tensorflow hub won't have the
A TF2 SavedModel can both have signatures for deployment to TensorFlow Serving and tf.functions for reuse in a Python TensorFlow program. See https://www.tensorflow.org/hub/tf2_saved_model#advanced_topic_what_to_expect_from_the_savedmodel_after_loading and https://www.tensorflow.org/guide/saved_model
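As a small sketch of that dual use (the `Embedder` module and its stand-in "embedding" are illustrative, not a real TF Hub model):

```python
import tensorflow as tf  # TF2

class Embedder(tf.Module):
    """Toy stand-in for a reusable SavedModel (not a real TF Hub model)."""

    @tf.function(input_signature=[tf.TensorSpec([None], tf.string)])
    def __call__(self, text):
        # Stand-in "embedding": byte lengths of the strings, cast to float.
        return tf.cast(tf.strings.length(text), tf.float32)

m = Embedder()
# Saving with an explicit signature makes the model deployable to
# TensorFlow Serving, while tf.saved_model.load() still returns the
# callable tf.function for reuse in a Python program.
tf.saved_model.save(m, "/tmp/tf2_saved_model", signatures=m.__call__)

restored = tf.saved_model.load("/tmp/tf2_saved_model")
print(restored(tf.constant(["abc", "de"])).numpy())  # [3. 2.]
```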
@arnoegw thanks for the pointer. I noticed some inconsistency among TF2 models: some have
My understanding is that if
@chenliu0831, there's nothing wrong with those examples:
@arnoegw Ah, thanks for clarifying. I gave a bad example for the second case, since that one is a feature vector. I spot-checked a few more on TF Hub with the TF2 filter and with
On the detail pages of the above models, I think they all show up as "TF2.0 Saved Model" format.
It would be useful to export modules in a format that can be consumed by TensorFlow Serving. Servables would increase code reuse and further enable distributed workloads.
Example Usage (note passing "serve" instead of "train"):