
Enable 'serve' tag-set in create_module_spec function call #19

Closed
samsends opened this issue Apr 9, 2018 · 10 comments
Labels
hub For all issues related to tf hub library and tf hub tutorials or examples posted by hub team type:feature

Comments

@samsends

samsends commented Apr 9, 2018

It would be useful to export modules in a format that can be consumed by TensorFlow Serving. Servables would increase code reuse and further enable distributed workloads.

Example Usage (note passing "serve" instead of "train"):

hub.create_module_spec(
    module_fn,
    tags_and_args=[({"serve"}, {"is_training":False})],
    drop_collections=None)
@samsends samsends changed the title Create 'serve' tag-set in create_module_spec function call Enable 'serve' tag-set in create_module_spec function call Apr 9, 2018
@navneetrao

navneetrao commented May 11, 2018

+1
Making hub modules consumable by TensorFlow Serving would be very helpful.

@warynice

Are there any workarounds for this currently? I want to serve a hub module using TensorFlow Serving.

@samsends
Author

samsends commented Jul 15, 2018

I have not tested this, but I suspect you could load the module into an empty graph and then export it with SavedModelBuilder. We could build an automated tool.

@andresusanopinto
Contributor

For now one has to do it manually; for example:

import tensorflow as tf
import tensorflow_hub as hub

with tf.Graph().as_default():
  # Instantiate the module and build the serving graph around it.
  module = hub.Module("http://tfhub.dev/google/universal-sentence-encoder/2")
  text = tf.placeholder(tf.string, [None])
  embedding = module(text)

  # Initialize variables and lookup tables, then export a SavedModel
  # in the layout TensorFlow Serving expects.
  init_op = tf.group([tf.global_variables_initializer(), tf.tables_initializer()])
  with tf.Session() as session:
    session.run(init_op)
    tf.saved_model.simple_save(
        session,
        "/tmp/serving_saved_model",
        inputs={"text": text},
        outputs={"embedding": embedding})

Each module differs slightly from others in its input/output names, and each serving use case might have different requirements (e.g. raw features in vs. serialized tf.Example protos). Having users build the graph they want to serve (as done above) seems more flexible than requiring them to guess which serving config to use and/or to modify the client side each time they change the module being served.

@Harshini-Gadige Harshini-Gadige added the hub For all issues related to tf hub library and tf hub tutorials or examples posted by hub team label Mar 14, 2019
@andresusanopinto
Contributor

Closing as this is now obsolete.

In TF2, users should create reusable SavedModels with tf.saved_model.save().
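As a sketch of that TF2 path (the module class, shapes, and export path below are illustrative assumptions for the example, not a Hub API):

```python
import tensorflow as tf

# A minimal reusable model: a tf.Module whose tf.function has a fixed
# input signature, so it can be traced and attached as a serving signature.
class Embedder(tf.Module):
  def __init__(self):
    super().__init__()
    self.kernel = tf.Variable(tf.random.normal([8, 4]))

  @tf.function(input_signature=[tf.TensorSpec([None, 8], tf.float32)])
  def __call__(self, x):
    return tf.matmul(x, self.kernel)

model = Embedder()

# Exporting the traced function as serving_default makes the SavedModel
# consumable by TensorFlow Serving as-is, while the restored tf.function
# stays reusable from Python after tf.saved_model.load().
tf.saved_model.save(model, "/tmp/tf2_reusable_model",
                    signatures={"serving_default": model.__call__})
```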

@chenliu0831

@andresusanopinto quick question: does it mean that pre-trained models on TensorFlow Hub won't have the serving_default signature by default, and the user needs to re-export them? Thanks

@arnoegw
Contributor

arnoegw commented Mar 10, 2020

A TF2 SavedModel can have both signatures for deployment to TensorFlow Serving and tf.functions for reuse in a Python TensorFlow program.

See https://www.tensorflow.org/hub/tf2_saved_model#advanced_topic_what_to_expect_from_the_savedmodel_after_loading and https://www.tensorflow.org/guide/saved_model
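To illustrate with a toy model (not an actual Hub module; the class and export path are made up for the example), a single SavedModel can be consumed both ways:

```python
import tensorflow as tf

# A toy model returning a named output, so the signature has a stable key.
class Square(tf.Module):
  @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
  def __call__(self, x):
    return {"y": x * x}

model = Square()
tf.saved_model.save(model, "/tmp/square_model",
                    signatures={"serving_default": model.__call__})

loaded = tf.saved_model.load("/tmp/square_model")

# 1) The signature, as TensorFlow Serving would call it:
#    flat dict of named tensors in, flat dict of named tensors out.
sig_out = loaded.signatures["serving_default"](x=tf.constant([3.0]))

# 2) The restored tf.function, for reuse in a Python TensorFlow program.
fn_out = loaded(tf.constant([3.0]))
```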

@chenliu0831

chenliu0831 commented Mar 10, 2020

@arnoegw thanks for the pointer. I noticed an inconsistency among TF2 models: some have a serving_default signature while others do not. For example:

My understanding is that if serving_default is not present, the model cannot be served as-is in TensorFlow Serving. Should all TF2 models have this signature? Let me know if I should open a new issue to track this.
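For models that ship without a serving signature, one workaround is to re-export them with an explicit one (a sketch: the toy module and paths below stand in for a real Hub download, and the right TensorSpec depends on the actual model):

```python
import tensorflow as tf

# Stand-in for a loaded TF2 SavedModel that exposes a reusable
# tf.function but ships no serving signatures.
class Toy(tf.Module):
  @tf.function
  def __call__(self, x):
    return x + 1.0

model = Toy()

# Trace a concrete function at a fixed input spec, then save the model
# again with that concrete function attached as serving_default.
serving_fn = model.__call__.get_concrete_function(
    tf.TensorSpec([None], tf.float32))
tf.saved_model.save(model, "/tmp/reexported_model",
                    signatures={"serving_default": serving_fn})
```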

@arnoegw
Contributor

arnoegw commented Mar 10, 2020

@chenliu0831, there's nothing wrong with those examples:

@chenliu0831

chenliu0831 commented Mar 10, 2020

@arnoegw Ah, thanks for clarifying. I gave a bad example for the second case, since that one is a feature vector.

I spot-checked a few more models on TF Hub, using the TF2 filter and classification variants, and it looks like their signature maps are empty as well:

On the detail pages of the above models, I think they all show up as "TF2.0 Saved Model" format.
