Model serving signature with SparseTensor input feature #52605

Open
sapphire008 opened this issue Oct 21, 2021 · 8 comments
Labels: comp:ops (OPs related issues), stat:awaiting tensorflower (Status - Awaiting response from tensorflower), TF 2.8, type:bug (Bug)

Comments

@sapphire008

This is a bug report.


System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): Yes
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macOS 11.5.2; GCP-hosted Linux instance
  • TensorFlow installed from (source or binary): TFX 1.0.0 docker image on Linux; on OSX installed binary with pip
  • TensorFlow version (use command below): 2.5.0
  • Python version: 3.8.1 and 3.8.5 (Mac), 3.7 (Linux)
  • Bazel version (if compiling from source): N/A
  • GCC/Compiler version (if compiling from source): N/A
  • CUDA/cuDNN version: N/A CPU only
  • GPU model and memory: N/A
  • Exact command to reproduce: N/A

Describe the problem

When saving a model signature function with a sparse tensor input (generated from get_concrete_function or the tf.function(input_signature=...) decorator), the recovered model signature no longer accepts a sparse tensor as input. Here is an example script:

import tensorflow as tf

# Mock a model
input_x = tf.keras.Input(1)
output_y = tf.keras.layers.Dense(1)(input_x)
model = tf.keras.Model(input_x, output_y)

def _get_serving_signature(model):
    @tf.function
    def my_func(x, y):
        x_out = tf.cast(tf.sparse.to_indicator(x, 5), tf.int64)
        return {"x": x_out, "y": y}
    
    return my_func

# Get the concrete func
concrete_func = _get_serving_signature(model).get_concrete_function(
        x=tf.SparseTensorSpec(shape=[None, None], dtype=tf.int64),
        y=tf.TensorSpec(shape=[None, 1], dtype=tf.int64),
    )

# Store this function inside the model
model.my_func = concrete_func
    
# Build the signature dict
signatures = { "default": concrete_func}

# save the model
model.save("./serving_dir", save_format="tf", signatures=signatures)

# Load the model back
model2 = tf.keras.models.load_model("./serving_dir")

# Make up some data
x = tf.ragged.constant([[1, 3], [2, 3, 1], [2]], dtype=tf.int64).to_sparse()
y = tf.expand_dims(tf.constant([1, 2, 1], dtype=tf.int64), axis=1)

# Make inference on saved my_func
out_func_attr = model2.my_func(x=x, y=y)

# Make inference using signature
out_signature = model2.signatures["default"](x, y)

During saving of the model, a warning message appears:

WARNING:absl:Function `my_func` contains input name(s) x with unsupported characters which will be renamed to x_2 in the SavedModel.
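The warning hints at the root cause: a SparseTensor is a composite tensor backed by three dense component tensors (indices, values, and dense_shape), and the signature exporter flattens the composite input into separate dense inputs. The x, x_1, x_2 names in the traceback further down appear to be these three components after renaming, although that mapping is our reading of the error, not documented behavior. A minimal sketch of the decomposition:

```python
import tensorflow as tf

# A SparseTensor is a composite tensor backed by three dense tensors:
# indices, values, and dense_shape.
x = tf.sparse.from_dense(tf.constant([[0, 3], [5, 0]], dtype=tf.int64))

# Flattening with expand_composites=True exposes the dense components
# that the SavedModel signature exporter ends up treating as separate inputs.
components = tf.nest.flatten(x, expand_composites=True)
print(len(components))  # three components
```
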

Calling the saved function as a model attribute returns the correct output:

out_func_attr = model2.my_func(x=x, y=y)
{'y': <tf.Tensor: shape=(3, 1), dtype=int64, numpy=
  array([[1],
        [2],
        [1]])>,
  'x': <tf.Tensor: shape=(3, 5), dtype=int64, numpy=
  array([[0, 1, 0, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 1, 0, 0]])>}

Calling the function through the signature

out_signature = model2.signatures["default"](x, y)

returns the following error:

Traceback (most recent call last):

  File "/Users/edward/opt/anaconda3/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 1724, in _call_impl
return self._call_with_flat_signature(args, kwargs,

  File "/Users/edward/opt/anaconda3/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 1748, in _call_with_flat_signature
raise TypeError(

TypeError: signature_wrapper(x, x_1, x_2, y) takes 0 positional arguments but 2 were given


During handling of the above exception, another exception occurred:

Traceback (most recent call last):

  File "/Users/edward/Desktop/sparse_serving.py", line 61, in <module>
    out_signature = model2.signatures["default"](x, y)

  File "/Users/edward/opt/anaconda3/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 1711, in __call__
    return self._call_impl(args, kwargs)

  File "/Users/edward/opt/anaconda3/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 1727, in _call_impl
    raise structured_err

  File "/Users/edward/opt/anaconda3/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 1720, in _call_impl
    return self._call_with_structured_signature(args, kwargs,

  File "/Users/edward/opt/anaconda3/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 1798, in _call_with_structured_signature
    self._structured_signature_check_missing_args(args, kwargs)

  File "/Users/edward/opt/anaconda3/lib/python3.8/site-packages/tensorflow/python/eager/function.py", line 1817, in _structured_signature_check_missing_args
    raise TypeError("{} missing required arguments: {}".format(

TypeError: signature_wrapper(*, x_2, y, x_1, x) missing required arguments: x, x_1, x_2, y

The saved function attribute model2.my_func is

 <ConcreteFunction my_func(x, y) at 0x7F9D824BBF10>

which is expected.

The signature model2.signatures["default"] is a

<ConcreteFunction signature_wrapper(*, x_2, y, x_1, x) at 0x7F9D824C5A90>

which is incorrect.

One can easily modify the above script to verify that, if x is a dense tensor declared with TensorSpec rather than SparseTensorSpec, the serving signature does not have this problem.
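For contrast, here is a minimal sketch of the dense-only case surviving a save/load round trip. It uses tf.Module instead of Keras to keep the example short, and the class name and directory path are placeholders:

```python
import tensorflow as tf

# Dense-input signature: the save/load round trip preserves the
# argument names, unlike the SparseTensorSpec case reported above.
class DenseModule(tf.Module):
    @tf.function(input_signature=[
        tf.TensorSpec(shape=[None, 1], dtype=tf.int64),
        tf.TensorSpec(shape=[None, 1], dtype=tf.int64),
    ])
    def my_func(self, x, y):
        return {"sum": x + y}

m = DenseModule()
tf.saved_model.save(m, "/tmp/dense_serving_dir",
                    signatures={"default": m.my_func})

m2 = tf.saved_model.load("/tmp/dense_serving_dir")
# Calling through the signature works with the original argument names.
out = m2.signatures["default"](
    x=tf.constant([[1], [2]], dtype=tf.int64),
    y=tf.constant([[3], [4]], dtype=tf.int64),
)
```
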

@tilakrayal tilakrayal added TF 2.5 Issues related to TF 2.5 comp:ops OPs related issues type:bug Bug labels Oct 21, 2021
@tilakrayal
Contributor

tilakrayal commented Oct 21, 2021

@sanatmpa1,
I was able to reproduce the issue in TF v2.6 and the nightly build. Please find the gist here.

[EDIT] This is still an issue with tf-nightly (2.9.0-dev20220309).

@tilakrayal tilakrayal assigned sanatmpa1 and unassigned tilakrayal Oct 21, 2021
@jvishnuvardhan jvishnuvardhan added stat:awaiting tensorflower Status - Awaiting response from tensorflower TF 2.8 and removed TF 2.5 Issues related to TF 2.5 labels Mar 9, 2022
@k-w-w
Contributor

k-w-w commented Sep 13, 2022

Hi, this is a problem that we are aware of. The current recommendation is to save the function with the model instead of using the signatures argument (which exists mainly for interfacing with TensorFlow Serving).

For example:

model = ...

# Before
model.save(..., signatures={"default": serving_fn})

# After
model.serving_fn = serving_fn
model.save(...)

When loading, you can access the serving_fn directly from the model.
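The recommended pattern can be sketched end to end as follows; this uses tf.Module rather than Keras for brevity, and the class name and directory path are placeholders:

```python
import tensorflow as tf

# Attach the sparse-input function to the saved object as an attribute
# instead of passing it through `signatures=`.
class SparseWrapper(tf.Module):
    @tf.function(input_signature=[
        tf.SparseTensorSpec(shape=[None, None], dtype=tf.int64)])
    def serving_fn(self, x):
        return {"x": tf.cast(tf.sparse.to_indicator(x, 5), tf.int64)}

m = SparseWrapper()
tf.saved_model.save(m, "/tmp/attr_serving_dir")  # no `signatures=` argument

m2 = tf.saved_model.load("/tmp/attr_serving_dir")
x = tf.ragged.constant([[1, 3], [2]], dtype=tf.int64).to_sparse()
out = m2.serving_fn(x)  # SparseTensor input works via the attribute
```

As the later comments note, this only helps callers who load the SavedModel in Python; it does not expose the function as a serving signature.
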

@tilakrayal
Contributor

@sapphire008,
Could you please take a look at the developer's comment above and test with the latest stable version, 2.10? Thank you!

@tilakrayal tilakrayal added the stat:awaiting response Status - Awaiting response from author label Sep 16, 2022
@google-ml-butler

This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.

@google-ml-butler google-ml-butler bot added the stale This label marks the issue/pr stale - to be closed automatically if no activity label Sep 23, 2022
@google-ml-butler

Closing as stale. Please reopen if you'd like to work on this further.


@sapphire008
Author

Confirming that this is still an issue as of 2.10.0. The workaround above is not acceptable, since TF Serving cannot serve a function stored this way. Please reopen this issue, @tilakrayal.

@sapphire008
Author

sapphire008 commented Oct 3, 2022

Hi, this is a problem that we are aware of. The current recommendation is switching to saving the function with the model instead of using the signatures argument (which is there mainly for TensorFlow Serving interfacing).

For example:

model = ...

# Before
model.save(..., signatures={"default": serving_fn})

# After
model.serving_fn = serving_fn
model.save(...)

When loading, you can access the serving_fn directly from the model.

The need is specifically to save the model and serve it through its signatures in production. This workaround does not address that use case.

@tilakrayal tilakrayal removed stat:awaiting response Status - Awaiting response from author stale This label marks the issue/pr stale - to be closed automatically if no activity labels Oct 4, 2022
@tilakrayal tilakrayal reopened this Oct 4, 2022
Projects: None yet
Development: No branches or pull requests

6 participants