Model serving signature with SparseTensor input feature #52605
Comments
@sanatmpa1, [EDIT] This is still an issue.
Hi, this is a problem that we are aware of. The current recommendation is to save the function with the model as an attribute instead of passing it through the `signatures` argument. For example:
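The code example did not survive the scrape. A minimal sketch of the described workaround (module name, function body, and save path are hypothetical, not from the original comment):

```python
import tensorflow as tf

class MyModel(tf.Module):
    @tf.function(input_signature=[
        tf.SparseTensorSpec(shape=[None, 4], dtype=tf.float32)
    ])
    def my_func(self, x):
        # Sum the sparse features along the last axis.
        return tf.sparse.reduce_sum(x, axis=-1)

model = MyModel()
# Save WITHOUT the signatures= argument; the tf.function is stored
# as an attribute of the model instead.
tf.saved_model.save(model, "/tmp/sparse_attr_model")

loaded = tf.saved_model.load("/tmp/sparse_attr_model")
x = tf.sparse.from_dense(tf.constant([[1.0, 0.0, 2.0, 0.0]]))
print(loaded.my_func(x))  # the attribute still accepts a SparseTensor
```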
When loading, you can access the function as an attribute of the loaded model.
@sapphire008,
This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.
Closing as stale. Please reopen if you'd like to work on this further.
Confirming that this is still an issue as of 2.10.0. The workaround above is not acceptable, since TF Serving will not be able to serve a function stored this way. Please reopen this issue @tilakrayal.
The need is mainly to save and serve the model with signatures in production; this workaround does not help with that use case.
This is a bug report.
System information
- Have I written custom code (as opposed to using a stock example script provided in TensorFlow): Yes
Describe the problem
When saving a model signature function with a sparse tensor input (generated from `get_concrete_function` or the `tf.function(input_signature=...)` decorator), the recovered model signature no longer accepts sparse tensors as inputs. Here is an example script. During saving of the model, a warning message appears:
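The example script itself was lost in the scrape. A minimal reproduction along the lines described (model, function body, and paths are hypothetical) might look like:

```python
import tensorflow as tf

class MyModel(tf.Module):
    @tf.function(input_signature=[
        tf.SparseTensorSpec(shape=[None, 3], dtype=tf.float32)
    ])
    def my_func(self, x):
        # Densify and double the sparse input.
        return tf.sparse.to_dense(x) * 2.0

model = MyModel()
# Export the tf.function as a serving signature; a warning about
# sparse inputs appears during this save.
tf.saved_model.save(model, "/tmp/sparse_sig_model",
                    signatures={"default": model.my_func})

model2 = tf.saved_model.load("/tmp/sparse_sig_model")
x = tf.sparse.from_dense(tf.constant([[1.0, 0.0, 2.0]]))

print(model2.my_func(x))  # the attribute accepts the SparseTensor
# model2.signatures["default"](x)  # raises: the recovered signature
#                                  # no longer accepts a SparseTensor
```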
Calling the saved function as a model attribute returns the correct output. Making an inference using the signature, however, returns the following error:
The saved function attribute `model2.my_func` accepts the sparse input, which is expected.
The signature `model2.signatures["default"]`, however, no longer accepts the sparse tensor, which is incorrect.
One can also easily modify the above script to verify that, if `x` is a dense tensor with a `TensorSpec` rather than a `SparseTensorSpec`, the serving signature no longer has this problem.
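To illustrate the dense counterpart just described, a sketch (hypothetical model and path names) in which the recovered signature works as expected:

```python
import tensorflow as tf

class DenseModel(tf.Module):
    @tf.function(input_signature=[
        tf.TensorSpec(shape=[None, 3], dtype=tf.float32)
    ])
    def my_func(self, x):
        return x * 2.0

model = DenseModel()
tf.saved_model.save(model, "/tmp/dense_sig_model",
                    signatures={"default": model.my_func})

model2 = tf.saved_model.load("/tmp/dense_sig_model")
x = tf.constant([[1.0, 0.0, 2.0]])
# With a dense TensorSpec, the recovered signature accepts the tensor directly.
result = model2.signatures["default"](x)
print(result)
```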