Tensorflow Serving fails to serve tflite model with multiple signatures #2070
Comments
Unfortunately, adding logic to serve metadata other than SignatureDefs is not on the roadmap right now. There is a similar feature request for custom metadata support in Serving, #1248, currently in the works. I suggest you +1 that issue and follow it for updates. Meanwhile, could you please provide the 'saved_model_cli' output, try serving the TensorFlow model instead of the tflite model, and share your findings with us so we can debug the issue. Thank you!
This issue has been marked stale because it has had no activity in the last 7 days. It will be closed if no further activity occurs. Thank you.
Sorry, I am away from a proper setup to provide the requested info, but I will be back ASAP. Note that I only want multiple signatures, no extra custom metadata.
Please share the 'saved_model_cli' output for your served model (e.g. `saved_model_cli show --dir <model_dir> --all`) so we can debug the issue further. Thanks!
Hello, Using tensorflow:2.8.2 with the
gives
But a curl against the tensorflow/serving Docker container returns only a "serving_default" signature
gives
After removing the
gives the same result:
However, a curl against the tensorflow/serving Docker container returns a different result, with the two signatures "decode" and "encode".
gives
Thanks for your help!
Interesting that TF Serving identifies multiple signatures for the TensorFlow model but fails to do so for the tflite model. Thank you for providing the "saved_model_cli" output and the model server metadata. Let me discuss this with the team and we will get back to you.
Bug Report
System information
Describe the problem
I am experimenting with quantization on a model with multiple signatures, but when I try to serve it with TensorFlow Serving, the multiple signatures are not recognized and a single, invalid default signature is generated instead.
Exact Steps to Reproduce
Here is a minimal code to reproduce the problem:
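(The original reproduction snippet was not preserved in this thread; the following is only a minimal sketch of the kind of export described, with a toy `Codec` module and assumed `encode`/`decode` function bodies.)

```python
import tempfile

import tensorflow as tf


# Toy stand-in for the real model; the signature names "encode" and "decode"
# come from the thread, everything else is assumed for illustration.
class Codec(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def encode(self, x):
        return {"y": x * 2.0}

    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def decode(self, x):
        return {"y": x / 2.0}


model = Codec()
export_dir = tempfile.mkdtemp()
tf.saved_model.save(
    model,
    export_dir,
    signatures={"encode": model.encode, "decode": model.decode},
)

# Convert to tflite, explicitly keeping both signatures.
converter = tf.lite.TFLiteConverter.from_saved_model(
    export_dir, signature_keys=["encode", "decode"]
)
tflite_model = converter.convert()

# A tf.lite.Interpreter sees both signatures, while TF Serving (per this
# report) only exposes a wrapped "serving_default".
interpreter = tf.lite.Interpreter(model_content=tflite_model)
print(interpreter.get_signature_list())
```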
One can confirm that a tf.lite.Interpreter recognizes the model's 2 signatures; the stdout:
One should see in the logs:
Only a `serving_default` entrypoint, which is a wrapped `decode` (i.e. inputs are `decode_x` instead of `x`), is available.

First investigations
I can reproduce the problem with TF Serving 2.9.3 and 2.10.0.
If I use the model used for unit tests in the repo (https://github.com/tensorflow/serving/tree/master/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_tflite_with_sigdef/00000123), I do not have the problem. However, when I inspect the binaries, I see that the headers differ: the unit-test model has a `signature_defs_metadata` field that I do not find in my freshly created model. The unit-test model (saved_model_half_plus_two_tflite_with_sigdef) seems to have been generated with older code (an old version of TF: https://github.com/tensorflow/serving/blob/ef6c4d90ad98dff3507f5af5aa75eab809524a9e/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two.py).
It seems to me that the following line does not do its job correctly: https://github.com/tensorflow/serving/blob/master/tensorflow_serving/servables/tensorflow/tflite_session.cc#L452
Thank you.