Infinite loop when trying to save model with input_signature on function decorator #65256
Comments
To simplify reproducing the bug, follow the code below, where the inputs are two simple dicts of tensors and one int tensor,
and save the model:
When I save the model without the function decorator and input_signature, I get the following error when calling the loaded model:
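The reported setup can be sketched as follows. This is a minimal stand-in, not the reporter's actual code (which is not shown): the feature names, model body, and save path are all hypothetical. It only illustrates the shape of the inputs described above (two dicts of tensors plus one int tensor) together with an explicit input_signature and a tf.saved_model.save round trip.

```python
import tensorflow as tf

# Hypothetical stand-in for the reported setup: the call takes two
# dicts of tensors plus one int tensor, with an explicit input_signature.
class TwoDictModel(tf.Module):
    @tf.function(input_signature=[
        {"user_id": tf.TensorSpec([None], tf.string)},
        {"item_id": tf.TensorSpec([None], tf.string)},
        tf.TensorSpec([], tf.int32),
    ])
    def __call__(self, query, candidate, k):
        # Placeholder body; the real model would embed and score here.
        return tf.strings.join([query["user_id"], candidate["item_id"]],
                               separator="-")

model = TwoDictModel()
tf.saved_model.save(model, "/tmp/two_dict_model")

# Load it back and call it with concrete tensors matching the signature.
loaded = tf.saved_model.load("/tmp/two_dict_model")
out = loaded({"user_id": tf.constant(["u1"])},
             {"item_id": tf.constant(["i1"])},
             tf.constant(10, tf.int32))
```

In this toy form the save and reload succeed; the report is that with the real model the save hangs once the input_signature is added.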
Hi @Chm-vinicius, could you please provide a minimal code snippet for reproducing the issue? Thanks! It seems the code contains TensorFlow Recommenders layers, which we do not support here in the TF repo. I can see this is already reported in the concerned repo. If you are able to reproduce the issue with TF/Keras, please submit the code snippet for the same. Thanks!
Hi @SuryanarayanaY, thanks for your response!
Issue type
Bug
Have you reproduced the bug with TensorFlow Nightly?
No
Source
source
TensorFlow version
2.15.1
Custom code
Yes
OS platform and distribution
Linux; Windows
Mobile device
No response
Python version
3.9.5
Bazel version
No response
GCC/compiler version
No response
CUDA/cuDNN version
No response
GPU model and memory
No response
Current behavior?
I have a custom BruteForce layer; it is almost a copy of tfrs.layers.factorized_top_k.BruteForce, differing in that the candidates are a variable input, like the queries in the original method. I was able to save and load the model without any issues, and I can bring up TensorFlow Serving as well, but when I call the served endpoint I receive the message "Serving signature name: "serving_default" not found in signature def". When I try to set up the input signatures on the call function, whether as a list of tensor dicts or a dataset of tensor dicts, the kernel gets into an infinite loop and doesn't output anything. The issue occurs on Windows and on a Google Cloud Vertex AI Workbench Linux environment.
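On the "serving_default not found" part: one common cause (not confirmed to be the cause here) is saving a plain tf.Module without passing the signatures argument, which can leave the SavedModel without a serving signature. A minimal sketch with hypothetical names, showing how to export an explicit serving_default and verify it is present:

```python
import tensorflow as tf

class ToyModel(tf.Module):
    @tf.function
    def score(self, user_id):
        # Toy scoring; stands in for the custom BruteForce call.
        return {"score": tf.strings.length(user_id)}

model = ToyModel()
# Export an explicit "serving_default" signature so TF Serving can
# resolve it; without the signatures argument, the signature def of a
# saved tf.Module may be empty.
concrete = model.score.get_concrete_function(
    tf.TensorSpec([None], tf.string, name="user_id"))
tf.saved_model.save(model, "/tmp/toy_model",
                    signatures={"serving_default": concrete})

loaded = tf.saved_model.load("/tmp/toy_model")
keys = list(loaded.signatures.keys())
```

Checking loaded.signatures (or running saved_model_cli show on the export directory) before deploying to TF Serving makes this failure mode visible early.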
In the code below, candidaster and data_ds are datasets of tensor dicts.
The model is a two-tower recommendation model where "self.model.query_model" and "self.model.candidate_model" are concatenations of embeddings and some normalizations.
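The two-tower structure described above can be sketched roughly as follows; the vocabularies, embedding size, and normalization choice are all illustrative assumptions, not the reporter's actual towers:

```python
import tensorflow as tf

# Hypothetical two-tower sketch: each tower maps string ids to
# normalized embeddings; scoring is a dot product between towers.
def make_tower(vocab):
    return tf.keras.Sequential([
        tf.keras.layers.StringLookup(vocabulary=vocab),
        tf.keras.layers.Embedding(input_dim=len(vocab) + 2, output_dim=8),
        tf.keras.layers.LayerNormalization(),
    ])

query_model = make_tower(["u1", "u2"])
candidate_model = make_tower(["i1", "i2"])

q = query_model(tf.constant(["u1"]))       # (1, 8) query embedding
c = candidate_model(tf.constant(["i1"]))   # (1, 8) candidate embedding
scores = tf.linalg.matmul(q, c, transpose_b=True)  # (1, 1) dot-product score
```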
Standalone code to reproduce the issue
Relevant log output
No response