Exporting LSTM with TFlite Converter yields model which makes bad predictions in contrast to keras model #55835
Comments
Hi @leeflix! Could you let us know the reason behind not using Select Ops (since it already gives correct predictions)?
When using Select Ops I have to link much bigger binaries.
You can reduce the binary size of your model using selective builds: https://www.tensorflow.org/lite/guide/reduce_binary_size.
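From the linked guide, a selective build bundles only the ops your models actually use. For Android, the sketch below shows roughly what that looks like; the model filename and target architectures are placeholders, not taken from this issue:

```shell
# Selective build sketch (run from a TensorFlow source checkout).
# build_aar.sh produces an AAR containing only the ops referenced by
# the listed .tflite models, which keeps the linked binary small.
bash tensorflow/lite/tools/build_aar.sh \
  --input_models=lstm_model.tflite \
  --target_archs=arm64-v8a,armeabi-v7a
```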
Thanks for the suggestion. I will look into it, but can someone explain why option 2 produces a model that makes bad predictions?
Issue Type
Bug
Source
source
Tensorflow Version
tf 2.8
Custom Code
No
OS Platform and Distribution
No response
Mobile device
No response
Python version
No response
Bazel version
No response
GCC/Compiler version
No response
CUDA/cuDNN version
No response
GPU model and memory
No response
Current Behaviour?
Standalone code to reproduce the issue
Relevant log output
No response