Error with exporting TF2.2.0 model with tf.lookup.StaticHashTable & LSTM layer for Serving #1719
@rmothukuru Please find the code in this Colab notebook: https://colab.research.google.com/drive/1ch89Veylgg-0FzqGeC4QKDui-a01FKzp?usp=sharing. I've added the following sections:
Seems like I solved it! I would appreciate it if you could add a version of this to the documentation so others can use it. To make the variables and the other elements from outside trackable, we need to write the Lambda layer as a subclassed Layer:

```python
class LabelConverter(tf.keras.layers.Layer):

    def __init__(self, **kwargs):
        super(LabelConverter, self).__init__(**kwargs)
        # Implement your StaticHashTable here
        keys = tf.constant([0, 1], dtype=tf.int32)
        values = tf.constant([1000, 2000], dtype=tf.float32)
        table_init = tf.lookup.KeyValueTensorInitializer(keys, values)
        self.table = tf.lookup.StaticHashTable(table_init, -1)

    def build(self, input_shape):
        self.built = True

    def call(self, tensor_input):
        # This block does the transformation on the input
        label_tensor = tf.cast(tensor_input[:, 0], tf.int32)
        score_tensor = tensor_input[:, 1]
        categories_tensor = self.table.lookup(label_tensor)
        return tf.stack((categories_tensor, score_tensor), axis=1)


# Adding on top of an already trained keras model
extra_layer = LabelConverter()(model.output)
hash_table_model = tf.keras.models.Model(inputs=model.input, outputs=extra_layer)

version = 1
name = 'tmp_test_serving'
export_path = f'/data/{name}/{version}'
tf.saved_model.save(hash_table_model, export_path)
```

EDIT: Doesn't work on TF-Serving for some reason!
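For anyone trying to reproduce this, here is a self-contained sketch of the subclassed-Layer approach; the small `Dense` base model is a toy stand-in for the trained classifier (an assumption for illustration, not part of the original code):

```python
import tensorflow as tf

class LabelConverter(tf.keras.layers.Layer):
    """Maps the integer label in column 0 through a StaticHashTable."""
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        keys = tf.constant([0, 1], dtype=tf.int32)
        values = tf.constant([1000.0, 2000.0], dtype=tf.float32)
        init = tf.lookup.KeyValueTensorInitializer(keys, values)
        # Created in __init__ so the layer tracks the table as a resource
        self.table = tf.lookup.StaticHashTable(init, default_value=-1.0)

    def call(self, tensor_input):
        labels = tf.cast(tensor_input[:, 0], tf.int32)
        scores = tensor_input[:, 1]
        return tf.stack((self.table.lookup(labels), scores), axis=1)

# Toy stand-in for the trained classifier (assumption)
inputs = tf.keras.Input(shape=(4,))
base = tf.keras.Model(inputs, tf.keras.layers.Dense(2)(inputs))

hash_table_model = tf.keras.Model(base.input, LabelConverter()(base.output))
print(hash_table_model(tf.zeros((1, 4))).shape)  # (1, 2)
```

Because the table is created inside the layer's `__init__`, it is tracked as part of the layer and ends up in the SavedModel's object graph rather than dangling as an untracked capture.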
I am having a similar issue (see stackoverflow) but I am not sure how to solve it in my case. How can I localize the Tensor
This solution is not working with TF 2.5.0 and TF-Serving! I can export the model and can even load it with Serving, but it fails during prediction.
I have tried to export my text-classification model, built and trained using tf.keras as shown below, and I get the same error. I used TensorFlow 2.7.0 on Ubuntu 18.04 (Google Colab) to train and save the model. My code:

But I get this error:

How do I fix this and save the model properly?
Also facing this issue; it seems almost impossible to fix :/ Has anybody made progress since then?
Also facing this issue, how to fix? |
Facing the same issue which seems to have been open for 2 years now. Anything new? |
To anyone facing this issue, make sure you're not defining trainable layers as class attributes in your sub-classes. It produced a similar error in my case. |
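To illustrate what this likely means: Keras tracks sublayers through attribute assignment on the instance, so a layer defined at the class level bypasses that tracking. A minimal sketch (`BadModel` and `GoodModel` are illustrative names, not from the thread):

```python
import tensorflow as tf

class BadModel(tf.keras.Model):
    # Anti-pattern: a layer created as a class attribute never passes
    # through the instance's __setattr__, so Keras may fail to track
    # its variables when the model is saved.
    dense = tf.keras.layers.Dense(4)

    def call(self, x):
        return self.dense(x)

class GoodModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        # Assigning the layer in __init__ lets Keras track it
        self.dense = tf.keras.layers.Dense(4)

    def call(self, x):
        return self.dense(x)

good = GoodModel()
good(tf.zeros((1, 3)))
print(len(good.trainable_variables))  # 2 (kernel + bias)
```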
Hi @stefan-falk, I think the part of the error pointing to the static hash table is quite explicit:
I may be able to offer a solution, as I have recently encountered this error in a similar way. It turns out we need to save the HashTable as one of the model's properties. In this specific example:

```python
## create an instance
tf_model_wrapper = TFModel(model)

# trying to create a concrete_function as mentioned in the github issue
concrete_fn = tf_model_wrapper.f().get_concrete_function(comment=tf.TensorSpec([None], tf.string))

## save the model to disk (serialize it)
model_to_save = tf_model_wrapper.model
model_to_save.hash_table = tf_model_wrapper.hash_table
tf.keras.models.save_model(
    model=model_to_save,
    filepath='/content/complex_nw_v1',
    signatures={'serving_default': concrete_fn})
```

The actual name of the attribute probably doesn't matter.
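A minimal, self-contained variant of this idea can be sketched with a plain `tf.Module`. The names `TFModel`, `f`, and `hash_table` follow the snippet above and are not TensorFlow API; the trained keras model and its preprocessing are omitted for brevity, so this is a sketch of the tracking mechanism, not a drop-in implementation:

```python
import tensorflow as tf

class TFModel(tf.Module):
    def __init__(self):
        super().__init__()
        init = tf.lookup.KeyValueTensorInitializer(
            tf.constant(['negative', 'positive']),
            tf.constant([0, 1], dtype=tf.int64))
        # Stored as an attribute so the SavedModel tracks the table
        self.hash_table = tf.lookup.StaticHashTable(init, default_value=-1)

    @tf.function
    def f(self, comment):
        return {'label_id': self.hash_table.lookup(comment)}

wrapper = TFModel()
concrete_fn = wrapper.f.get_concrete_function(
    comment=tf.TensorSpec([None], tf.string))

tf.saved_model.save(
    wrapper, '/tmp/hash_table_export/1',
    signatures={'serving_default': concrete_fn})

loaded = tf.saved_model.load('/tmp/hash_table_export/1')
print(loaded.f(tf.constant(['positive', 'unknown'])))
```

Because the table is an attribute of the saved object, it appears in the SavedModel's object graph and the exported signature can resolve it at serving time.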
@spate141 / All,

Also, you can try saving the static HashTable as one of the model's properties, as shown in the comment above. Thank you!
Thanks @singhniraj08 for the reply! Closing this issue for now since I'm no longer working with this error, but hopefully someone who encounters something similar in the future will find the relevant help they need here.
This comment worked for me.
System information
Related Issue with TF-Serving documentation:
As mentioned in #1606, we need better documentation about exporting TF2.x models that involve StaticHashTables to TF-Serving. Simply disabling eager execution works in most cases, but if you're using the new LSTM layer from TF2.2.0, it won't give you the power of CUDA: as mentioned here, a key requirement of the cuDNN LSTM implementation is that "Eager execution is enabled in the outermost context" (requirement 7).
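For context, the cuDNN-backed LSTM kernel is only selected when the layer keeps its defaults and eager execution is on; a quick sanity check might look like this (a sketch, not from the thread):

```python
import tensorflow as tf

# Requirements for the cuDNN kernel (per the TF2 LSTM docs): default
# activation/recurrent_activation, recurrent_dropout=0, unroll=False,
# use_bias=True, and eager execution enabled in the outermost context.
assert tf.executing_eagerly()

lstm = tf.keras.layers.LSTM(8)   # defaults -> cuDNN-eligible on GPU
out = lstm(tf.zeros((2, 5, 3)))  # (batch, timesteps, features)
print(out.shape)  # (2, 8)
```

Calling `tf.compat.v1.disable_eager_execution()` makes the first assertion fail, which is exactly why the TF1-style export path forfeits the cuDNN kernel.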
Related post on tensorflow/tensorflow:
tensorflow/tensorflow#42325
My Issue:
I'm using `StaticHashTable` in one Lambda layer after the output layer of my tf.keras model. It's quite simple actually: I have a text classification model, and I'm adding a simple Lambda layer that takes the `model.output` and converts the `model_id` to more general labels. I can save this version of the model with `model.save(... as H5 format ...)` without any issue, and I can load it back and use it without any problem.

The issue is, when I try to export my TF2.2.0 model for TF-Serving, I can't find how to export it. Here is what I can do with TF1.X, or with TF2.X + `tf.compat.v1.disable_eager_execution()`:

This will save my model in the TF1.X format for serving, and I can use it without any issue. The thing is, I'm using an LSTM layer and I want to use my model on GPU. Per the documentation, if I disable eager mode, I can't use the GPU version of the LSTM with TF2.2. And without going through the above-mentioned code, I can't save my model for serving per the TF2.2 standard with StaticHashTables.
Here is how I'm trying to export my TF2.2 model, which uses a StaticHashTable in the final layer, and which gives the error below:
Error:
Any suggestions, or am I missing anything, on exporting a TF2.2 model that uses `StaticHashTable` in the final Lambda layer for TensorFlow Serving? Thanks!