[Small_Bert] Could not find matching concrete function to call loaded from the SavedModel. #837
From a quick glance, it looks like your input layers are 1D. Could you try changing the shapes of the input layers?
I managed to change these:

```python
input_word_ids = tf.keras.Input(shape=(None, 128), dtype=tf.int32, name="input_word_ids")
input_mask = tf.keras.Input(shape=(None, 128), dtype=tf.int32, name="input_mask")
input_type_ids = tf.keras.Input(shape=(None, 128), dtype=tf.int32, name="input_type_ids")
```

I am getting something similar.
Looking at the error, there is still a shape mismatch ("Call argument received: ..."; same for the other inputs). The batch dimension gets added automatically, so you probably want to specify the input spec as (128,), which will result in (None, 128).

You can also follow a more streamlined example and directly pass the output of the pre-processor into the encoder, assuming the preprocessing is one of the supported preprocessors (e.g. https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3) or is compatible with the expected interface.

Detailed example: https://colab.sandbox.google.com/github/tensorflow/text/blob/master/docs/tutorials/bert_glue.ipynb#scrollTo=KeHEYKXGqjAZ
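As a minimal sketch of the shape advice above (layer names taken from the snippet earlier in this thread): passing only the per-example shape lets Keras prepend the batch axis itself.

```python
import tensorflow as tf

# shape=(128,) is the per-example shape; Keras prepends the batch axis,
# so each resulting placeholder has shape (None, 128).
input_word_ids = tf.keras.Input(shape=(128,), dtype=tf.int32, name="input_word_ids")
input_mask = tf.keras.Input(shape=(128,), dtype=tf.int32, name="input_mask")
input_type_ids = tf.keras.Input(shape=(128,), dtype=tf.int32, name="input_type_ids")

print(input_word_ids.shape)  # (None, 128)
```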
I managed to apply the following approach, available here, which uses the preprocessing layer to turn text into encoder inputs:

```python
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401 -- registers the ops the preprocessor needs

with tf.device('/cpu:0'):
    train_data = tf.data.Dataset.from_tensor_slices(
        (data_train.data, tf.keras.utils.to_categorical(data_train.target)))
    valid_data = tf.data.Dataset.from_tensor_slices(
        (data_test.data, tf.keras.utils.to_categorical(data_test.target)))

for text, label in train_data.take(1):
    print(text)
    print(label)

def build_classifier_model():
    text_input = tf.keras.layers.Input(shape=(), dtype=tf.string, name='text')
    preprocessing_layer = hub.KerasLayer(
        "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3")
    encoder = hub.KerasLayer(
        "https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-4_H-512_A-8/2",
        trainable=True, name='BERT_encoder')
    encoder_inputs = preprocessing_layer(text_input)
    outputs = encoder(encoder_inputs)
    net = outputs['pooled_output']
    net = tf.keras.layers.Dropout(0.1)(net)
    # activation=None: the classifier head outputs logits.
    net = tf.keras.layers.Dense(10, activation=None, name='classifier')(net)
    return tf.keras.Model(text_input, net)

classifier_model = build_classifier_model()
classifier_model.compile(
    optimizer="adam",
    # from_logits=True because the final Dense layer has no activation.
    loss=tf.keras.losses.CategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"])
history = classifier_model.fit(x=train_data,
                               validation_data=valid_data,
                               epochs=10)
```

output:
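One detail worth checking in the snippet above (an observation about the code, not a fix confirmed in this thread): the datasets are passed to `fit` without being batched, while `Model.fit` expects a `tf.data.Dataset` to yield batches. A minimal sketch with made-up toy data:

```python
import tensorflow as tf

# from_tensor_slices yields one scalar string at a time; calling .batch()
# makes each element a batch of strings, which is what the BERT
# preprocessor and Model.fit expect. Batch size 2 is arbitrary.
texts = tf.constant(["sample one", "sample two", "sample three", "sample four"])
labels = tf.constant([[1.0, 0.0], [0.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
train_data = (
    tf.data.Dataset.from_tensor_slices((texts, labels))
    .shuffle(buffer_size=4)
    .batch(2)
)

for text_batch, label_batch in train_data.take(1):
    print(text_batch.shape, label_batch.shape)  # (2,) (2, 2)
```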
I was having the same issue using tf.data.Dataset.from_tensor_slices() with the data in a single text file loaded into a dataframe. I restructured the data so I could load it using tf.keras.utils.text_dataset_from_directory, and everything worked fine.
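The restructuring described above can be sketched as follows (the folder names and sample strings are made up for illustration): `text_dataset_from_directory` expects one sub-directory per class, with one example per `.txt` file, and infers the labels from the directory names.

```python
import os
import tempfile
import tensorflow as tf

# Hypothetical layout: <root>/<class_name>/<example>.txt
root = tempfile.mkdtemp()
samples = {"pos": ["great movie", "loved it"], "neg": ["terrible", "boring plot"]}
for label, texts in samples.items():
    os.makedirs(os.path.join(root, label), exist_ok=True)
    for i, text in enumerate(texts):
        with open(os.path.join(root, label, f"{i}.txt"), "w") as f:
            f.write(text)

# Labels come from the sorted directory names ("neg" -> 0, "pos" -> 1).
train_data = tf.keras.utils.text_dataset_from_directory(root, batch_size=2)
for texts, labels in train_data.take(1):
    print(texts.dtype)  # <dtype: 'string'>
```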
Apologies for the delay. While using the BERT preprocessing model from TF Hub, please make sure that tensorflow and tensorflow-text are installed at matching versions.
If your issue got resolved with the above workaround, please feel free to close this issue. Thank you!
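A sketch of the version-matching install (the pinned versions below mirror the environment reported in this issue and are an assumption, not the only valid pairing):

```shell
# Keep tensorflow and tensorflow-text on the same minor version;
# 2.7 matches the TF v2.7.0 / TF Text v2.7.3 reported in this issue.
pip install -U "tensorflow==2.7.*" "tensorflow-text==2.7.*" tensorflow-hub
```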
Hi, @jeankhawand. Closing this issue due to a lack of recent activity for a couple of weeks. Please feel free to reopen the issue with more details (if possible, please share the complete code with the dataset so we can troubleshoot and find the root cause) if the problem still persists after trying the above workaround. Thank you!
@gaikwadrahul8 I have the exact same problem as @jeankhawand. I followed your advice about making sure tensorflow and tensorflow-text are the same version. This did not fix this issue for me. Do you have any ideas of other things to try in order to resolve this error? |
I am trying to build a text classification program with Small BERT using the following code. That is how the output of `text` and `label` looks from the Dataset `train_data`. While running this example, I am getting the following output error:
Resources
- Classify Text with BERT
- BERT Layer
- BERT Preprocessing Layer
- TensorFlow v2.7.0
- TensorFlow Hub v0.12.0
- TensorFlow Text v2.7.3
- Python v3.7.12