When I try to get the attention map as shown in https://github.com/CyberZHG/keras-bert/blob/master/demo/load_model/load_and_get_attention_map.py, I get the following error:
Output tensors of a Functional model must be the output of a TensorFlow `Layer` (thus holding past layer metadata). Found: tf.Tensor

Minimal code to reproduce:
```python
inputs = model.inputs[:2]
outputs = model.get_layer('Encoder-1-MultiHeadSelfAttention').attention
model = keras.models.Model(inputs, outputs)
```
I have also tried using the function shown in the example, and it raises a similar error.
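For reference, a workaround sometimes suggested for this class of error is to skip building a second functional `Model` and instead evaluate the attention tensor with a backend function. This is only a sketch under the assumption that `model` is the loaded keras-bert model from the demo and that a graph-mode (TF 1.x-style) backend is in use; I have not verified that it resolves the issue:

```python
import keras.backend as K

# Grab the raw attention tensor; it is a plain tf.Tensor without
# Keras layer metadata, which is what trips up keras.models.Model.
attention_tensor = model.get_layer('Encoder-1-MultiHeadSelfAttention').attention

# Compile a backend function that maps the model inputs to the tensor,
# bypassing the Functional-model metadata check entirely.
get_attention = K.function(model.inputs[:2], [attention_tensor])

# Hypothetical usage: token_input and segment_input are arrays
# prepared exactly as in the demo script.
# attention_map = get_attention([token_input, segment_input])[0]
```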