I need to add attention to my following model. It works perfectly for an LSTM model, but I get the error below:
```python
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.regularizers import L1L2
# Attention is the project's attention layer.

def get_ANN_attention_model(num_hidden_layers, num_neurons_per_layer,
                            dropout_rate, activation_func, train_X):
    with tf.device('/gpu:0'):
        model_input = tf.keras.Input(shape=(train_X.shape[1],))  # input layer
        x = model_input
        for i in range(num_hidden_layers):
            # Feed the previous layer's output, not model_input,
            # so the hidden layers are actually stacked.
            x = layers.Dense(num_neurons_per_layer,
                             activation=activation_func,
                             bias_regularizer=L1L2(l1=0.0, l2=0.0001),
                             activity_regularizer=L1L2(1e-5, 1e-4))(x)
            x = layers.Dropout(dropout_rate)(x)
        x = Attention(num_hidden_layers)(x)
        outputs = layers.Dense(1, activation='linear')(x)
        model = tf.keras.Model(inputs=model_input, outputs=outputs)
        model.summary()
        return model
```
ERROR

```
hidden_size = int(hidden_states.shape[2])
  File "C:\Users\bhask\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\framework\tensor_shape.py", line 896, in __getitem__
    return self._dims[key].value
IndexError: list index out of range
```
@bhaskatripathi thanks for your interest in the project. The attention layer expects a 3-D input of shape (batch_size, time_steps, input_dim), so it will not work for an MLP: the input you would give it is 2-D, which is why indexing `shape[2]` fails.
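As an illustration of the shape requirement, here is a minimal sketch (assuming TensorFlow 2.x) that inserts a length-1 "time" axis with `Reshape` so a dense feature vector becomes 3-D before attention. It uses Keras's built-in `layers.Attention` for self-attention; this project's own `Attention` layer may have a different call signature, so treat the layer choice as an assumption.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

features = 8  # toy tabular input: (batch, features), as in the MLP above
inputs = tf.keras.Input(shape=(features,))
x = layers.Dense(16, activation='relu')(inputs)

# Attention needs (batch_size, time_steps, input_dim); a Dense output has no
# time axis, so add a trivial time_steps=1 dimension.
x = layers.Reshape((1, 16))(x)

# Self-attention over the (length-1) time axis via Keras's built-in layer.
x = layers.Attention()([x, x])

x = layers.Flatten()(x)
outputs = layers.Dense(1, activation='linear')(x)
model = tf.keras.Model(inputs, outputs)

pred = model.predict(np.random.rand(4, features), verbose=0)
print(pred.shape)  # (4, 1)
```

Note that with a single time step the attention weights are trivially 1, so this only resolves the shape error; attention is meaningful when there is a real sequence axis, as with the LSTM.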