Author: Pham Quang Nhat Minh
```python
import keras
# Convert labels (a vector of integer class ids) to a one-hot encoding
one_hot_labels = keras.utils.to_categorical(labels, num_classes=10)
```
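`to_categorical` maps each integer class id to a row vector with a single 1 at that index. A minimal numpy equivalent (the function name `to_one_hot` is mine, for illustration only):

```python
import numpy as np

def to_one_hot(labels, num_classes):
    """Minimal numpy sketch of keras.utils.to_categorical:
    index an identity matrix with the integer labels."""
    labels = np.asarray(labels, dtype=int)
    return np.eye(num_classes)[labels]

one_hot = to_one_hot([0, 2, 9], num_classes=10)  # shape (3, 10)
```

Each row sums to 1, with the 1 sitting at the position of the original label.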
```python
# For custom metrics: any function taking (y_true, y_pred) and
# returning a backend tensor can be passed to `metrics`
import keras.backend as K

def mean_pred(y_true, y_pred):
    return K.mean(y_pred)

model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy', mean_pred])
```
- keras-attention-mechanism
- How to Develop an Encoder-Decoder Model with Attention for Sequence-to-Sequence Prediction in Keras
- How to add Attention on top of a Recurrent Layer (Text Classification)
- Position-based Content Attention for Time Series Forecasting with Sequence-to-sequence RNNs
- keras-language-modeling
- Understanding emotions — from Keras to pyTorch
- cbaziotis/Attention.py
- Attention in Long Short-Term Memory Recurrent Neural Networks
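The "attention on top of a recurrent layer" pattern covered in the links above reduces to: score each timestep's hidden state, softmax the scores into weights, and return the weighted sum of the states as a context vector. A minimal numpy sketch of that pooling step (names and the dot-product scoring are my simplification, not taken from any of the repos listed):

```python
import numpy as np

def softmax(x):
    # Subtract the max for numerical stability before exponentiating
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(H, w):
    """Attention pooling over RNN hidden states.
    H: (timesteps, hidden_dim) hidden states, w: (hidden_dim,) learned
    scoring vector. Returns the context vector and attention weights."""
    scores = H @ w            # one score per timestep, shape (timesteps,)
    alphas = softmax(scores)  # attention weights, non-negative, sum to 1
    context = alphas @ H      # weighted sum of states, shape (hidden_dim,)
    return context, alphas

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))   # e.g. 5 timesteps of LSTM output
w = rng.normal(size=8)
context, alphas = attention_pool(H, w)
```

In a real Keras layer, `w` would be a trainable weight and the same computation would run batched over backend tensors.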
```
pip install keras --upgrade
```
```python
from keras.models import Sequential
from keras.layers import Dense, PReLU

# Advanced activations such as PReLU are added as standalone layers,
# not passed via the `activation` argument (Keras 2 API; the old
# Keras 1 arguments `init=` and `weights=` are gone)
model = Sequential()
model.add(Dense(64, input_dim=14, kernel_initializer='uniform'))
model.add(PReLU(alpha_initializer='zeros'))
```
Reference:
https://stackoverflow.com/questions/34717241/how-to-use-advanced-activation-layers-in-keras
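PReLU computes `f(x) = x` for positive inputs and `f(x) = alpha * x` otherwise, where `alpha` is learned during training. A numpy sketch with `alpha` as a plain float (in Keras it is a trainable per-channel weight):

```python
import numpy as np

def prelu(x, alpha):
    """PReLU: identity for positive inputs, alpha-scaled otherwise.
    alpha is fixed here; Keras learns it as a layer weight."""
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * x)

# With alpha initialized to zero (the 'zeros' initializer above),
# PReLU starts out behaving exactly like ReLU
y = prelu([-2.0, -0.5, 0.0, 1.5], alpha=0.0)
```

As training pushes `alpha` away from zero, negative inputs start leaking through instead of being clipped.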