
Keras Ordered Neurons LSTM


[中文|English]

Unofficial implementation of ON-LSTM.

Install

pip install keras-ordered-neurons

Usage

Basic

It is used much like a standard LSTM, but by default it also requires a chunk_size argument, which is the factor by which the master gates are shrunk:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, Dense

from keras_ordered_neurons import ONLSTM

model = Sequential()
model.add(Embedding(input_shape=(None,), input_dim=10, output_dim=100))
# chunk_size shrinks the master gates by that factor (here 50 units -> 10 master-gate neurons).
model.add(Bidirectional(ONLSTM(units=50, chunk_size=5)))
model.add(Dense(units=2, activation='softmax'))
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
model.summary()
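
As a quick sanity check, the model above can be fitted on random data. The shapes and labels below are hypothetical and only illustrate the expected input format (integer token ids, sparse integer labels):

import numpy as np

# Hypothetical data: 32 sequences of length 20 with token ids in [0, 10), binary labels.
x = np.random.randint(0, 10, size=(32, 20))
y = np.random.randint(0, 2, size=(32,))
model.fit(x, y, epochs=1, batch_size=8)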

DropConnect

The recurrent_dropconnect argument sets the probability of randomly zeroing out entries of the recurrent (hidden-state) weight matrix:

from keras_ordered_neurons import ONLSTM

# Zero out recurrent weights with probability 0.2 (DropConnect).
ONLSTM(units=50, chunk_size=5, recurrent_dropconnect=0.2)
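
For context, the layer can be dropped into the same architecture as the basic example; the hyper-parameter values below are illustrative, not recommendations:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, Dense

from keras_ordered_neurons import ONLSTM

model = Sequential()
model.add(Embedding(input_shape=(None,), input_dim=10, output_dim=100))
# Same architecture as the basic example, with DropConnect on the recurrent weights.
model.add(Bidirectional(ONLSTM(units=50, chunk_size=5, recurrent_dropconnect=0.2)))
model.add(Dense(units=2, activation='softmax'))
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')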

Getting expected split points

Set return_splits to True to return the expected split points of the master forget gate and the master input gate:

from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Embedding

from keras_ordered_neurons import ONLSTM

inputs = Input(shape=(None,))
embed = Embedding(input_dim=10, output_dim=100)(inputs)
# return_splits=True adds a second output with the expected split points of the master gates.
outputs, splits = ONLSTM(units=50, chunk_size=5, return_sequences=True, return_splits=True)(embed)
model = Model(inputs=inputs, outputs=splits)
model.compile(optimizer='adam', loss='mse')
model.summary(line_length=120)
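
The split model above can then be called on a batch of token ids to inspect the predicted split points; the input below is hypothetical random data, used only to show the call:

import numpy as np

# Hypothetical input: 4 sequences of length 12 with token ids in [0, 10).
x = np.random.randint(0, 10, size=(4, 12))
split_points = model.predict(x)
print(split_points.shape)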