Implementation Request: dropout between gates in LSTM #335
supakjk opened this issue on Sep 2, 2016

http://arxiv.org/abs/1512.05287 describes applying dropout between the gates of an LSTM. It would be great if such "inside" dropout could also be used easily in rnn modules like FastLSTM. Keras already supports it: https://keras.io/layers/recurrent/
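For readers unfamiliar with the paper: the technique samples one dropout mask per sequence and reuses it at every timestep on the recurrent connections, instead of resampling per step. A minimal, hypothetical Lua/Torch sketch of the idea (this is not the rnn package's API; `h2g` and the `recurrentStep` helper are illustrative names):

```lua
require 'nn'

local hiddenSize, p = 128, 0.5

-- One Bernoulli mask per sequence, scaled by 1/(1-p) (inverted dropout),
-- reused at every timestep, unlike nn.Dropout, which resamples each call.
local mask = torch.Tensor(1, hiddenSize):bernoulli(1 - p):div(1 - p)

-- Hypothetical per-timestep helper: the previous hidden state is masked
-- before the single projection that produces all four gate pre-activations.
local h2g = nn.Linear(hiddenSize, 4 * hiddenSize)
local function recurrentStep(prevOutput)   -- prevOutput: batch x hiddenSize
   local dropped = torch.cmul(prevOutput, mask:expandAs(prevOutput))
   return h2g:forward(dropped)             -- batch x 4*hiddenSize gates
end
```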
Comments

@supakjk Bayesian dropout is implemented for the GRU but not yet for the LSTM. However, I've recently implemented it for my experiments. Once testing is done, I'll send a PR including test cases.
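For context, the GRU's Bayesian dropout is exposed through its constructor. A minimal usage sketch, assuming a `nn.GRU(inputSize, outputSize [, rho [, p]])` signature where `p` is the dropout probability (check the rnn README/source for the exact argument order):

```lua
require 'rnn'

-- Assumed signature: nn.GRU(inputSize, outputSize, rho, p), with p the
-- probability for the Bayesian dropout described in the comment above.
local gru = nn.GRU(100, 128, 9999, 0.5)  -- rho = 9999: effectively unbounded BPTT
local seq = nn.Sequencer(gru)            -- wrap for sequence input
```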
It would also be great if we could easily set the initial forget-gate bias.
Can you check the implementation? You can use nn.LSTM or nn.FastLSTM and access their submodules directly.
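A minimal sketch of that submodule access, assuming FastLSTM exposes its combined input-to-gate projection as the `i2g` field and uses the gate order {input, hidden, forget, output} (verify both against the installed rnn source):

```lua
require 'rnn'

local inputSize, hiddenSize = 100, 128
local lstm = nn.FastLSTM(inputSize, hiddenSize)

-- FastLSTM is assumed to compute all four gates with one nn.Linear exposed
-- as lstm.i2g, whose bias has length 4*hiddenSize. Under the assumed gate
-- order {input, hidden, forget, output}, the forget-gate slice is the third
-- block; filling it with 1 gives the common "remember by default" init.
lstm.i2g.bias:narrow(1, 2*hiddenSize + 1, hiddenSize):fill(1)
```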
Refer to #382.