
Implementation Request: dropout between gates in LSTM #335

Closed
supakjk opened this issue Sep 2, 2016 · 4 comments · Fixed by #382

Comments

supakjk (Contributor) commented Sep 2, 2016

http://arxiv.org/abs/1512.05287 shows the use of dropout between the gates in an LSTM. It would be great if we could also easily use such "inside" dropout in rnn modules like FastLSTM.
Keras already supports it: https://keras.io/layers/recurrent/
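
For reference, a minimal sketch of what the Keras feature looks like, assuming current tf.keras (the Keras 1.x docs linked above exposed the same idea as dropout_W / dropout_U, if I recall correctly). This is illustrative only, not the rnn package's API.

```python
# Minimal tf.keras sketch (not the rnn package's API): `recurrent_dropout`
# applies dropout to the recurrent (hidden-to-gate) transforms, i.e. the
# "inside" dropout requested here; `dropout` masks the input transforms.
import tensorflow as tf

layer = tf.keras.layers.LSTM(
    units=128,
    dropout=0.2,            # dropout on the input-to-gate transforms
    recurrent_dropout=0.2,  # dropout on the recurrent (hidden-to-gate) transforms
)
x = tf.random.normal((4, 10, 32))   # (batch, time, features)
y = layer(x, training=True)         # training=True so the dropout is actually applied
```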

supakjk changed the title from "Implementation request: dropout between gates in LSTM" to "Implementation Tequest: dropout between gates in LSTM" on Sep 2, 2016
jnhwkim (Contributor) commented Sep 8, 2016

@supakjk Bayesian dropout is implemented for GRU, but not yet for LSTM. However, I've recently implemented it for my own experiments. Once testing is done, I'll send a PR including test cases.
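
For anyone following along, here is a rough NumPy sketch of the idea from the paper (variational/Bayesian dropout): one dropout mask is sampled per sequence and reused at every timestep, on both the input and the recurrent connections. Names and shapes are illustrative only, not the rnn package's implementation.

```python
# NumPy sketch of an LSTM forward pass with "variational" dropout, in the
# spirit of Gal & Ghahramani (arXiv:1512.05287): the masks are sampled once
# per sequence, not once per timestep. Illustrative only.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(x_seq, W, U, b, p=0.25, rng=None):
    """x_seq: (T, input_size); W: (input_size, 4*H); U: (H, 4*H); b: (4*H,)."""
    if rng is None:
        rng = np.random.default_rng(0)
    T, input_size = x_seq.shape
    H = U.shape[0]
    keep = 1.0 - p
    # Sample the dropout masks once per sequence (inverted-dropout scaling).
    mask_x = (rng.random(input_size) < keep) / keep
    mask_h = (rng.random(H) < keep) / keep
    h = np.zeros(H)
    c = np.zeros(H)
    for t in range(T):
        # The same masks are reused at every timestep.
        z = (x_seq[t] * mask_x) @ W + (h * mask_h) @ U + b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
    return h, c

# Tiny usage example with random weights.
H, D, T = 8, 5, 10
rng = np.random.default_rng(1)
W = rng.standard_normal((D, 4 * H)) * 0.1
U = rng.standard_normal((H, 4 * H)) * 0.1
b = np.zeros(4 * H)
h, c = lstm_forward(rng.standard_normal((T, D)), W, U, b, p=0.25)
```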

supakjk changed the title from "Implementation Tequest: dropout between gates in LSTM" to "Implementation Request: dropout between gates in LSTM" on Sep 9, 2016
supakjk (Contributor, Author) commented Sep 9, 2016

It would also be great if we could easily set the initial forget gate bias.
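
Not the rnn package's API, but for comparison this is how the same knob looks in tf.keras, where unit_forget_bias=True initializes the forget-gate bias to 1:

```python
# tf.keras sketch (illustrative only, not the rnn package's API): with
# unit_forget_bias=True the forget-gate bias is initialized to 1, a common
# trick to help the cell retain information early in training.
import tensorflow as tf

lstm = tf.keras.layers.LSTM(units=128, unit_forget_bias=True)
```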

jnhwkim (Contributor) commented Oct 20, 2016

Can you check the implementation? You can use nn.LSTM or nn.FastLSTM.

You can access the submodule via self.recurrentModule.

jnhwkim (Contributor) commented Jan 26, 2017

Refer to #382.
