Dropout hidden-to-hidden transition within an RNN #13103
Comments
@ebrevdo, can you take a look at this?

Nagging Awaiting TensorFlower: It has been 14 days with no activity and the
I believe we support this now.
@ebrevdo Can you please comment on which TensorFlow function supports this?
The DropoutWrapper should support intelligent hidden-to-hidden dropout.
@ebrevdo Can DropoutWrapper achieve DropConnect?
DropoutWrapper allows applying dropout to the cell's inputs, outputs, or states. However, I haven't seen an option to do the same for the recurrent weights of the cell (for example, 4 of the 8 weight matrices used in the original LSTM formulation). I am referring specifically to the hidden-to-hidden transition within an RNN; see, for example, Section 2 of https://arxiv.org/abs/1708.02182
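For reference, the weight-dropped LSTM of Section 2 in the paper above uses DropConnect: it zeroes individual entries of the recurrent weight matrix once per forward pass, then reuses that dropped matrix at every time step. A hedged NumPy sketch of the masking step (the function name and shapes are my own, independent of any TensorFlow API):

```python
import numpy as np

def drop_connect(weight, keep_prob, rng=None):
    """DropConnect: zero individual entries of a weight matrix.

    Applied to the hidden-to-hidden matrix once per forward pass (not
    per time step), this is the masking used by the weight-dropped
    LSTM of arXiv:1708.02182, Section 2. Kept entries are scaled by
    1/keep_prob to preserve the expected value.
    """
    rng = np.random.default_rng(rng)
    mask = (rng.random(weight.shape) < keep_prob) / keep_prob
    return weight * mask

# Stand-in recurrent weight matrix; the dropped version would then be
# used unchanged for every step of the sequence.
U = np.ones((4, 4))
U_dropped = drop_connect(U, keep_prob=0.75, rng=1)
```

The key difference from activation dropout is that the mask lives on the weights, so even units that are "kept" see a perturbed recurrent transition.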