
Question about back propagation #9

Open
dongyp13 opened this issue Apr 13, 2016 · 0 comments
@dongyp13
In cudamat/cudamat_kernels.cu, line 1934, the code is "if (!init) d_c_in[p] = grad_c;".

But the first state of both the decoder and the future LSTM is initialized from the last state of the encoder, so I think the gradient of the encoder's last cell state should be propagated back from both the decoder and the future LSTM. Should the code be "if (!init) d_c_in[p] += grad_c;" instead?
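To make the distinction concrete, here is a minimal, self-contained CUDA sketch. It is not the actual cudamat kernel; the kernel name `backprop_cell_grad` and its `accumulate` flag are hypothetical and only illustrate the difference between overwriting the stored gradient ("=") and summing the contributions ("+=") when two consumers (e.g. the decoder and the future LSTM) both backpropagate into the same encoder state.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy backward step that writes the gradient w.r.t. the previous cell state.
// `accumulate` contrasts the current behaviour (overwrite) with the proposed
// behaviour (sum gradients arriving from several consumers of the same state).
__global__ void backprop_cell_grad(const float* grad_c, float* d_c_in,
                                   int n, bool init, bool accumulate) {
    int p = blockIdx.x * blockDim.x + threadIdx.x;
    if (p >= n) return;
    if (!init) {
        if (accumulate)
            d_c_in[p] += grad_c[p];   // proposed: sum contributions from all children
        else
            d_c_in[p]  = grad_c[p];   // current: a later caller overwrites an earlier gradient
    }
}

int main() {
    const int n = 4;
    float h_grad_dec[n] = {1, 1, 1, 1};   // gradient flowing back from the decoder
    float h_grad_fut[n] = {2, 2, 2, 2};   // gradient flowing back from the future LSTM
    float *d_grad, *d_c_in;
    cudaMalloc(&d_grad, n * sizeof(float));
    cudaMalloc(&d_c_in, n * sizeof(float));
    cudaMemset(d_c_in, 0, n * sizeof(float));

    // Backprop from the decoder, then from the future LSTM, accumulating both.
    cudaMemcpy(d_grad, h_grad_dec, n * sizeof(float), cudaMemcpyHostToDevice);
    backprop_cell_grad<<<1, n>>>(d_grad, d_c_in, n, false, true);
    cudaMemcpy(d_grad, h_grad_fut, n * sizeof(float), cudaMemcpyHostToDevice);
    backprop_cell_grad<<<1, n>>>(d_grad, d_c_in, n, false, true);

    float h_out[n];
    cudaMemcpy(h_out, d_c_in, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("accumulated d_c_in[0] = %f (expect 3.0)\n", h_out[0]);

    cudaFree(d_grad);
    cudaFree(d_c_in);
    return 0;
}
```

With overwrite semantics, only the gradient from whichever branch runs last would survive; with accumulation, the encoder's last cell state receives the sum of both contributions, which is what the chain rule requires when one output feeds two downstream paths.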
