A question #8
Hi @Melika-Ayoughi, and thank you for your interest. I'm not sure I understood your concern; please forgive me if this isn't what you meant. D
No, it's not. I don't have a problem with the order. First of all, W is the same for all lines, so instead of having W_xi, W_xf, W_hi, W_hf, and so on, you have one W.
Great! I still don't understand how your implementation has all these weights and not just one weight. Can you explain that? You only convolve x_t with W once, so I assumed there would be only one W, not 8!
Sure! It's a common trick when dealing with the LSTM family: since all the gates take the same inputs, their weights can be stacked into a single convolution whose output is then split channel-wise into the four gate pre-activations.
Does this make sense? D
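The fused-weights trick described above can be sketched as follows. This is not the repository's actual code: sizes are hypothetical, and 1×1 kernels are used so each "convolution" reduces to a plain matrix multiply per pixel, keeping the example in NumPy.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical channel counts (not taken from the repo).
in_ch, hid_ch = 3, 4
rng = np.random.default_rng(0)

# One fused weight: 4 * hid_ch output channels, (in_ch + hid_ch) inputs,
# instead of four separate per-gate weight pairs.
W = rng.standard_normal((4 * hid_ch, in_ch + hid_ch))
b = np.zeros(4 * hid_ch)

def convlstm_gates(x, h):
    """One 'convolution' over [x; h], then a channel-wise split into gates."""
    combined = np.concatenate([x, h])   # stack input and hidden state
    z = W @ combined + b                # single fused transform
    zi, zf, zo, zg = np.split(z, 4)     # four gate pre-activations
    return sigmoid(zi), sigmoid(zf), sigmoid(zo), np.tanh(zg)

x = rng.standard_normal(in_ch)
h = rng.standard_normal(hid_ch)
i, f, o, g = convlstm_gates(x, h)
```

The split gates are then combined with the cell state exactly as in the usual LSTM update; only the pre-activation computation is fused.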
The reported formulation of the ConvLSTM was first introduced in this paper (I believe): https://arxiv.org/pdf/1506.04214.pdf
Hi, I'm curious whether there are any particular reasons for implementing this variant of the LSTM model rather than the original one from the paper https://arxiv.org/pdf/1506.04214.pdf? I'm specifically asking about using … Thank you!
I have a question regarding your implementation:
As I understood it, the original convolutional LSTM formulation is as follows (where * denotes convolution and ∘ the Hadamard product):

i_t = σ(W_xi * x_t + W_hi * h_{t-1} + W_ci ∘ c_{t-1} + b_i)
f_t = σ(W_xf * x_t + W_hf * h_{t-1} + W_cf ∘ c_{t-1} + b_f)
c_t = f_t ∘ c_{t-1} + i_t ∘ tanh(W_xc * x_t + W_hc * h_{t-1} + b_c)
o_t = σ(W_xo * x_t + W_ho * h_{t-1} + W_co ∘ c_t + b_o)
h_t = o_t ∘ tanh(c_t)
But in your implementation, you used only one convolution layer. I don't understand how these two correspond to each other: in the formulation, c is only used in the Hadamard products and never in convolutions, but here c and h are both used in convolutions.
In fact, all the weights are shared across the four formulas, although there are 11 distinct weights in the original formulation.
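For what it's worth, the single weight is not shared across gates in the sense of being reused; it is the block arrangement of the separate per-gate weights. A small NumPy check (hypothetical sizes, 1×1 kernels so convolutions reduce to matrix multiplies, and the peephole terms W_ci/W_cf/W_co omitted, as they are in the implementation under discussion) shows the two forms produce identical pre-activations:

```python
import numpy as np

rng = np.random.default_rng(1)
in_ch, hid_ch = 3, 4
x = rng.standard_normal(in_ch)
h = rng.standard_normal(hid_ch)

# Separate weights per gate, as in the paper's formulation.
W_xi, W_hi = rng.standard_normal((hid_ch, in_ch)), rng.standard_normal((hid_ch, hid_ch))
W_xf, W_hf = rng.standard_normal((hid_ch, in_ch)), rng.standard_normal((hid_ch, hid_ch))
W_xo, W_ho = rng.standard_normal((hid_ch, in_ch)), rng.standard_normal((hid_ch, hid_ch))
W_xg, W_hg = rng.standard_normal((hid_ch, in_ch)), rng.standard_normal((hid_ch, hid_ch))

# The fused weight is just their block arrangement.
W = np.block([[W_xi, W_hi],
              [W_xf, W_hf],
              [W_xo, W_ho],
              [W_xg, W_hg]])

# One transform over the concatenated [x; h] ...
fused = W @ np.concatenate([x, h])

# ... equals the four per-gate transforms stacked.
separate = np.concatenate([W_xi @ x + W_hi @ h,
                           W_xf @ x + W_hf @ h,
                           W_xo @ x + W_ho @ h,
                           W_xg @ x + W_hg @ h])
```

So the fused convolution is an implementation detail, not a change to the model: the gates still have independent parameters.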