The call of LSTM may be different from official documentation of Pytorch #31

Closed
chaoxianluo opened this issue May 30, 2020 · 2 comments

chaoxianluo commented May 30, 2020

Hello, I have cloned your code and run it successfully using the trained model you provide, and I got reasonable results. However, I noticed that the call to LSTM in your implementation is different from what the official documentation says.
According to the official documentation, the input of LSTM should have shape (seq_len, batch, input_size), yet in your code it has shape (batch_size, seq_len, input_size).
I wonder whether I referenced a documentation version that does not match, or whether there may be an error in your code.
Thanks for sharing your implementation.

ref documentation: https://pytorch.org/docs/master/generated/torch.nn.LSTM.html
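
For comparison, here is a minimal sketch of the default call described in the documentation, where `batch_first=False` and the input has shape (seq_len, batch, input_size). The layer sizes below are arbitrary and not taken from this repository:

```python
import torch
import torch.nn as nn

# Default nn.LSTM call (batch_first=False): the input has shape
# (seq_len, batch, input_size), as the documentation describes.
# input_size=10 and hidden_size=20 are arbitrary illustration values.
lstm = nn.LSTM(input_size=10, hidden_size=20, num_layers=1)
x = torch.randn(5, 3, 10)    # (seq_len=5, batch=3, input_size=10)
output, (h_n, c_n) = lstm(x)
print(output.shape)          # torch.Size([5, 3, 20]) -> (seq_len, batch, hidden_size)
```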

chaoxianluo changed the title from "The call of LSTM differ from official documentation of Pytorch" to "The call of LSTM may be different from official documentation of Pytorch" on May 30, 2020
@DexterFixxor

Hello,

As you can see in the documentation, the list of arguments includes "batch_first". Setting batch_first=True makes the layer expect input of shape (batch, seq_len, input_size), which explains the difference.
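
A minimal sketch of the batch_first=True call (the layer sizes are arbitrary and not taken from this repository):

```python
import torch
import torch.nn as nn

# With batch_first=True, nn.LSTM expects (batch, seq_len, input_size),
# which matches the (batch_size, seq_len, input_size) shape noted above.
# The sizes below are arbitrary illustration values.
lstm = nn.LSTM(input_size=10, hidden_size=20, num_layers=1, batch_first=True)
x = torch.randn(3, 5, 10)    # (batch=3, seq_len=5, input_size=10)
output, (h_n, c_n) = lstm(x)
print(output.shape)          # torch.Size([3, 5, 20]) -> (batch, seq_len, hidden_size)
# Note: h_n and c_n keep shape (num_layers * num_directions, batch, hidden_size)
# regardless of batch_first.
```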

@chaoxianluo (Author)

OK, I noticed it, thank you very much!
