After padding the RNN output `out` to `padded` with `batch_first=True`, the first dimension of `padded` should be `batch_size`, so the operation `padded[0]` takes only the first element of the batch. This operation seems unusual and hard to understand. Am I wrong? Could someone explain the purpose of this code?
Thanks in advance.
This is from an old PyTorch tutorial that I can no longer find. The purpose is to handle variable-length sequences. Nowadays, it is more common to pad sequences to equal length at batch-creation time, as the official PyTorch examples do: https://github.com/pytorch/examples/tree/master/word_language_model
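A minimal sketch of that pattern, with toy shapes assumed for illustration. Note that `torch.nn.utils.rnn.pad_packed_sequence` returns a `(tensor, lengths)` tuple, so indexing its result with `[0]` extracts the padded tensor itself, not the first element of the batch:

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence, pad_packed_sequence

# Toy variable-length sequences (hypothetical data): lengths 5, 3, 2, feature dim 4
seqs = [torch.randn(5, 4), torch.randn(3, 4), torch.randn(2, 4)]
lengths = torch.tensor([5, 3, 2])

# Pad at batch-creation time, as the official examples do
batch = pad_sequence(seqs, batch_first=True)          # shape (3, 5, 4)

rnn = nn.RNN(input_size=4, hidden_size=8, batch_first=True)
packed = pack_padded_sequence(batch, lengths, batch_first=True)
out, h = rnn(packed)

# pad_packed_sequence returns a (tensor, lengths) tuple
padded = pad_packed_sequence(out, batch_first=True)
tensor = padded[0]                                    # shape (3, 5, 8)
```

With `batch_first=True`, the recovered tensor's first dimension is indeed the batch size, so `padded[0]` on a plain tensor would take one batch element; on the tuple returned by `pad_packed_sequence`, it unpacks the tensor.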