This repository has been archived by the owner on Apr 25, 2023. It is now read-only.

in model.py line 76: context = attn_weights.bmm(encoder_outputs.transpose(0, 1)) # (B,1,N) #16

Closed
Huijun-Cui opened this issue Dec 8, 2018 · 1 comment


@Huijun-Cui

Is this correct? I looked up the explanation of bmm in the official documentation, and it says that
batch1 and batch2 must be 3-D tensors, each containing the same number of matrices.
But as defined earlier, attn_weights has a 2-D shape, so I think there may be a mistake here.

@Hahallo

Hahallo commented Feb 2, 2021

return F.softmax(attn_energies, dim=1).unsqueeze(1)
This line of code makes attn_weights 3-D: unsqueeze(1) inserts a singleton dimension, turning the (B, N) weights into (B, 1, N).
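A minimal sketch of the shape flow being discussed, with assumed dimension names (batch B, sequence length N, hidden size H are placeholders, not taken from the repo):

```python
import torch

B, N, H = 2, 5, 8                                # assumed example sizes
attn_energies = torch.randn(B, N)                # 2-D: one score per encoder step

# unsqueeze(1) is what makes the weights 3-D, as noted in the comment above
attn_weights = torch.softmax(attn_energies, dim=1).unsqueeze(1)  # (B, 1, N)

encoder_outputs = torch.randn(N, B, H)           # seq-first layout, common in PyTorch RNNs

# bmm requires two 3-D tensors with matching batch dimension;
# transpose(0, 1) reshapes encoder_outputs from (N, B, H) to (B, N, H)
context = attn_weights.bmm(encoder_outputs.transpose(0, 1))
# (B, 1, N) @ (B, N, H) -> (B, 1, H)
print(context.shape)
```

So the documented requirement is satisfied: both operands of bmm are 3-D by the time the line in model.py runs.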
