Mask for attention #30

I wonder why we didn't apply a mask in GlobalAttention, since we pad zeros on the right. I found there is actually an applyMask function, but it is not used. Thank you.
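For context, GlobalAttention computes a softmax over source positions, and a mask would exclude the right-padded positions from that softmax. Here is a minimal sketch of the idea, assuming a `(batch, tgt_len, src_len)` score tensor; `masked_attention_scores` and its signature are illustrative, not OpenNMT-py's actual applyMask API:

```python
import torch

def masked_attention_scores(scores, src_lengths):
    """Hypothetical helper illustrating attention masking.

    scores: (batch, tgt_len, src_len) attention logits.
    src_lengths: (batch,) true source lengths before right-padding.
    """
    batch_size, _, src_len = scores.size()
    # Positions at or beyond the true length are padding.
    pad_mask = torch.arange(src_len).unsqueeze(0) >= src_lengths.unsqueeze(1)
    # Set padded scores to -inf so softmax assigns them zero weight.
    scores = scores.masked_fill(pad_mask.unsqueeze(1), float("-inf"))
    return torch.softmax(scores, dim=-1)

scores = torch.randn(2, 3, 5)  # batch=2, tgt_len=3, src_len=5
weights = masked_attention_scores(scores, torch.tensor([5, 3]))
# weights[1, :, 3:] is exactly zero: the padded tail of the second sentence.
```

Without such a mask, padded positions receive nonzero attention weight, which is only harmless when batches contain little or no padding, as the comment below notes.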
Comments
I believe this is an issue, but it is acceptable: since examples are sorted by the length of the source sentence, the lengths of all source sentences within a batch should be the same most of the time during training. Also note that …
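To illustrate the point about sorting, here is a toy sketch (hypothetical, not OpenNMT-py's actual data iterator) of why length-sorted batching leaves almost no padding for a mask to act on:

```python
def make_batches(examples, batch_size):
    """examples: list of (src_tokens, tgt_tokens) pairs (hypothetical format)."""
    # Sort by source length, then slice into consecutive batches.
    examples = sorted(examples, key=lambda ex: len(ex[0]))
    for i in range(0, len(examples), batch_size):
        batch = examples[i:i + batch_size]
        lengths = [len(src) for src, _ in batch]
        # After sorting, max(lengths) - min(lengths) is usually 0, so
        # little or no padding is added and a mask would rarely matter.
        yield batch, max(lengths) - min(lengths)
```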
marcotcr pushed a commit to marcotcr/OpenNMT-py that referenced this issue on Sep 20, 2017:
Task fixes, changes multiworld.display a tiny bit, added a few __init__.py's, and fixed a few docs.
marcotcr pushed a commit to marcotcr/OpenNMT-py that referenced this issue on Sep 20, 2017:

* start
* batch cleanup, prepare for multiagent batch
* oops
* generalize display
* cleaning single vs. multi + comments
* cleaning single vs. multi + comments
* cleaning single vs. multi + comments
* small update
* small fixes
* Tasks (OpenNMT#30) Task fixes, changes multiworld.display a tiny bit, added a few __init__.py's, and fixed a few docs.
* multiagent batch
* start
* batch cleanup, prepare for multiagent batch
* oops
* generalize display
* cleaning single vs. multi + comments
* cleaning single vs. multi + comments
* cleaning single vs. multi + comments
* small update
* small fixes
* multiagent batch
* small
* bug