This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

loss of padding #34

Closed
liujiqiang999 opened this issue Mar 8, 2019 · 2 comments

Comments

@liujiqiang999

Hi, do we need to ignore the loss on padding tokens when we do back-translation? It seems that the code doesn't ignore padding when computing the loss. Thank you very much.

pred_mask = alen[:, None] < len1[None] - 1 # do not predict anything given the last target word

@glample
Contributor

glample commented Mar 8, 2019

We do, no? This line is supposed to set to 0 everything that is longer than the sentences in x1, i.e. padding.
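
A minimal sketch (not the repository's code) of how a mask of this form behaves, assuming alen is a position-index tensor of shape (slen,) and len1 holds the true, unpadded length of each sentence in the batch:

import torch

slen = 7
alen = torch.arange(slen)          # positions 0..slen-1
len1 = torch.tensor([7, 4])        # sentence lengths; the second sentence has 3 padding tokens

pred_mask = alen[:, None] < len1[None] - 1
# Column 0 (length 7): positions 0..5 are True, position 6 (last word) is False.
# Column 1 (length 4): positions 0..2 are True; positions 3..6 (last word and
# all padding) are False, so no loss is computed on padding positions.
print(pred_mask)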

@liujiqiang999
Author

Ah, right. I got it. Thanks.

@glample glample closed this as completed Mar 9, 2019