
MASK tokens #31

Closed
JohannaOm opened this issue Mar 31, 2021 · 1 comment

Comments

JohannaOm commented Mar 31, 2021

Hello,

I have a question about the MASK tokens in the input examples (Figure 3 of the paper). What role do they play in training and prediction? Unfortunately, I could not find an explanation in the paper. In particular, what is the role of the mask tokens between the subject tokens and the relation tokens (ConceptNet) when only the object tokens need to be predicted?
Thank you very much in advance.

atcbosselut (Owner) commented

They let you start the generation at the same position for each example in a batch, making it easier to do text generation in batch mode. In the repo, though, generation is done example-by-example, so they wouldn't be necessary.
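For illustration, here is a minimal sketch of how such mask padding can align the generation start position across a batch. The token strings, relation name, and fixed length below are hypothetical, not the repo's actual constants:

```python
# Minimal sketch (hypothetical names; not the repo's actual code):
# pad each subject with [MASK] tokens to a fixed length so that the
# relation token, and therefore the first generated object token,
# sits at the same index in every example of the batch.

MAX_SUBJECT_LEN = 10  # assumed fixed subject length


def build_input(subject_tokens, relation_tokens, mask_token="[MASK]"):
    pad = [mask_token] * (MAX_SUBJECT_LEN - len(subject_tokens))
    return subject_tokens + pad + relation_tokens


batch = [
    build_input(["go", "to", "the", "store"], ["<HasPrerequisite>"]),
    build_input(["eat"], ["<HasPrerequisite>"]),
]

# Both sequences now have length MAX_SUBJECT_LEN + 1, so object-token
# generation starts at the same position for every example.
for seq in batch:
    print(len(seq), seq)
```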

The method would work without them, but if you train a model with them, you can't remove them at evaluation/generation time.
