
[M2M100] fix positional embeddings #10590

Merged: 6 commits into huggingface:master from fix-m2m100 on Mar 8, 2021

Conversation

patil-suraj (Contributor)

What does this PR do?

The TorchScript tests for M2M100 are failing on master because the weights in M2M100SinusoidalPositionalEmbedding initially do not live on the same device as the rest of the parameters.

This PR registers the weights as an nn.Parameter so they end up on the same device as the rest of the model.
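
A minimal sketch of the idea (simplified from the actual M2M100SinusoidalPositionalEmbedding module; an even embedding_dim and no resizing logic are assumed):

import math
import torch
import torch.nn as nn

class SinusoidalPositionalEmbedding(nn.Module):
    def __init__(self, num_positions, embedding_dim, padding_idx=None):
        super().__init__()
        # Registering the table as an nn.Parameter (with requires_grad=False)
        # instead of a plain tensor attribute means model.to(device) moves it
        # together with every other weight.
        self.weights = nn.Parameter(
            self.get_embedding(num_positions, embedding_dim, padding_idx),
            requires_grad=False,
        )

    @staticmethod
    def get_embedding(num_embeddings, embedding_dim, padding_idx=None):
        # Standard sinusoidal table: half sine features, half cosine features.
        half_dim = embedding_dim // 2
        freq = torch.exp(
            torch.arange(half_dim, dtype=torch.float) * -(math.log(10000) / (half_dim - 1))
        )
        angles = torch.arange(num_embeddings, dtype=torch.float).unsqueeze(1) * freq.unsqueeze(0)
        emb = torch.cat([torch.sin(angles), torch.cos(angles)], dim=1)
        if padding_idx is not None:
            emb[padding_idx, :] = 0  # the pad position gets a zero vector
        return emb

    def forward(self, position_ids):
        # Pure lookup; detach so the fixed table never enters autograd.
        return self.weights[position_ids].detach()

Since the table is now a parameter, it also shows up in state_dict(), which is why the PR pairs this change with _keys_to_ignore_on_save below.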

patil-suraj changed the title from "Fix m2m100" to "[M2M100] fix positional embeddings" on Mar 8, 2021
Comment on lines +1163 to +1165
_keys_to_ignore_on_save = [
r"model.encoder.embed_positions.weights",
r"model.decoder.embed_positions.weights",
patil-suraj (Contributor, Author)


Since M2M100 uses sinusoidal positional embeddings, we don't need to save the position embedding weights: the table is deterministic and can be rebuilt at load time.
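
For illustration only, a rough sketch of the effect of this setting (save_without_ignored_keys is a hypothetical helper, not the actual transformers save_pretrained code path): keys matching these patterns are filtered out of the checkpoint, and the missing keys are harmless at load time because the table is regenerated when the model is constructed.

import re
import torch

keys_to_ignore_on_save = [
    r"model.encoder.embed_positions.weights",
    r"model.decoder.embed_positions.weights",
]

def save_without_ignored_keys(model, path):
    # Drop any state-dict entry whose key matches one of the ignore patterns,
    # then save the rest as the checkpoint.
    state_dict = {
        k: v
        for k, v in model.state_dict().items()
        if not any(re.search(p, k) for p in keys_to_ignore_on_save)
    }
    torch.save(state_dict, path)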

decoder_input_ids = ids_tensor([self.batch_size, self.seq_length], self.vocab_size)

# we need to clamp the input ids here to avoid having a pad token in between
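
As background for the comment above, a standalone sketch of what the clamp guards against (the pad_token_id value and shapes are assumptions for illustration): position ids in this model family are derived from the padding mask, counting non-pad tokens cumulatively, so a pad token landing mid-sequence would shift the positions of every later token.

import torch

pad_token_id = 1  # assumed value, for illustration only

# Uniformly sampled ids can place pad_token_id anywhere, including the
# middle of a sequence, which corrupts padding-mask-derived position ids.
ids = torch.randint(0, 100, (2, 8))

# Clamping every id to at least pad_token_id + 1 guarantees the tensor
# contains no pad tokens at all.
ids = ids.clamp(min=pad_token_id + 1)
assert (ids != pad_token_id).all()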
Contributor


Great! Thanks for the very detailed explanation!

patil-suraj merged commit 2a737bf into huggingface:master on Mar 8, 2021
patil-suraj deleted the fix-m2m100 branch on March 8, 2021 at 10:37
Iwontbecreative pushed a commit to Iwontbecreative/transformers that referenced this pull request on Jul 15, 2021:
* fix tests

* emb should be a parameter

* fix positional embeddings

* fix make_weights

* don't save pos embeds

* add comment to describe the clamping