[Bert2Bert] allow bert2bert + relative embeddings #14324

Conversation

@patrickvonplaten (Contributor) commented on Nov 8, 2021

What does this PR do?

Fixes #14010
Everything is explained in #14010. IMO, Bert2Bert-like models should not (and also cannot really) make use of positional bias in the cross-attention layers. This PR forces the cross-attention layers to always use "absolute" position encodings.
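For context, here is a simplified sketch (paraphrased, not the exact modeling_bert.py code) of the "relative_key" score term, showing why it is ill-defined once queries and keys come from different sequences: the distance table is built from the query length alone, so the resulting bias has shape (q_len, q_len) and cannot be added to the (q_len, k_len) cross-attention scores when the encoder length differs.

import torch

# Simplified sketch of the "relative_key" bias (names paraphrased; not the exact library code).
def relative_key_scores(query_layer, distance_embedding, max_position_embeddings):
    # query_layer: (batch, heads, q_len, head_dim) -- decoder states in cross-attention.
    q_len = query_layer.size(2)
    position_ids_l = torch.arange(q_len, device=query_layer.device).view(-1, 1)
    position_ids_r = torch.arange(q_len, device=query_layer.device).view(1, -1)
    distance = position_ids_l - position_ids_r  # (q_len, q_len), built from query positions only
    positional_embedding = distance_embedding(distance + max_position_embeddings - 1)
    # Result has shape (batch, heads, q_len, q_len): fine for self-attention (q_len == k_len),
    # but it cannot be added to the (batch, heads, q_len, k_len) cross-attention scores
    # when the encoder sequence length k_len differs -- the error reported in #14010.
    return torch.einsum("bhld,lrd->bhlr", query_layer, positional_embedding)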

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@sgugger (Collaborator) left a comment

Thanks for fixing! IMO, we can make this better by adding an init argument to the attention layer with this flag.

Comment on lines 455 to 459
# cross attention cannot have relative position embeddings
cross_attention_config = copy.deepcopy(config)
cross_attention_config.position_embedding_type = "absolute"

self.crossattention = BertAttention(cross_attention_config)
@sgugger (Collaborator) commented on this snippet:

This is a bit hackish. If we start to turn this argument on and off, maybe it should be an init argument? It can default to None in which case we take the config value (so this is 100% backward compatible).
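A rough sketch of that suggestion (illustrative names, not the exact merged code): the attention module takes an optional position_embedding_type argument and falls back to the config value when it is None, so existing callers keep their behavior while the decoder layer can request "absolute" for its cross-attention only.

import torch.nn as nn

# Illustrative sketch of the proposed init argument (not the exact merged implementation).
class BertSelfAttentionSketch(nn.Module):
    def __init__(self, config, position_embedding_type=None):
        super().__init__()
        # None -> fall back to the config value, keeping full backward compatibility.
        self.position_embedding_type = position_embedding_type or getattr(
            config, "position_embedding_type", "absolute"
        )

# The decoder layer could then build its cross-attention as, e.g.:
# self.crossattention = BertAttention(config, position_embedding_type="absolute")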

@patrickvonplaten (Author) replied:

Yeah, agree.

@qqaatw (Contributor) commented on Nov 8, 2021

Thanks @patrickvonplaten for fixing this! It looks correct that relative position embeddings should not be used on cross-attention layers.

@LysandreJik (Member) left a comment

Yes, LGTM!

@LysandreJik (Member) commented:

Merging now to prevent merge conflicts

@LysandreJik merged commit e81d8d7 into huggingface:master on Nov 9, 2021.

Successfully merging this pull request may close these issues.

Bert: relative_key position embedding causes error in encoder-decoder setup