
Script MultiheadAttention #1524

Closed · wants to merge 2 commits

Conversation

@cndn (Contributor) commented Dec 19, 2019

Summary:
Make fairseq MultiheadAttention scriptable. Looking for feedback.

  1. Add type annotations.
  2. Move incremental state management from utility functions into module initializers. TorchScript does not support global dicts in general, so each module containing multihead attention assigns itself a fairseq_instance_id in its initializer (see the sketch below).
  3. There may be opportunities to make the assertions and annotations cleaner.

Differential Revision: D18772594
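
For illustration, here is a minimal sketch of the per-instance state keying described in point 2. The class and helper names (`ScriptableIncrementalModule`, `_instance_id`, `_state_key`) are assumptions for this example, not the exact fairseq API:

```python
# Minimal sketch: key incremental state per instance instead of through a
# global dict, which TorchScript cannot compile. Names are illustrative.
import uuid
from typing import Dict, Optional

from torch import Tensor, nn


class ScriptableIncrementalModule(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        # __init__ runs eagerly (it is not scripted), so uuid is fine here;
        # scripted methods only read the resulting string attribute.
        self._instance_id = str(uuid.uuid4())

    def _state_key(self, key: str) -> str:
        # Plain string concatenation keeps this TorchScript-friendly.
        return self._instance_id + "." + key

    def get_incremental_state(
        self,
        incremental_state: Optional[Dict[str, Dict[str, Optional[Tensor]]]],
        key: str,
    ) -> Optional[Dict[str, Optional[Tensor]]]:
        # Typed Optionals (point 1) let TorchScript check the state handling.
        if incremental_state is None:
            return None
        full_key = self._state_key(key)
        if full_key not in incremental_state:
            return None
        return incremental_state[full_key]

    def set_incremental_state(
        self,
        incremental_state: Optional[Dict[str, Dict[str, Optional[Tensor]]]],
        key: str,
        value: Dict[str, Optional[Tensor]],
    ) -> None:
        if incremental_state is not None:
            incremental_state[self._state_key(key)] = value
```

A decoder layer built this way would thread `incremental_state` through `forward` as a typed argument rather than consulting a module-level registry.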

@facebook-github-bot (Contributor) commented:
This pull request was exported from Phabricator. Differential Revision: D18772594

cndn added a commit to cndn/translate that referenced this pull request Jan 10, 2020
Summary:
Pull Request resolved: facebookresearch/fairseq#1524

Differential Revision: D18772594

fbshipit-source-id: 81b830b16fbaa9c6fc34dee0672054f146060ea4

cndn added a commit to cndn/translate that referenced this pull request Jan 14, 2020
Summary:
Pull Request resolved: pytorch/translate#681

Pull Request resolved: facebookresearch/fairseq#1524

Differential Revision: D18772594

fbshipit-source-id: 4353d522d244b1508190d33ca5be6f2299e8442c

Differential Revision: D18799003

fbshipit-source-id: a7e088997e9d246f1216b1ce0e4deff43354c9a7
Summary:
Pull Request resolved: pytorch/translate#681

Pull Request resolved: facebookresearch/fairseq#1524

Differential Revision: D18772594

fbshipit-source-id: 8b8b87f0e74f4afb863b15fc4172482b640f6197

cndn added a commit to cndn/translate that referenced this pull request Jan 16, 2020
Summary:
Pull Request resolved: pytorch/translate#681

Pull Request resolved: facebookresearch/fairseq#1524

Differential Revision: D18772594

fbshipit-source-id: 5c21d7d84db1320201f486015bb91469006ffd95
facebook-github-bot pushed a commit that referenced this pull request Jan 22, 2020
Summary:
Pull Request resolved: fairinternal/fairseq-py#1002

Pull Request resolved: pytorch/translate#681

Pull Request resolved: #1524

Reviewed By: myleott

Differential Revision: D18772594

fbshipit-source-id: 377aef4bbb7ef51da5b6bac9a87a6f7b03b16fe1
moussaKam pushed a commit to moussaKam/language-adaptive-pretraining that referenced this pull request Sep 29, 2020
Summary:
Pull Request resolved: fairinternal/fairseq-py#1002

Pull Request resolved: pytorch/translate#681

Pull Request resolved: facebookresearch/fairseq#1524

Reviewed By: myleott

Differential Revision: D18772594

fbshipit-source-id: 377aef4bbb7ef51da5b6bac9a87a6f7b03b16fe1
yzpang pushed a commit to yzpang/gold-off-policy-text-gen-iclr21 that referenced this pull request Feb 19, 2021
Summary:
Pull Request resolved: fairinternal/fairseq-py#1002

Pull Request resolved: pytorch/translate#681

Pull Request resolved: facebookresearch/fairseq#1524

Reviewed By: myleott

Differential Revision: D18772594

fbshipit-source-id: 377aef4bbb7ef51da5b6bac9a87a6f7b03b16fe1