
addition of attention based techniques to pytorch #32529

Open
vainaixr opened this issue Jan 23, 2020 · 3 comments
Assignees
Labels
function request: A request for a new function or the addition of new arguments/modes to an existing function.
module: convolution: Problems related to convolutions (THNN, THCUNN, CuDNN)
oncall: transformer/mha: Issues related to Transformers and MultiheadAttention
triaged: This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

@vainaixr
Contributor

vainaixr commented Jan 23, 2020

🚀 Feature

Stand-alone self-attention and attention-augmented convolutional networks outperform standard convolutional networks in image classification experiments; I think these two should be added to PyTorch.

Motivation

https://arxiv.org/abs/1904.09925
https://arxiv.org/abs/1906.05909

Pitch

something like,
nn.AugmentedConv()
nn.StandAloneAttention()
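Neither module name above is a real PyTorch API; as a rough illustration of what an `nn.AugmentedConv` might look like, here is a minimal sketch following the idea in arXiv:1904.09925: run a standard convolution and a multi-head self-attention over spatial positions in parallel, then concatenate the two outputs along the channel dimension. All names, shapes, and hyperparameters here are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class AugmentedConv(nn.Module):
    """Hypothetical sketch of attention-augmented convolution:
    conv features and self-attention features concatenated on channels."""
    def __init__(self, in_ch, conv_ch, attn_ch, num_heads=4, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, conv_ch, kernel_size,
                              padding=kernel_size // 2)
        self.to_attn = nn.Conv2d(in_ch, attn_ch, 1)  # 1x1 projection for attention
        self.attn = nn.MultiheadAttention(attn_ch, num_heads, batch_first=True)

    def forward(self, x):
        conv_out = self.conv(x)                          # (B, conv_ch, H, W)
        b, _, h, w = x.shape
        a = self.to_attn(x).flatten(2).transpose(1, 2)   # (B, H*W, attn_ch)
        a, _ = self.attn(a, a, a)                        # global self-attention
        attn_out = a.transpose(1, 2).reshape(b, -1, h, w)
        return torch.cat([conv_out, attn_out], dim=1)    # (B, conv_ch+attn_ch, H, W)

x = torch.randn(2, 8, 16, 16)
y = AugmentedConv(8, 24, 8)(x)
print(y.shape)  # torch.Size([2, 32, 16, 16])
```

Note this sketch omits details of the paper, such as relative position embeddings; it only demonstrates the conv-plus-attention concatenation structure.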

cc @zhangguanheng66

@vainaixr vainaixr changed the title addition of attention based techniques to pytorch addition of attention based techniques to pytorch Jan 23, 2020
@albanD albanD added module: operators, needs research (We need to decide whether or not this merits inclusion, based on research world), triage review, and triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) labels Jan 23, 2020
@ezyang ezyang added module: convolution Problems related to convolutions (THNN, THCUNN, CuDNN) and removed triage review labels Jan 27, 2020
@ezyang
Contributor

ezyang commented Jan 27, 2020

cc @zhangguanheng66

@zhangguanheng66
Contributor

We will include a self-attention module in pytorch soon.
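In the meantime, PyTorch already ships `nn.MultiheadAttention` as a self-attention building block. Applying it over flattened spatial positions gives a crude global approximation of stand-alone self-attention (the paper uses local windows, which this simplification does not reproduce):

```python
import torch
import torch.nn as nn

# Treat each spatial position of a feature map as a token and
# run global multi-head self-attention over all positions.
mha = nn.MultiheadAttention(embed_dim=16, num_heads=4, batch_first=True)
feat = torch.randn(2, 16, 8, 8)            # (B, C, H, W) feature map
seq = feat.flatten(2).transpose(1, 2)      # (B, H*W, C) token sequence
out, weights = mha(seq, seq, seq)          # self-attention: query=key=value
out = out.transpose(1, 2).reshape_as(feat) # back to (B, C, H, W)
print(out.shape, weights.shape)  # torch.Size([2, 16, 8, 8]) torch.Size([2, 64, 64])
```

`weights` holds the (head-averaged) attention map over all 64 positions, which can be useful for inspecting what the layer attends to.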

@gchanan gchanan removed the needs research We need to decide whether or not this merits inclusion, based on research world label Feb 6, 2020
@zhangguanheng66
Copy link
Contributor

Add to the issue request post #32590

@cpuhrsch cpuhrsch added the oncall: transformer/mha Issues related to Transformers and MultiheadAttention label Mar 10, 2020
@mruberry mruberry added function request A request for a new function or the addition of new arguments/modes to an existing function. and removed module: operators (deprecated) labels Oct 10, 2020
7 participants