
multi_head_attention_torch.py #24

Closed
WUHE-art opened this issue Oct 11, 2021 · 0 comments

Comments


WUHE-art commented Oct 11, 2021

Hello, author. In the multi-head attention code, what is the purpose of setting self.coef = 4? The input and output of self.trans_dims = nn.Linear(dim, dim * self.coef) have different dimensions, whereas in the original self-attention, Q has the same dimension before and after its linear transformation. Why is that?
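
For context, the lines quoted above come from a multi-head attention variant that widens the channel dimension before splitting heads. The sketch below illustrates that pattern, assuming a standard multi-head external-attention layout; apart from the coef and trans_dims names quoted in the question, every name and hyperparameter here is an assumption for illustration, not the repository's exact code.

```python
import torch
import torch.nn as nn


class MultiHeadExternalAttention(nn.Module):
    """Sketch: attention block that expands channels by `coef` before
    splitting into heads (assumed layout, not the repo's exact code)."""

    def __init__(self, dim, num_heads=8, k=64):
        super().__init__()
        self.coef = 4  # the channel-expansion factor asked about in the issue
        # Unlike the Q projection in vanilla self-attention (dim -> dim),
        # this projection widens the features: dim -> dim * coef.
        self.trans_dims = nn.Linear(dim, dim * self.coef)
        # The number of heads grows by the same factor, so the per-head
        # dimension stays dim // num_heads.
        self.num_heads = num_heads * self.coef
        head_dim = dim * self.coef // self.num_heads
        # External attention: two linear layers act as shared, learnable
        # memories in place of data-dependent K and V.
        self.linear_0 = nn.Linear(head_dim, k)
        self.linear_1 = nn.Linear(k, head_dim)
        self.proj = nn.Linear(dim * self.coef, dim)  # map back to `dim`

    def forward(self, x):
        B, N, C = x.shape
        x = self.trans_dims(x)                        # (B, N, C*coef)
        x = x.view(B, N, self.num_heads, -1).permute(0, 2, 1, 3)
        attn = self.linear_0(x)                       # (B, H, N, k)
        attn = attn.softmax(dim=-2)                   # normalize over tokens
        attn = attn / (1e-9 + attn.sum(dim=-1, keepdim=True))  # double norm
        x = self.linear_1(attn)                       # (B, H, N, head_dim)
        x = x.permute(0, 2, 1, 3).reshape(B, N, -1)   # (B, N, C*coef)
        return self.proj(x)                           # (B, N, C)


# Shape check: the expansion is purely internal.
x = torch.randn(2, 196, 64)
print(MultiHeadExternalAttention(dim=64)(x).shape)  # torch.Size([2, 196, 64])
```

In this sketch the widening is internal to the block: trans_dims expands dim to dim * coef (with correspondingly more heads), and the final proj maps back down to dim, so the block's input and output dimensions still match, as in vanilla self-attention.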

WUHE-art changed the title from multi to multi_head_attention_torch.py on Oct 11, 2021
WUHE-art closed this as completed on Dec 1, 2021