Is attention normalization method right? #23

Closed
candlewill opened this issue Jan 9, 2018 · 2 comments

candlewill commented Jan 9, 2018

@r9y9 To my understanding, x = x * (s * math.sqrt(1.0 / s)) is equivalent to x = x * math.sqrt(s). Is this right?

If it is right, why do we need to multiply x by math.sqrt(s) instead of dividing it by math.sqrt(s)?

x = x * (s * math.sqrt(1.0 / s))
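For reference, a minimal numerical check of the claimed equivalence (the value of s here is made up; in the code s appears to be the source length the attention averages over):

```python
import math

s = 128  # hypothetical source length for illustration

scaled_a = s * math.sqrt(1.0 / s)  # the form used in the code
scaled_b = math.sqrt(s)            # algebraically identical: s * s**-0.5 == s**0.5

print(scaled_a, scaled_b)          # both print ~11.3137
assert math.isclose(scaled_a, scaled_b)
```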

r9y9 (Owner) commented Jan 9, 2018

I think you are right. Note that this is because the code was adapted from fairseq: https://github.com/facebookresearch/fairseq-py/blob/9430544a3b95dbb2e3b86c303776e37f18188ad3/fairseq/models/fconv.py#L139

They use the notation m * sqrt(1/m) in their paper (see sec. 3.4 of https://arxiv.org/pdf/1705.03122.pdf), and I guess that is why the code is written as it is.
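The rationale given in sec. 3.4 of that paper is to counteract the variance change from summing over m attended vectors, assuming roughly uniform attention scores, so the factor is deliberately sqrt(m) rather than 1/sqrt(m). As a sketch only (not the repository's actual code; the function and variable names here are made up), the scaling applied to an attention context would look like this:

```python
import math
import torch

def scale_attention_context(context, src_len):
    # context: (batch, tgt_len, dim), a weighted sum over src_len encoder
    # states whose attention weights sum to 1, i.e. effectively an average.
    # Multiplying by src_len * sqrt(1 / src_len) == sqrt(src_len) counteracts
    # the variance reduction caused by that averaging, under the paper's
    # assumption of roughly uniform attention weights.
    return context * (src_len * math.sqrt(1.0 / src_len))

# toy usage with made-up shapes
values = torch.randn(2, 49, 16)                       # 49 encoder states per batch item
weights = torch.softmax(torch.randn(2, 5, 49), dim=-1)
context = torch.bmm(weights, values)                  # weighted average: (2, 5, 16)
context = scale_attention_context(context, src_len=values.size(1))
```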

stale bot commented May 30, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the wontfix label May 30, 2019