
Question related to _yarn_linear_ramp_mask #60

Open · chizhang118 opened this issue Apr 9, 2024 · 1 comment

chizhang118 commented Apr 9, 2024

I have a question about the _yarn_linear_ramp_mask implementation, specifically the line linear_func = (torch.arange(dim, dtype=torch.float32) - min) / (max - min). Here the calculation runs over dimension indices rather than the number of rotations. But in the paper's definition of the ramp function, r, alpha, and beta all seem to refer to rotation counts rather than dimension indices, since r(d) = L/lambda_d is the number of rotations, and it is r that gets compared against alpha and beta.

So is the implementation consistent with what the paper states?

Could anyone help me understand this part?
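
For context, here is the full helper the quoted line comes from, reconstructed as a minimal runnable sketch (the guard against min == max follows common YaRN implementations; treat the exact details as an assumption rather than a verbatim copy of this repo):

```python
import torch

def _yarn_linear_ramp_mask(min, max, dim):
    # min and max are *dimension indices* bounding the ramp (they shadow the
    # Python builtins, matching the names in the quoted snippet)
    if min == max:
        max += 0.001  # avoid division by zero when the range collapses
    linear_func = (torch.arange(dim, dtype=torch.float32) - min) / (max - min)
    # 0 for dims below min, 1 for dims above max, linear in between
    return torch.clamp(linear_func, 0, 1)
```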

@disperaller

It is just a mask, not the actual r values.
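
To expand on that: the paper's alpha and beta are indeed thresholds on rotation counts, but they are converted into dimension indices before the ramp mask is built, by inverting r(d) = L/lambda_d with lambda_d = 2*pi*base^(2d/|D|). A hedged sketch of that conversion (helper names here are illustrative, not necessarily this repo's):

```python
import math

def find_correction_dim(num_rotations, dim, base=10000.0, max_pos=2048):
    # Solve r(d) = max_pos / (2*pi*base**(2d/dim)) = num_rotations for d
    return (dim * math.log(max_pos / (num_rotations * 2 * math.pi))) / (2 * math.log(base))

def find_correction_range(alpha, beta, dim, base=10000.0, max_pos=2048):
    # r(d) decreases as d grows, so beta (more rotations) maps to a *lower*
    # dimension index than alpha (fewer rotations)
    low = math.floor(find_correction_dim(beta, dim, base, max_pos))
    high = math.ceil(find_correction_dim(alpha, dim, base, max_pos))
    return max(low, 0), min(high, dim - 1)  # clamp to valid indices
```

The resulting low/high pair is what gets passed to _yarn_linear_ramp_mask as min and max, so torch.arange(dim) there compares dimension indices against dimension indices; the rotation-count comparison from the paper happens upstream in this conversion, and the two formulations are equivalent.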
