LlamaAttention forward function type hint is incorrect on the new branch #38855
base: main
Conversation
Revert "…huggingface#38739" This reverts commit 4f0010a.
@Rocketknight1 same error again
Hey @ArkVex, there are many models that inherit from the Llama modules. So, running …
It's working fine for me. Do you not see any changes in your branch after running it? It might be that the changes were already applied, so the next run produces no diff. Otherwise, install …
Hi @ArkVex, the problem is that …
If it helps, the reset command(s) may look something like:
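```bash
# Sketch only: the remote and branch names below are assumptions, adjust to your fork.
git fetch upstream                  # upstream = the huggingface/transformers remote
git reset --hard upstream/main      # discard local commits and match upstream main
git push --force-with-lease origin my-fix-branch   # update the PR branch on your fork
```
(Double-check the remote names with `git remote -v` before force-pushing.)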
(Cross-reference to link this to the previous PR: #38795)
Hi, this PR fixes a small issue in the `LlamaAttention` class. The return type hint on the `forward` method currently lists three values, but the function actually returns only two. This seems to have been missed during the attention refactor (possibly in PR #35235).
I've updated the type hint to reflect the actual return values, just to avoid confusion for anyone reading or using the code. Let me know if any other changes are needed. Happy to help!
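For context, a minimal sketch of the fix, assuming the post-refactor shape of `modeling_llama.py`: the argument list is trimmed and the body is a placeholder, so exact names and annotations may differ by transformers version.

```python
from typing import Optional, Tuple

import torch
from torch import nn


class LlamaAttention(nn.Module):
    # Trimmed, hypothetical signature for illustration; the real method takes
    # more arguments (position embeddings, attention mask, cache, kwargs, ...).
    def forward(
        self,
        hidden_states: torch.Tensor,
        attention_mask: Optional[torch.Tensor] = None,
    ) -> Tuple[torch.Tensor, Optional[torch.Tensor]]:
        # The old hint advertised a third element (the legacy past key/value
        # tuple), but since the attention refactor only two values come back.
        attn_output = hidden_states  # placeholder for the actual attention computation
        attn_weights: Optional[torch.Tensor] = None
        return attn_output, attn_weights
```

The only behavioral claim here is the one from the PR description: two returned values, so the annotation drops the third element.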