Pull requests: Dao-AILab/flash-attention
- [Draft] support qk head_dim different from vo head_dim (#980, opened Jun 6, 2024 by defei-coder)
- Add local version identifier to package metadata for pre-built wheels (#856, opened Feb 28, 2024 by yundai424)
- Animations for Flash Attention, Flash Attention2, and Standard Attention (#736, opened Dec 24, 2023 by LuisAVasquez)
- [Draft] feat(attention): add Bi-Directional MLM attention model (#721, opened Dec 12, 2023 by TamirFriedman-RecoLabs)
- [fix bug] Llama-2-70B crashed when prompt_len < ngroups (#708, opened Dec 7, 2023 by li2haipeng)
- Support returning attention weights in naive attention modules (#589, opened Oct 4, 2023 by kklemon)