Issues: sustcsonglin/flash-linear-attention
Quick question: Is there a non-causal optimized form of Flash Linear Attention?
Label: todo (to be implemented)
#31, opened Jul 8, 2024 by yzeng58
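The question in this issue can be illustrated with a short sketch. In the non-causal (bidirectional) setting, linear attention needs no per-step recurrence at all: the key-value summary phi(K)^T V is a single (d, d_v) matrix shared by every query, so the whole output is two matrix products. Below is a minimal numpy sketch under that standard formulation; it is not the FLA kernel, and the function name and the ReLU-based feature map are illustrative assumptions.

```python
import numpy as np

def linear_attention_noncausal(q, k, v, eps=1e-6):
    """Non-causal linear attention in O(L * d^2).

    Sketch only (hypothetical helper, not the FLA kernel): with a
    positive feature map phi,
        O = phi(Q) (phi(K)^T V) / (phi(Q) phi(K)^T 1).
    Without a causal mask, phi(K)^T V is one (d, d_v) summary shared
    by every query, so no chunked recurrence is required.
    """
    phi = lambda x: np.maximum(x, 0.0) + eps  # ReLU + eps: an assumed positive feature map
    q, k = phi(q), phi(k)
    kv = k.T @ v                    # (d, d_v) global key-value summary
    z = k.sum(axis=0)               # (d,) normalizer terms
    return (q @ kv) / (q @ z)[:, None]

# Toy inputs; by pure reassociation this matches the explicit
# quadratic form (phi(Q) phi(K)^T) V with row normalization.
rng = np.random.default_rng(0)
L, d = 6, 4
q, k, v = (rng.standard_normal((L, d)) for _ in range(3))
out = linear_attention_noncausal(q, k, v)
```

Because no mask breaks the associativity, the non-causal case is strictly simpler than the causal one: there is nothing to scan or chunk over.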
Current FLA RWKV6 implementation has significant precision issues in pure bf16 mode
#29, opened Jul 5, 2024 by howard-hou
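One classic source of the kind of bf16 drift reported in this issue is accumulating a long-running state in bf16 itself: once the accumulator grows, small per-step updates fall below half an ulp and are rounded away entirely. The sketch below emulates bf16 rounding in numpy (the `to_bf16` helper is a hypothetical illustration, not FLA code) and contrasts a pure-bf16 accumulator with an fp32 accumulator over the same bf16 inputs.

```python
import numpy as np

def to_bf16(x):
    """Round float32 values to bfloat16 precision (round-to-nearest-even),
    keeping the result stored as float32. Hypothetical helper for
    illustration; real kernels use hardware bf16."""
    bits = np.asarray(x, dtype=np.float32).view(np.uint32)
    # bf16 keeps the top 16 bits of the float32 pattern; add the
    # rounding increment before truncating.
    bits = (bits + np.uint32(0x7FFF) + ((bits >> np.uint32(16)) & np.uint32(1))) & np.uint32(0xFFFF0000)
    return bits.view(np.float32)

# 2048 small updates of ~0.01, quantized to bf16 as they would be in memory.
vals = to_bf16(np.full(2048, 0.01, dtype=np.float32))

# Pure-bf16 accumulation: the running sum is rounded back to bf16 after
# every add, so once it grows past a few units each 0.01 update is below
# half an ulp and gets rounded away.
s_bf16 = np.float32(0.0)
for x in vals:
    s_bf16 = to_bf16(s_bf16 + x)[()]

# Same bf16 inputs, but accumulated in fp32: stays close to the true sum.
s_fp32 = vals.astype(np.float32).sum()

true = 2048 * 0.01  # 20.48; the bf16 accumulator stalls far below this
```

The usual mitigation, keeping inputs in bf16 but carrying the recurrent state and accumulators in fp32, is exactly the kind of fix this style of precision report tends to lead to.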