Feature request

The newest version of Flash Attention is now available:

press release: https://crfm.stanford.edu/2023/07/17/flash2.html
source code: https://github.com/Dao-AILab/flash-attention

Motivation

This should be a drop-in upgrade for the current flash attention modules (see the API sketch below).

Your contribution

Writing code.
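For reference, the upstream repo documents a functional interface that an upgraded module could call directly. A minimal sketch, with tensor shapes and the `flash_attn_func` signature taken from the Dao-AILab/flash-attention README (assumes `flash-attn` 2.x is installed and a CUDA device is available):

```python
# Sketch of the FlashAttention-2 functional API, per the
# Dao-AILab/flash-attention README (pip install flash-attn).
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 16, 64

# q, k, v must be fp16/bf16 CUDA tensors of shape
# (batch, seqlen, nheads, headdim).
q = torch.randn(batch, seqlen, nheads, headdim,
                dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Drop-in attention call; causal=True applies the usual
# autoregressive mask.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
print(out.shape)  # (batch, seqlen, nheads, headdim)
```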
@Jmkernes There's already a PR for this: #624