
Install torch nightly for flash attention #11

Merged: 4 commits merged into main from carmocca/flash-attn-torch-nigthly on May 5, 2023

Conversation

@carmocca (Contributor) commented on May 5, 2023

Fixes #2
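The PR installs a PyTorch nightly build so that flash attention is available. As a minimal sketch of how such an install is commonly done (the specific index URL and CUDA version shown here are assumptions for illustration, not taken from this PR's diff):

```shell
# Sketch: install a PyTorch nightly wheel from the official nightly index.
# The cu118 index URL is an assumption; choose the index matching your CUDA version.
pip install --pre torch --index-url https://download.pytorch.org/whl/nightly/cu118

# Nightly builds in this period exposed flash attention through
# torch.nn.functional.scaled_dot_product_attention; check that it is present:
python -c "import torch; print(hasattr(torch.nn.functional, 'scaled_dot_product_attention'))"
```

The `--pre` flag lets pip select pre-release (nightly) versions, which it skips by default.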

@carmocca carmocca force-pushed the carmocca/flash-attn-torch-nigthly branch from 8ceaf53 to 46819f6 on May 5, 2023 16:29
@carmocca carmocca force-pushed the carmocca/flash-attn-torch-nigthly branch from b62173c to b13151a on May 5, 2023 16:37
@carmocca carmocca marked this pull request as ready for review May 5, 2023 16:55
@carmocca carmocca force-pushed the carmocca/flash-attn-torch-nigthly branch from e481bcd to ff335b9 on May 5, 2023 17:01
@carmocca carmocca force-pushed the carmocca/flash-attn-torch-nigthly branch from ff335b9 to 0645cc5 on May 5, 2023 17:02
@carmocca carmocca force-pushed the carmocca/flash-attn-torch-nigthly branch from 0645cc5 to 631e66f on May 5, 2023 17:02
@carmocca carmocca mentioned this pull request May 5, 2023
@carmocca carmocca merged commit 8091f45 into main May 5, 2023
@carmocca carmocca deleted the carmocca/flash-attn-torch-nigthly branch May 5, 2023 17:26
aniketmaurya pushed a commit to aniketmaurya/install-lit-gpt that referenced this pull request Jul 5, 2023
@carmocca carmocca self-assigned this Nov 1, 2023
Successfully merging this pull request may close these issues: Flash attention support
1 participant