
assert Flash Attention doesn't get arbitrary mask #53

Merged

Conversation

@mayank31398 commented May 22, 2023

Since FlashAttention only works with no mask or a causal mask, it's better to throw an error here than to silently ignore an arbitrary mask.
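
A minimal sketch of the kind of guard this PR adds (the class name, argument names, and the `scaled_dot_product_attention` stand-in for the fused kernel are illustrative assumptions, not the actual patch):

```python
import torch


class FlashSelfAttention(torch.nn.Module):
    """Thin wrapper around a FlashAttention-style fused kernel (sketch)."""

    def __init__(self, causal: bool = True):
        super().__init__()
        self.causal = causal

    def forward(self, q, k, v, attention_mask=None):
        # FlashAttention's fused kernel supports exactly two masking modes:
        # no mask, or its built-in causal mask. An arbitrary user-supplied
        # mask cannot be applied inside the kernel, so fail fast rather
        # than silently attending over positions the mask was meant to hide.
        assert attention_mask is None, (
            "FlashAttention only supports no mask or a causal mask; "
            "it cannot honor an arbitrary attention mask."
        )
        # Stand-in for the real flash-attn call; q/k/v are
        # (batch, heads, seq_len, head_dim).
        return torch.nn.functional.scaled_dot_product_attention(
            q, k, v, is_causal=self.causal
        )
```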

@janEbert

You also mentioned `--reset-position-ids` being a problem; does this also need to be handled?

@mayank31398 (Author)

I don't think that needs to be handled.
FlashAttention should work with any position ids.
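
For context, a sketch of why resetting position ids is orthogonal to the mask restriction: the ids only feed the position-embedding lookup (learned absolute embeddings are assumed here purely for illustration), and the attention kernel never sees them.

```python
import torch

vocab_size, max_positions, dim = 100, 512, 64
tok_emb = torch.nn.Embedding(vocab_size, dim)
pos_emb = torch.nn.Embedding(max_positions, dim)

input_ids = torch.tensor([[5, 7, 9, 5, 7]])
# --reset-position-ids restarts the count at each document boundary.
position_ids = torch.tensor([[0, 1, 2, 0, 1]])

# Position ids only shape the embeddings that feed attention;
# FlashAttention itself consumes q/k/v and has no notion of position ids.
hidden = tok_emb(input_ids) + pos_emb(position_ids)
```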

@RaymondLi0 merged commit beaf2f2 into bigcode-project:multi-query-attention on May 23, 2023
@mayank31398 deleted the error-reset branch on May 24, 2023 07:35