
does flash attention support attention mask? #409

Closed
tiandiao123 opened this issue Aug 1, 2023 · 2 comments
@tiandiao123 commented Aug 1, 2023
Hello friends,
I am wondering whether the current flash attention version supports an attention mask. I didn't see such an argument in the fwd function, so I am curious. Thank you!
Best

@tridao (Contributor) commented Aug 1, 2023

No, it doesn't support an attention mask. We might implement that in the future when we have bandwidth, but right now we have other priorities (e.g. H100 optimization).
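For context, the mask being asked about can be illustrated with a plain (non-flash) reference implementation of scaled dot-product attention. This is a sketch only, not flash-attn code; `masked_attention` is a hypothetical helper, and the example expresses a causal mask (the one pattern flash-attn does handle natively via its `causal` flag) as an explicit additive mask:

```python
import numpy as np

def masked_attention(q, k, v, mask=None):
    """Reference scaled dot-product attention with an optional additive mask.
    Shapes: q, k, v are (seq_len, head_dim); mask is (seq_len, seq_len)
    with 0 for visible positions and -inf for masked positions."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    if mask is not None:
        scores = scores + mask  # masked positions become -inf before softmax
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Causal mask: position i may only attend to positions <= i.
seq, dim = 4, 8
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((seq, dim)) for _ in range(3))
causal_mask = np.triu(np.full((seq, seq), -np.inf), k=1)
out = masked_attention(q, k, v, mask=causal_mask)
print(out.shape)  # (4, 8)
```

With the causal mask, the first query position can only attend to itself, so its output row equals `v[0]` exactly; an arbitrary boolean or additive mask would be handled the same way, which is the feature the issue asks about.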

@tiandiao123 (Author) commented Aug 1, 2023

> No it doesn't support attention mask. We might implement that in the future when we have bandwidth, but right now we have some other priorities (e.g. H100 optimization).

Thank you!
