Uncertainty in FlashAttention2 backward #414

Closed
ezioliao opened this issue Aug 2, 2023 · 2 comments

ezioliao commented Aug 2, 2023

When I ran the unit test test_flash_attn.py for flash_attn_varlen_func, I noticed that the dq values produced by the backward pass differ slightly between runs (with a fixed random seed and the same input), while dk and dv are exactly the same. The forward output for the same input is always consistent. Does this mean that in FA2 the forward pass is deterministic but the backward pass is not? What could be the reason for this non-determinism?
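
Here is a minimal sketch of the kind of check I mean (not the actual unit test; I'm assuming the flash-attn 2.x varlen interface with q/k/v packed as (total_tokens, nheads, headdim) plus cu_seqlens / max_seqlen arguments, so the exact shapes and arguments may differ):

```python
# Hypothetical repro sketch (not the actual unit test): run the same
# forward/backward twice with identical inputs and compare the gradients.
import torch
from flash_attn import flash_attn_varlen_func

torch.manual_seed(0)
device, dtype = "cuda", torch.float16
nheads, headdim = 8, 64
seqlens = [128, 256]
cu_seqlens = torch.tensor([0, 128, 384], dtype=torch.int32, device=device)
total, max_seqlen = sum(seqlens), max(seqlens)

q = torch.randn(total, nheads, headdim, device=device, dtype=dtype, requires_grad=True)
k = torch.randn(total, nheads, headdim, device=device, dtype=dtype, requires_grad=True)
v = torch.randn(total, nheads, headdim, device=device, dtype=dtype, requires_grad=True)
g = torch.randn(total, nheads, headdim, device=device, dtype=dtype)

def run_once():
    # Reuse the exact same tensors, so only kernel scheduling can differ.
    out = flash_attn_varlen_func(q, k, v, cu_seqlens, cu_seqlens,
                                 max_seqlen, max_seqlen, causal=True)
    dq, dk, dv = torch.autograd.grad(out, (q, k, v), g)
    return out, dq, dk, dv

out1, dq1, dk1, dv1 = run_once()
out2, dq2, dk2, dv2 = run_once()
print("out identical:", torch.equal(out1, out2))
print("dq  identical:", torch.equal(dq1, dq2))
print("dk  identical:", torch.equal(dk1, dk2))
print("dv  identical:", torch.equal(dv1, dv2))
```

With a check like this, out, dk, and dv match bit for bit across runs, but dq differs slightly.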


tridao commented Aug 3, 2023

Yes, the backward pass is not deterministic because we use atomic adds.
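
To illustrate the underlying effect (this is a generic illustration, not the FA2 kernel itself): floating-point addition is not associative, and atomicAdd leaves the accumulation order up to the hardware scheduler, so the low-order bits of an atomically accumulated result can differ from run to run. That lines up with what you saw: dq goes through that accumulation path, while dk and dv match exactly.

```python
# Generic illustration of why atomic accumulation is non-deterministic,
# not the FA2 kernel itself.
import torch

# 1) Floating-point addition is not associative, so the result depends
#    on the order in which partial sums are combined.
a, b, c = 1e-8, 1.0, -1.0
print((a + b) + c)  # close to, but not exactly, 1e-08
print(a + (b + c))  # exactly 1e-08

# 2) On CUDA, ops that accumulate with atomicAdd (e.g. index_add_ with
#    repeated indices) can return slightly different low bits per call,
#    because the hardware decides the accumulation order.
if torch.cuda.is_available():
    idx = torch.zeros(1_000_000, dtype=torch.long, device="cuda")
    src = torch.randn(1_000_000, device="cuda")
    out1 = torch.zeros(1, device="cuda").index_add_(0, idx, src)
    out2 = torch.zeros(1, device="cuda").index_add_(0, idx, src)
    print(out1.item(), out2.item())  # may differ in the last few digits
```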


ezioliao commented Aug 4, 2023

Thanks for your answer! :)

ezioliao closed this as completed Aug 4, 2023