
CUDA based Pytorch Flash Attention is straight up non-functional / non-existent on Windows in *ALL* PyTorch versions above 2.1.2, opening this issue just to remove the weird vagueness around this. #3363

Open
Akira13641 opened this issue Apr 27, 2024 · 5 comments

Comments

@Akira13641

It straight up doesn't work, period, because it's not there: for some reason they're no longer compiling PyTorch with it on Windows. As it stands currently, you WILL be indefinitely spammed with

UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:455.)

unless you manually uninstall the Torch that Comfy currently lists in its requirements.txt, and then run

pip install torch==2.1.2 torchvision==0.16.2 torchaudio==2.1.2 --index-url https://download.pytorch.org/whl/cu121

to get back the last one that worked as expected.

It's not at all clear to me why no one has yet pointed out that this isn't a mysterious or vague problem; it's a very obvious problem with a very clear sole cause.

@DollarAkshay

and then run pip install torch==2.1.2 torchvision==0.16.2 torchaudio==2.1.2 --index-url https://download.pytorch.org/whl/cu121 to get back the last one that worked as expected.

ERROR: Could not find a version that satisfies the requirement torch==2.1.2 (from versions: 2.2.0+cu121, 2.2.1+cu121, 2.2.2+cu121, 2.3.0+cu121)
ERROR: No matching distribution found for torch==2.1.2

@Akira13641
Author

and then run pip install torch==2.1.2 torchvision==0.16.2 torchaudio==2.1.2 --index-url https://download.pytorch.org/whl/cu121 to get back the last one that worked as expected.

ERROR: Could not find a version that satisfies the requirement torch==2.1.2 (from versions: 2.2.0+cu121, 2.2.1+cu121, 2.2.2+cu121, 2.3.0+cu121)
ERROR: No matching distribution found for torch==2.1.2

I dunno why it's not working for you. In any case, assuming Python 3.11, these would be the exact wheels:
https://download.pytorch.org/whl/cu121/torch-2.1.2%2Bcu121-cp311-cp311-win_amd64.whl
https://download.pytorch.org/whl/cu121/torchvision-0.16.2%2Bcu121-cp311-cp311-win_amd64.whl
https://download.pytorch.org/whl/cu121/torchaudio-2.1.2%2Bcu121-cp311-cp311-win_amd64.whl

Pip here should specifically be ComfyUI's embedded Pip, of course, not your global system one if it exists.
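The wheel filenames above encode the interpreter they install under (the cp311 tag), which is worth checking before running pip. A minimal sketch of that check, using only the standard library; the helper name is made up for illustration:

```python
import sys

def matches_wheel_tag(version_info=None):
    """Return True if the interpreter matches the cp311 tag carried by
    the torch 2.1.2 wheels linked above. (Hypothetical helper; the tag
    is read off the wheel filenames.)"""
    vi = sys.version_info if version_info is None else version_info
    return (vi[0], vi[1]) == (3, 11)
```

Under Python 3.12 this returns False, which would line up with the "No matching distribution found" error seen earlier in the thread.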

@DollarAkshay

DollarAkshay commented Apr 29, 2024

@Akira13641 I came here because I have the same problem, but I'm not using ComfyUI, I'm using the transformers library. I was trying to load a model in PyTorch.

I have Python 3.12, maybe that's the problem.

@Akira13641
Author

@Akira13641 I came here because I have the same problem, but I'm not using ComfyUI, I'm using the transformers library. I was trying to load a model in PyTorch.

I have Python 3.12, maybe that's the problem.

Yeah, I'm not sure all of this is available for Python 3.12 yet.

@LucisVivae

It might have something to do with Flash Attention 2 not yet officially supporting Windows. It can be compiled, though; for instance, see https://www.charlesherring.com/coding/compiling-pytorch-windows11
