CUDA-based PyTorch Flash Attention is straight up non-functional / non-existent on Windows in *ALL* PyTorch versions above 2.1.2; opening this issue just to remove the weird vagueness around this. #3363
Comments
```
ERROR: Could not find a version that satisfies the requirement torch==2.1.2 (from versions: 2.2.0+cu121, 2.2.1+cu121, 2.2.2+cu121, 2.3.0+cu121)
```
I dunno why it's not working for you. In any case, assuming Python 3.11, these would be the exact wheels: (Pip here should be specifically the ComfyUI embedded Pip, not your global system one, if that exists.)
@Akira13641 I came here because I have the same problem, but I'm not using ComfyUI, I'm using the transformers library. I was trying to load a model in PyTorch. I have Python 3.12; maybe that's the problem.
Yeah, I'm not sure all of this is available for Python 3.12 yet.
It might have something to do with Flash Attention 2 not yet officially supporting Windows. It can be compiled, though; for instance, see https://www.charlesherring.com/coding/compiling-pytorch-windows11 (and a rough sketch of the build route below).
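For reference, a hedged sketch of that build-it-yourself route for the separate flash-attn package, as opposed to the kernel baked into PyTorch itself (the package name and flag are from the flash-attention project's install docs; MSVC, ninja, and a CUDA toolkit matching your torch build are assumed, and note this does not silence torch's own "not compiled with flash attention" warning, since that comes from torch's built-in SDPA kernels):

```
# assumes MSVC and a matching CUDA toolkit are on PATH; builds flash-attn from source
pip install ninja
pip install flash-attn --no-build-isolation
```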
It straight up doesn't work, period, because it's not there: for some reason they're no longer compiling PyTorch with it on Windows. As it stands, you WILL be indefinitely spammed with

```
UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:455.)
```

unless you manually uninstall the Torch that Comfy currently lists in its requirements.txt and then run

```
pip install torch==2.1.2 torchvision==0.16.2 torchaudio==2.1.2 --index-url https://download.pytorch.org/whl/cu121
```

to get back the last version that worked as expected. It's not at all clear to me why no one has yet pointed out that this isn't a mysterious or vague problem; it's a very obvious problem with a single, very clear cause.
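For anyone who wants to verify this on their own install, here's a minimal sketch (assuming torch >= 2.0 with CUDA and the 2.x `torch.backends.cuda.sdp_kernel` context manager; the shapes and dtype below are just illustrative) that forces the flash backend only, so it fails loudly instead of silently falling back to the math kernel:

```python
# check_flash_sdp.py - probe whether this torch build ships the CUDA
# flash attention kernel. Shapes/dtype here are illustrative assumptions.
import torch
import torch.nn.functional as F

assert torch.cuda.is_available(), "needs a CUDA build of torch plus a GPU"
print(torch.__version__, "CUDA", torch.version.cuda)
print("flash SDP enabled:", torch.backends.cuda.flash_sdp_enabled())

# fp16, (batch, heads, seq_len, head_dim) - a layout the flash kernel accepts
q = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)
try:
    # Disable the fallbacks: if flash attention isn't compiled in, this
    # raises a RuntimeError rather than printing the UserWarning and
    # quietly using the math implementation.
    with torch.backends.cuda.sdp_kernel(
        enable_flash=True, enable_math=False, enable_mem_efficient=False
    ):
        F.scaled_dot_product_attention(q, q, q)
    print("flash attention kernel ran")
except RuntimeError as err:
    print("flash attention unavailable:", err)
```

On the 2.1.2 cu121 wheels this should print "flash attention kernel ran" (on a GPU the kernel supports); on the affected newer Windows wheels it should hit the RuntimeError branch instead.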