Error when installing flash-attn #36
When I run `pip install flash-attn`, it raises an error:

```
ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects
```

However, I have run `pip install -e .` and successfully installed llava. Do you know how to solve this problem?
Comments
Hi @Richar-Du, thank you for your interest in our work.
Can you provide (1) the full error log, wrapped in ``` as well, and (2) your system environment, including the OS, CUDA version, and GPU type?
The following is my full error log:

```
(llava) E:\LLaVA>pip install flash-attn
× python setup.py bdist_wheel did not run successfully.
note: This error originates from a subprocess, and is likely not a problem with pip.
```
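For anyone hitting the same truncated output: this summary hides the real compiler error. A minimal sketch for surfacing it, assuming a reasonably recent pip (`-v` is pip's standard verbose flag, and `--no-build-isolation` is the option flash-attn's own install instructions recommend):

```
# Rerun the build verbosely so the actual compiler error is printed,
# not just the "bdist_wheel did not run successfully" summary.
pip install flash-attn -v --no-build-isolation
```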
Hi, it seems that this is a Windows machine and it cannot find the CUDA/GPU? I am not familiar with compiling these on Windows, so I may not be able to offer much help here. One thing I would like to mention is that flash attention is only needed for training, so you may skip installing flash-attn and still run the demo/inference.
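A quick way to check whether the environment can actually see CUDA before attempting the build (a sketch assuming PyTorch is already installed; these commands are illustrative, not from the thread):

```
# Does PyTorch see a GPU, and which CUDA version was it built against?
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"
# Is the CUDA compiler on PATH? Building flash-attn requires nvcc.
nvcc --version
```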
I want to run the training code, and the error is:

The OS is CentOS Linux release 7.6.1810 (Core) x86_64, the CUDA version is 11.4, and the GPU is NVIDIA A100-SXM4-80GB. Thanks in advance.
Hi @Richar-Du, sorry I just saw your comment. I am not sure if this is the cause, as it is an issue with the
@Richar-Du Maybe you can check your NVCC version; you should use NVCC > 11.7. Hope this helps!
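A short sketch of that check, plus one possible upgrade path (the conda channel and the `cuda-nvcc` package pin below are assumptions about your setup, not something stated in this thread):

```
# Print the toolkit version; the suggestion above is 11.7 or newer.
nvcc --version
# Assumed upgrade path inside a conda env; match the pin to your driver.
conda install -c nvidia cuda-nvcc=11.7
```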
I met the same problem. Here's how to solve it.