Building wheel for flash-attn (pyproject.toml) did not run successfully #224
Comments
There should be a longer log than that, do you have it?
Seems to be an incompatibility with the g++ version, thanks @Jingsong-Yan!
Hi @jesswhitts, how do I determine which g++ version is compatible?
I got the following error, which states the compatible range: RuntimeError: The current installed version of g++ (4.8.5) is less than the minimum required version by CUDA 11.6 (6.0.0). Please make sure to use an adequate version of g++ (>=6.0.0, <12.0).
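As a quick way to act on that message, here is a minimal sketch in plain POSIX shell. The `gxx_ok` helper is my own illustration, not part of any toolchain; the [6, 12) range comes from the error above and is specific to CUDA 11.6 (other CUDA releases allow different compiler ranges):

```shell
# Check whether a g++ version falls in CUDA 11.6's supported range (>=6.0, <12.0).
# The function takes a version string so it can be tested directly; in practice
# you would feed it the output of `g++ -dumpversion`.
gxx_ok() {
  major="${1%%.*}"            # major component, e.g. 4 from "4.8.5"
  [ "$major" -ge 6 ] && [ "$major" -lt 12 ]
}

# The version from the error above is rejected; a modern one passes:
gxx_ok 4.8.5 || echo "4.8.5 is too old for CUDA 11.6"
gxx_ok 11.4.0 && echo "11.4.0 is in range"
```

If the check fails, installing a newer g++ (e.g. via your distribution's package manager) and pointing `CXX` at it before rebuilding is the usual remedy.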
Building wheels for collected packages: flash-attn × python setup.py bdist_wheel did not run successfully.
note: This error originates from a subprocess, and is likely not a problem with pip.
Building wheel for flash-attn (setup.py) ... error × python setup.py bdist_wheel did not run successfully.
File "/root/miniconda3/envs/Modelscope/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1909, in _run_ninja_build note: This error originates from a subprocess, and is likely not a problem with pip.
No luck, still getting the error.
Same error here; g++ 10.2.
Still the same error. I have g++ 11.4 on an Ubuntu system with CUDA 11.5.
torch 2.1.0. After running apt-get update && apt-get install -y g++, the error is as follows: Building wheels for collected packages: flash-attn × python setup.py bdist_wheel did not run successfully.
note: This error originates from a subprocess, and is likely not a problem with pip.
conda install -c "nvidia/label/cuda-11.8.0" cuda-toolkit
I ran into the same issue; I suspect a redirected download got stuck partway through. I cloned with git clone git@github.com:Dao-AILab/flash-attention.git. Note that you may hit an error saying flash-attention/csrc/cutlass cannot be found, because git failed to download cutlass. After fetching it, re-running python setup.py install compiled successfully.
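To make that failure mode easy to spot before building, here is a minimal sketch. The `cutlass_ready` helper is my own illustration, not part of the repo; the `csrc/cutlass` path is the one named in the error above:

```shell
# A submodule that git failed to fetch shows up as an empty directory.
# This helper reports whether a directory exists and is non-empty, which is
# enough to tell a fetched csrc/cutlass from a missing one.
cutlass_ready() {
  [ -d "$1" ] && [ -n "$(ls -A "$1" 2>/dev/null)" ]
}

# Run from the flash-attention checkout:
if cutlass_ready csrc/cutlass; then
  echo "cutlass present; the build should find it"
else
  echo "cutlass missing; run: git submodule update --init --recursive"
fi
```

If the check reports the submodule missing, `git submodule update --init --recursive` followed by re-running `python setup.py install` matches the fix described above.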
Hi @shuyhere, after trying your solution and using flash-attn v1.0.5, it works. (Remark: I use v1.0.5 because I'm on a T4 GPU.)
Installing a pre-built wheel from the releases page works for me; in my case:
It works for me. |
I have this issue on Windows, is there any fix for that?
Hello,
I am trying to install via pip into a conda environment, with A100 GPU, cuda version 11.6.2.
I get the following, not very informative, error:
Building wheels for collected packages: flash-attn
error: subprocess-exited-with-error
× Building wheel for flash-attn (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
Building wheel for flash-attn (pyproject.toml) ... error
ERROR: Failed building wheel for flash-attn
Failed to build flash-attn
ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects
Many thanks,
Jess