NameError: name 'MHA' is not defined #10
Trying to load the model from Hugging Face across several environments (Jupyter Notebook, local on macOS, all Python 3.10 with the latest version of transformers) yields the NameError above. The code I'm using is the sample code from your Hugging Face example!
Comments
Did you install flash attention? |
Yep - realized it’s likely an issue with the GPU I am using. Trying on an A100. |
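An aside for anyone landing here: this NameError usually means the flash_attn import failed earlier, so the name was never bound. A minimal sketch to check this directly, assuming MHA lives at flash_attn.modules.mha (its location in recent flash-attn releases; verify against the version you installed):
```python
# Check whether flash-attention imports cleanly; if it does not, model code
# that guards this import can later fail with: NameError: name 'MHA' is not defined.
try:
    # Assumption: MHA is exposed at flash_attn.modules.mha in the installed release.
    from flash_attn.modules.mha import MHA  # noqa: F401
    import flash_attn
    print("flash_attn", getattr(flash_attn, "__version__", "?"), "imported OK")
except ImportError as exc:
    print("flash_attn import failed:", exc)
```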
Was this resolved? |
I'm getting the same error on Apple M3, any solutions? |
I have the following issue with A100 |
I encounter this issue as well; it seems that flash_attn has some problems. |
I can't solve this problem; I'm using an RTX 3090. |
It sounds like these are all flash attention compatibility issues with specific GPUs. |
Hi! After experimenting, I can conclude that this has nothing to do with the GPU; at least, the RTX 3090 doesn't cause this problem. |
Dear all, this problem was finally solved by changing the flash_attn version. |
Same issue for me using cuda_11.7.r11.7 |
The problem for me was that the triton package was not installed; installing it solved the problem. Also, I checked here https://github.com/Dao-AILab/flash-attention/releases and downloaded the right prebuilt wheel for my setup. |
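If you go the prebuilt-wheel route, the wheel you pick from that releases page has to match your local Python, PyTorch, and CUDA versions. A small sketch (standard PyTorch calls only) to print the values to compare against the wheel filename:
```python
# Print the local version strings a prebuilt flash-attn wheel has to match.
import platform
import torch

print("python   :", platform.python_version())
print("torch    :", torch.__version__)
print("cuda     :", torch.version.cuda)               # CUDA version torch was built against
print("cxx11 abi:", torch.compiled_with_cxx11_abi())  # ABI flag also encoded in the wheel filename
```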
I have a similar problem on a V100. I installed several versions of the pytorch, flash_attn, and triton packages, but I receive the following error:
RuntimeError: Internal Triton PTX codegen error:
ptxas /tmp/compile-ptx-src-b231af, line 147; error : Feature '.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 147; error : Feature 'cvt with .f32.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 149; error : Feature '.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 149; error : Feature 'cvt with .f32.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 160; error : Feature '.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 160; error : Feature 'cvt with .f32.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 162; error : Feature '.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 162; error : Feature 'cvt with .f32.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 181; error : Feature '.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 181; error : Feature 'cvt with .f32.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 183; error : Feature '.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 183; error : Feature 'cvt with .f32.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 197; error : Feature '.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 197; error : Feature 'cvt with .f32.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 199; error : Feature '.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 199; error : Feature 'cvt with .f32.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 233; error : Feature '.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 233; error : Feature 'cvt.bf16.f32' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 235; error : Feature '.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 235; error : Feature 'cvt.bf16.f32' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 243; error : Feature '.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 243; error : Feature 'cvt.bf16.f32' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 245; error : Feature '.bf16' requires .target sm_80 or higher
ptxas /tmp/compile-ptx-src-b231af, line 245; error : Feature 'cvt.bf16.f32' requires .target sm_80 or higher
ptxas fatal : Ptx assembly aborted due to errors
Any suggestions on how to fix this problem? |
Sorry, I have no idea what is going on. For me, just installing triton solved the problem. |
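For context on the ptxas output above: the '.bf16' features it complains about require compute capability 8.0 (sm_80, i.e. Ampere-class GPUs such as the A100), while a V100 is sm_70, so bf16 Triton kernels cannot be compiled for it. A quick check, just a sketch using standard PyTorch calls:
```python
# The ".bf16 requires .target sm_80" errors mean the Triton kernel targets
# bfloat16, which needs compute capability >= 8.0 (Ampere or newer).
# A V100 reports (7, 0), so these kernels cannot run on it.
import torch

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability()
    print(f"{torch.cuda.get_device_name()} -> sm_{major}{minor}")
    if (major, minor) < (8, 0):
        print("bf16 kernels unsupported on this GPU; an sm_80+ device "
              "(e.g. A100) or an fp16/fp32 code path is needed.")
else:
    print("No CUDA device visible to PyTorch.")
```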
I'm getting this too. Did you find a solution? |
One suggestion from the flash attention team was to use a nightly build of everything. That solves some problems with flash attention, but I still cannot use evo, as I receive similar errors. Any idea how I can fix it? |
I have the same problem, any solutions? |
Some of these issues appear to be caused by triton not being installed. We've added triton to requirements.txt. Also see #23 |
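As a sanity check after installing from requirements.txt, the two packages this thread keeps tripping over can be verified directly (a minimal sketch; import names assumed to be triton and flash_attn):
```python
# Verify that the optional dependencies are importable in the active environment.
import importlib

for name in ("triton", "flash_attn"):
    try:
        mod = importlib.import_module(name)
        print(f"{name}: {getattr(mod, '__version__', 'version unknown')}")
    except ImportError as exc:
        print(f"{name}: not importable ({exc})")
```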