Prebuilt x86 linux libs use EVEX encoding for AVX instructions, causing SIGILL #25
Comments
Thank you for the issue! I'm a bit hesitant to set that flag.
There are options to enable AVX and AVX2 separately. While LLAMA_AVX512 is indeed disabled, enabling LLAMA_NATIVE also passes the compiler's native-architecture flag.
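For reference, a build invocation along these lines might look as follows. This is a sketch, not the project's actual build script; the flag names are the older llama.cpp CMake options mentioned in this thread, and the exact set may differ between versions.

```shell
# Enable AVX/AVX2 explicitly, but disable LLAMA_NATIVE (which would
# tune for the build machine) and AVX-512, so the resulting binary
# avoids EVEX-encoded instructions on CPUs that lack AVX-512.
cmake -B build \
  -DLLAMA_NATIVE=OFF \
  -DLLAMA_AVX=ON \
  -DLLAMA_AVX2=ON \
  -DLLAMA_AVX512=OFF
cmake --build build --config Release
```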
I just released version 3.0, which upgraded to the newest llama.cpp version. There was a huge amount of change, so I'm not sure whether this issue still applies. To reduce the number of stale issues, I'll close this one for now, but feel free to reopen it if the problem still occurs.
Attempting to use the pre-packaged `libjllama.so` on x86_64 Linux on CPUs that don't support AVX512 causes a SIGILL crash. I dumped the object code and found that an AVX2 instruction was encoded using the EVEX encoding scheme, which was added with the AVX512 extension. I don't have any CPUs that support AVX512, so I can't truly verify that this is the cause of the SIGILL crash, but it seems pretty likely. I think I've pinpointed the cause of this change to ggerganov/llama.cpp#3273, so we probably need to add `LLAMA_NATIVE=OFF` for Linux x86 now.