Unable to run inference in Linux environment #10921
Comments
Hi, I have encountered this issue before. Could you try installing oneAPI 2024.0 and running it again? Hope that helps.
Thank you for your response. I have already installed 2024.0.
Hi @K-Alex13, IPEX-LLM currently only supports oneAPI 2024.0. You can activate oneAPI 2024.0 and try again.
The problem still exists. Could it be because I forgot to uninstall oneAPI 2024.1 and simply installed oneAPI 2024.0 alongside it?
Hi, I would suggest you source 2024.0. Try `cd`-ing to /opt/intel and checking which versions are installed.
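A minimal sketch of the check suggested above, assuming a default oneAPI install prefix of `/opt/intel/oneapi` (adjust to your machine; the exact directory layout and the `ONEAPI_ROOT` variable set by `setvars.sh` may differ between installs):

```shell
# List what is installed under /opt/intel to see which oneAPI versions coexist
ls /opt/intel

# Activate the 2024.0 toolchain for the current shell (path is an assumption;
# on a default install, setvars.sh lives directly under /opt/intel/oneapi):
# source /opt/intel/oneapi/setvars.sh

# Sanity-check that the environment points at 2024.0 before running IPEX-LLM.
# ONEAPI_ROOT is normally exported by setvars.sh; the fallback below is only
# a placeholder for this sketch.
required="2024.0"
active="${ONEAPI_ROOT:-/opt/intel/oneapi/2024.0}"
case "$active" in
  *"$required"*) echo "oneAPI $required active" ;;
  *)             echo "wrong oneAPI version active: $active" ;;
esac
```

If both 2024.0 and 2024.1 are installed side by side, sourcing the 2024.0 script in a fresh shell (rather than on top of an already-sourced 2024.1 environment) avoids mixed `PATH`/`LD_LIBRARY_PATH` entries.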
Hi K-Alex13, have you tried …? If it does not work, would you mind sending the output of …? (The command snippets embedded in this comment were lost when the page was extracted.)
Do we have a package version that works with oneAPI 2024.1 and the new BigDL? Can I find it at http://ec2-52-27-27-201.us-west-2.compute.amazonaws.com/ipex-release.php?device=xpu&repo=us&release=stable ?
I cannot find a correct version to install and run this.
Please help me.