Your current environment
How would you like to use vllm
Hi there!
When I try to run the model with vLLM, I get this error:
ModuleNotFoundError: No module named 'triton'
My local environment is macOS, and vLLM itself installed successfully:
(base) ➜ ~ vLLM -v
INFO 03-16 22:15:08 [__init__.py:256] Automatically detected platform cpu.
0.7.4.dev483+gd1ad2a57
I don't think I should need any NVIDIA libraries to run the model on the CPU, and the documentation on running with the CPU backend doesn't mention any NVIDIA dependencies. Do I still have to install the triton module?
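In case it helps with triage, here is a small diagnostic sketch I can run. It only assumes a working Python environment with vLLM installed; `vllm.platforms.current_platform` is my guess at where the detected platform is exposed, so treat that import as an assumption rather than a documented API.

```python
import importlib.util

# Check whether the 'triton' package is importable at all in this environment.
print("triton importable:", importlib.util.find_spec("triton") is not None)

# Ask vLLM which platform it auto-detected. The 'current_platform' import is
# my assumption about where this lives; adjust if the module layout differs.
try:
    from vllm.platforms import current_platform
    print("vLLM detected platform:", current_platform)
except Exception as exc:
    print("could not query vLLM platform:", exc)
```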
Before submitting a new issue...
Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.