Ollama fails to start in CPU only mode #3526
Comments
Please provide the full log of `ollama serve`!
This is the full log of Ollama from journalctl. It doesn't proceed after this; that's the problem.
I am facing the exact same error. The only thing we did was try to delete/create custom models from Modelfiles, and then the error started to appear. No additional logs about the problem are printed.
@Tamaya31 Were you able to solve this?
@navr32 I enabled
Was able to find a workaround for this issue; hope it helps someone in need. None of the CLI commands work in this case, e.g.:
Sadly no. I went with the Docker image as an alternative for now.
In the container, `OLLAMA_HOST` is set by default to `0.0.0.0` (ref: [Dockerfile#L137]), which is fine when starting the server. However, as a client, it is necessary to use `127.0.0.1` or `localhost` for requests.

fix: ollama#3521 ollama#1337
maybe fix: ollama#3526

[Dockerfile#L137]: https://github.com/ollama/ollama/blob/7e432cdfac51583459e7bfa8fdd485c74a6597e7/Dockerfile#L137

Signed-off-by: Kevin Cui <bh@bugs.cc>
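The distinction the commit describes can be sketched in Python (a hypothetical helper for illustration, not the actual Go code in the PR): `0.0.0.0` is a valid bind address for a listening server, but a client cannot meaningfully dial it and should fall back to loopback.

```python
def client_host(ollama_host: str) -> str:
    """Map an OLLAMA_HOST bind address to an address a client can dial.

    A server bound to 0.0.0.0 listens on every interface, but 0.0.0.0 is
    not a meaningful destination for a client, so fall back to loopback.
    (Illustrative sketch only; the real fix lives in the client code.)
    """
    host = ollama_host.strip() or "127.0.0.1"
    if host == "0.0.0.0":
        return "127.0.0.1"
    return host
```

With this mapping, `client_host("0.0.0.0")` gives `"127.0.0.1"`, while an explicit host such as `"localhost"` passes through unchanged.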
Are you still seeing this failure with the latest version, 0.1.33? If so, can you share an updated server log? The log output you shared above, ending with those two lines, looks like a typical startup, with the server now waiting for requests from the client.
Hey @vishnu-dev, I think everything should be working correctly. It looks like you were connecting to it on the wrong port? If you're still seeing this, please share the server logs as @dhiltgen mentioned and we can reopen.
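A quick way to check whether anything is actually listening where the client expects (a small sketch assuming the default Ollama port 11434; adjust the host/port if you changed `OLLAMA_HOST`):

```python
import urllib.request


def server_reachable(host: str = "127.0.0.1", port: int = 11434,
                     timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at host:port.

    Ollama's root endpoint replies when the server is up; any
    connection error means nothing is listening at that address.
    """
    try:
        with urllib.request.urlopen(f"http://{host}:{port}/",
                                    timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False
```

If this returns `False` on the default port while `ollama serve` appears to be running, the server is likely bound to a different host/port than the client is dialing.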
What is the issue?
Ollama fails to start properly when using in a system with only CPU mode. This happened after I upgraded to latest version i.e. 0.1.30 using the curl command as in the docs. I had to downgrade by uninstalling and installing specific version as in docs, but still the problem persists. Tried with docker after this mishap, still doesn't work.
Following are the logs using `journalctl`:

All commands fail, for example:
What did you expect to see?
Expected normal functioning and launch. Updates shouldn't affect the system.
Steps to reproduce
All steps on a CPU-only configuration:

1. Regular working version: `ollama -v` --> 0.1.25
2. Update Ollama: `curl -fsSL https://ollama.com/install.sh | sh`
3. Check the version now: `ollama -v` --> 0.1.30 --> doesn't start serving

Are there any recent changes that introduced the issue?
Updated Ollama: `curl -fsSL https://ollama.com/install.sh | sh`
OS
Linux
Architecture
x86
Platform
No response
Ollama version
0.1.25
GPU
No response
GPU info
No response
CPU
Intel
Other software
CPU: Intel(R) Xeon(R) Platinum 8272CL CPU @ 2.60GHz
Supports: