
# Ollama on Linux

## Install

Install Ollama by running this one-liner:

```shell
curl https://ollama.ai/install.sh | sh
```

### Manual install

#### Download the ollama binary

Ollama is distributed as a self-contained binary. Download it to a directory in your PATH:

```shell
sudo curl -L https://ollama.ai/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
```
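A quick sketch to confirm the target directory is actually on your PATH before downloading (on stock Linux installs it normally is):

```shell
# Check that /usr/bin (the download target above) appears on PATH.
case ":$PATH:" in
  *:/usr/bin:*) on_path=yes ;;
  *)            on_path=no  ;;
esac
echo "/usr/bin on PATH: $on_path"
```

If this prints `no`, pick a directory that does appear in `echo "$PATH"` as the download target instead.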

### Adding Ollama as a startup service (recommended)

Create a user for Ollama:

```shell
sudo useradd -r -s /bin/false -m -d /usr/share/ollama ollama
```

Create a service file in `/etc/systemd/system/ollama.service`:

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3

[Install]
WantedBy=default.target
```

Then start the service:

```shell
sudo systemctl daemon-reload
sudo systemctl enable ollama
```
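Service settings can be adjusted without editing the unit file itself by using a systemd drop-in. As an illustration, here is a drop-in that sets the server's bind address via `OLLAMA_HOST` (this is the variable Ollama commonly reads for its listen address; verify the exact variable names against your version's documentation):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
# Bind on all interfaces instead of the default 127.0.0.1.
# Assumption: your Ollama version reads OLLAMA_HOST; confirm before relying on it.
Environment="OLLAMA_HOST=0.0.0.0"
```

After adding or changing a drop-in, run `sudo systemctl daemon-reload` followed by `sudo systemctl restart ollama` to apply it.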

### Install CUDA drivers (optional – for Nvidia GPUs)

Download and install CUDA.

Verify that the drivers are installed by running the following command, which should print details about your GPU:

```shell
nvidia-smi
```

### Start Ollama

Start Ollama using systemd:

```shell
sudo systemctl start ollama
```
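Once the service is running you can probe it over HTTP. This sketch assumes the default API port 11434 and that `curl` is available; adjust the address if you have changed the server's bind configuration:

```shell
# Probe the local Ollama API; the server answers plain HTTP on port 11434 by default.
if curl -fsS --max-time 2 http://127.0.0.1:11434/ >/dev/null 2>&1; then
  status=up
else
  status=down
fi
echo "ollama server: $status"
```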

## Update

Update Ollama by running the install script again:

```shell
curl https://ollama.ai/install.sh | sh
```

Or by re-downloading the ollama binary:

```shell
sudo curl -L https://ollama.ai/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
```

## Viewing logs

To view logs of Ollama running as a startup service, run:

```shell
journalctl -u ollama
```

## Uninstall

Remove the ollama service:

```shell
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
```

Remove the ollama binary from your bin directory (either `/usr/local/bin`, `/usr/bin`, or `/bin`):

```shell
sudo rm $(which ollama)
```
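If you are unsure whether the binary is still installed, a guarded variant avoids passing an empty path to `rm` (`echo` stands in for the actual removal here):

```shell
# Locate the binary first; only attempt removal when it is actually present.
bin="$(command -v ollama || true)"
if [ -n "$bin" ]; then
  echo "would remove: $bin"   # replace this echo with: sudo rm "$bin"
else
  echo "ollama binary not found on PATH"
fi
```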

Remove the downloaded models and the Ollama service user:

```shell
sudo rm -r /usr/share/ollama
sudo userdel ollama
```