
allow for a configurable ollama model storage directory #897

Merged
merged 5 commits into from Oct 27, 2023

Conversation

@BruceMacD (Contributor) commented Oct 24, 2023

  • set OLLAMA_MODELS in the environment that ollama is running in to change where models are stored
  • update docs

```shell
$ OLLAMA_MODELS=/Users/bruce/ollama_models ollama serve
# store models in /Users/bruce/ollama_models
```

Resolves #228 #153

I'll hold off on merging this until #847 is in to avoid causing that PR pain.
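For reference, the lookup order described above can be sketched in shell (the fallback path matches the default discussed in this thread; the function name is just for illustration):

```shell
# Resolve the model store the way the PR describes: prefer $OLLAMA_MODELS
# when it is set, otherwise fall back to the default ~/.ollama/models.
resolve_models_dir() {
    echo "${OLLAMA_MODELS:-$HOME/.ollama/models}"
}
```

With `OLLAMA_MODELS` unset this yields the old default location; with it set, the override wins.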

@jmorganca (Member)

This is looking great. Given the main use case here is offloading model storage, we should update it to OLLAMA_MODELS and decide where the models […]

- set OLLAMA_MODELS in the environment that ollama is running in to change where model files are stored
- update docs

Co-Authored-By: Jeffrey Morgan <jmorganca@gmail.com>
Co-Authored-By: Jay Nakrani <dhananjaynakrani@gmail.com>
Co-Authored-By: Akhil Acharya <akhilcacharya@gmail.com>
Co-Authored-By: Sasha Devol <sasha.devol@protonmail.com>
@BruceMacD BruceMacD changed the title allow for a configurable ollama home directory allow for a configurable ollama model storage directory Oct 26, 2023
```
@@ -18,10 +18,6 @@ import (
	"github.com/jmorganca/ollama/version"
)

const DefaultHost = "127.0.0.1:11434"
```

@BruceMacD (Contributor, Author): unused
@BruceMacD BruceMacD merged commit 5c3491f into main Oct 27, 2023
@BruceMacD BruceMacD deleted the brucemacd/ollama-home branch October 27, 2023 14:20
@jikkuatwork (Contributor) commented Nov 5, 2023

Hello, is this change merged and available in the latest release? Even with the OLLAMA_MODELS environment variable set on Linux, the models are still being downloaded to ~/.ollama.

Any ideas?

Note: I am trying to save the models to an external NTFS drive. I don't think that should matter, but it gave me an error when I tried to move the blobs, since their names contain `:`s.

@BruceMacD (Contributor, Author)

@jikkuatwork are you running Ollama as a system service (i.e. how it is installed by default by the Linux install script)?

If so, you'll need to add the environment variable to the system service and restart it. Here's what that looks like:

1. Open the systemd service file to edit it:
```shell
sudo nano /etc/systemd/system/ollama.service
```
2. Add the new environment variable:
```
[Service]
...
Environment="PATH=$PATH"
Environment="OLLAMA_MODELS=/path/to/models"
...
```
3. Reload the systemd daemon:
```shell
sudo systemctl daemon-reload
```
4. Restart the service:
```shell
sudo systemctl restart ollama
```
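A variant of the edit step that survives reinstalls is a drop-in override rather than editing the unit file in place; `sudo systemctl edit ollama` creates one at the conventional path (the file name below is systemd's convention, not something specific to this PR):

```
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_MODELS=/path/to/models"
```

The daemon-reload and restart steps then apply the same way.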

@jikkuatwork (Contributor)

I am just copying the executable to /usr/bin and setting OLLAMA_MODELS in my zshrc. I have restarted the machine a few times since. When I pull a new model it still downloads to ~/.ollama.

@BruceMacD (Contributor, Author)

Hmm, that should work. Try running ollama serve with the environment variable set directly, `OLLAMA_MODELS=/path/to/models ollama serve`, to see if the environment variable works at all. It could also be worth checking that you have the most recent version of Ollama.

@yatinlala commented Nov 7, 2023

> Hmm, that should work, try running ollama serve with the environment variable directly OLLAMA_MODELS=/path/to/models ollama serve to see if the environment variable works at all. Also could be worth checking you have the most recent version of Ollama.

I'm encountering the same problem as @jikkuatwork on v0.1.8. Explicitly setting the variable doesn't seem to be working.

```shell
~ ❯ OLLAMA_MODELS=~/.local/share/ollama ollama serve
Couldn't find '/home/lala/.ollama/id_ed25519'. Generating new private key.
```

Update: it seems an ollama folder is correctly being created at OLLAMA_MODELS and used to store blobs. The ~/.ollama folder in the home directory is only being used to store the SSH key pair. This might be intended behaviour; if it is, I really hope it can be changed. It looks like a bug according to the PR description, so hopefully it can be fixed.

@BruceMacD (Contributor, Author)

Ah, that's it, thanks @yatinlala. The part about public keys is a typo from the original behavior in this issue. I'll edit that and leave it for #228.

@Crypto69

Does this workaround work when running the macOS ollama app? I tried setting the environment variable in my shell and on the command line.

```shell
~ ollama list
NAME                            	ID          	SIZE  	MODIFIED
deepseek-coder:33b              	2941d6ab92f3	18 GB 	3 weeks ago
deepseek-coder:33b-instruct-q2_K	92b1e8ffe46e	14 GB 	3 weeks ago
deepseek-coder:6.7b             	72be2442d736	3.8 GB	3 weeks ago
deepseek-coder:latest           	140a485970a6	776 MB	3 weeks ago
llama2:latest                   	fe938a131f40	3.8 GB	3 weeks ago
llama2-uncensored:latest        	44040b922233	3.8 GB	3 weeks ago
mistral:latest                  	1ab49bc0b6a8	4.1 GB	14 minutes ago
wizard-vicuna-uncensored:13b    	6887722b6618	7.4 GB	3 weeks ago
wizardlm-uncensored:13b-llama2  	886a369d74fc	7.4 GB	3 weeks ago
~ echo $OLLAMA_MODELS
/Volumes/ExternalHD/ollama-models
~ ollama run codellama
pulling manifest
pulling manifest
pulling manifest
pulling manifest
pulling manifest
```

However the model is still getting downloaded to ~/.ollama/models

How do I get this working on macOS?

@BananaAcid

Great addition. It should probably be added to the README.md.

@jikkuatwork (Contributor)

Sadly, it's still giving me an error. This is how I am running it:

```shell
OLLAMA_MODELS=/external/ntfs/location ollama serve
```

But when I try to pull a model, it gives me an error:

```shell
ollama pull phi
pulling manifest
Error: open /external/ntfs/location/blobs/sha256:bd608f9545597ea3278b78038943059d1c29c62f3ca02c86523014f3a8c7a7f1-partial-0: invalid argument
```
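The `invalid argument` is consistent with the filesystem rejecting the `:` in the blob file name (`sha256:<digest>`): NTFS file names may not contain `:`, and whether a Linux NTFS mount enforces that depends on the driver and mount options. As a quick sanity check you can probe whether a given mount accepts the character (a sketch; the function name is made up):

```shell
# Print "ok" if the directory accepts ':' in file names, "unsupported" otherwise.
probe_colon_support() {
    dir="$1"
    if touch "$dir/sha256:probe" 2>/dev/null; then
        rm -f "$dir/sha256:probe"
        echo "ok"
    else
        echo "unsupported"
    fi
}
```

If this prints "unsupported" for the external drive, the model store needs a filesystem that permits `:` in names (e.g. ext4).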

@elianemaalouf

I am having exactly the same problem as @jikkuatwork. I am on Ubuntu and have updated ollama.

Setting the OLLAMA_MODELS env variable in /etc/systemd/system/ollama.service does not work, and using `OLLAMA_MODELS=/other/location ollama serve` gives an "invalid argument" error when pulling the manifest (even though the change of models directory itself works: it creates a blobs folder in the new location).

Thank you for your support.

@captaincurrie commented Jan 1, 2024

I'm running Arch Linux and setting OLLAMA_MODELS has no effect; it still defaults to $HOME/.ollama/models.

@aburlot commented Jan 4, 2024

I'm using ollama version 0.1.17 on Ubuntu 20.04. It was not working with an exported OLLAMA_MODELS variable, but it worked well after putting it in the service file as suggested here. I also tested on another computer with Ubuntu 20.04, and OLLAMA_MODELS is correctly taken into account when the binary is installed in my /path/to/local/bin.

@jikkujose

> (quoting @BruceMacD's systemd service instructions above)

Any idea why this works and not the variable approach?

@aburlot commented Jan 7, 2024

It might be because systemctl does not run a shell with your user environment, so if you export the variable in your bashrc, systemd has no knowledge of it. You could use `EnvironmentFile=` to point at a file and simply edit that file. It's well described here.
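A minimal sketch of that approach, with hypothetical file locations (any path readable by the service works for the environment file):

```
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
EnvironmentFile=/etc/ollama/env

# /etc/ollama/env
OLLAMA_MODELS=/path/to/models
```

After editing, run `sudo systemctl daemon-reload && sudo systemctl restart ollama` as before; future changes only need an edit to the environment file and a restart.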

@ma3oun commented Jan 9, 2024

It could be helpful to have a cache (with a configurable size, in number of models or disk space) for commonly used models. It is nice to store models on external storage, but it's better to avoid downloading them every time I call the ollama server.

@sagaholdennoren

On my system, adding the environment line did not work; Ollama simply did not start with that line added. But this did work:

```shell
sudo nano /etc/systemd/system/ollama.service.d
```

Add the line:

```
OLLAMA_MODELS=/path/to/models ollama serve
```

And now it works.

@nps798 commented Feb 6, 2024

@sagaholdennoren: what's in your /etc/systemd/system/ollama.service right now? How did you set it?

@sagaholdennoren commented Feb 7, 2024

@nps798: how to set it on Linux is in my earlier post. Please note the additional .d in /etc/systemd/system/ollama.service.d; the actual ollama.service file is unchanged.

Open a terminal:

```shell
sudo nano /etc/systemd/system/ollama.service.d
```

(or use whichever editor you prefer instead of nano)

Now add the line (the file is most probably empty): OLLAMA_MODELS=/path/to/models ollama serve
Save it with Ctrl+X, yes, Enter, then reboot.

I'm unaware how to do this on other systems like macOS or Windows, but I'm sure you can ask ChatGPT 3.5.

@jikkuatwork (Contributor)

Is it necessary to install the executable via the script? Won't copying the binary work?

I am using the latest version (0.1.28) and I have set OLLAMA_MODELS=/path before the command as well as in /etc/systemd/system/ollama.service, but when I try to pull something this is what I get:

```shell
➜ ollama pull <MODEL>
pulling manifest
Error: open /media/username/HDD_IDENTITY/ollama/cache/blobs/sha256:long-hash-partial-0: invalid argument
```

I have a hunch that this is somehow related to the drive: the HDD is NTFS-formatted. I suspect this because it also failed when I copied the ~/.ollama folder to the NTFS drive and tried to symlink it.

It would be super helpful if anyone can help. I can't test models because my home folder doesn't have the space for it.

@kennethwork101

I am trying to set OLLAMA_MODELS to a directory on an external disk where I have space. I am running Ubuntu 22.04 and the ollama version is 0.1.28.

When I point OLLAMA_MODELS at a directory on the external disk, ollama fails to start: somehow ollama wants to mkdir my disk's mount point. But it works when I put it in /tmp/myollama/models.

```shell
sudo mkdir -p /media/kenneth/T9/myollama/models
sudo chown -R ollama:ollama /media/kenneth/T9/myollama/models
```

I put this line under [Service] in /etc/systemd/system/ollama.service:

```
Environment="OLLAMA_MODELS=/media/kenneth/T9/myollama/models"
```

and ran the following commands:

```shell
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

ollama failed to start with the following error:

```
Mar 07 19:29:58 kenneth-MS-7E06 systemd[1]: ollama.service: Scheduled restart job, restart counter is at 65.
Mar 07 19:29:58 kenneth-MS-7E06 systemd[1]: Stopped Ollama Service.
Mar 07 19:29:58 kenneth-MS-7E06 systemd[1]: Started Ollama Service.
Mar 07 19:29:58 kenneth-MS-7E06 ollama[24445]: Error: mkdir /media/kenneth/T9: permission denied
Mar 07 19:29:58 kenneth-MS-7E06 systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
Mar 07 19:29:58 kenneth-MS-7E06 systemd[1]: ollama.service: Failed with result 'exit-code'.
```

But using a directory under /tmp works, with Environment="OLLAMA_MODELS=/tmp/myollama/models":

```shell
sudo mkdir -p /tmp/myollama/models
sudo chown -R ollama:ollama /tmp/myollama/models
sudo systemctl daemon-reload
sudo systemctl restart ollama
ollama pull phi
```

How can I get it to work on the external disk?

```shell
df -k /media/kenneth/T9
Filesystem 1K-blocks Used Available Use% Mounted on
/dev/sda2 3907001340 888595200 3018406140 23% /media/kenneth/T9
```

@frenchfaso
> (quoting @kennethwork101's comment above)

Same issue here: trying to use a folder on a mounted USB disk fails, and ollama tries to "mkdir" my mount point.

@xmsi commented Mar 24, 2024

Ollama does not work with the environment variable if I use the systemctl service by default:

```
ollama[20988]: Error: mkdir /media/user/old_ubuntu: permission denied
```

It tries to create the folder.

However, if I run ollama serve from the CLI and chmod -R 777 the models folder on the mounted drive (in my case ollama_models, see the command below), it works well:

```shell
$ OLLAMA_MODELS=/media/user/old_ubuntu/home/user/Documents/ollama_models ollama serve
```

@markkamp

Ran into the same problem; this just seems to be a Linux permission error, though.

When running from the CLI, you run ollama under your own user account, so you probably have no permission issues: the folders are most likely owned by your own account. The service, however, runs as user/group ollama:ollama. So if we change the OLLAMA_MODELS directory to, say, /home/user/Ollama, not only should we give the Ollama folder to the user ollama, the parent directories must also be accessible by the user ollama.

In this case the parent directory /home/user is owned by user:user. If we check the permissions (ls -all /home) we see drwxr-x---, so the user ollama can't access this folder, as the execute bit for Others is not set. We can fix this by running chmod o+x /home/user. This is, though, a security risk for the /home/user folder, as every user on the system can now traverse it.
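To find where traversal breaks, `namei -l /home/user/Ollama` shows the permissions on every path component; the same check can be scripted (a sketch that tests the current user, so run it via `sudo -u ollama` to test the service account; the function name is made up):

```shell
# Walk up from a directory and report the first ancestor the current
# user cannot traverse (missing execute/search permission).
check_traverse() {
    path="$1"
    while [ -n "$path" ] && [ "$path" != "/" ]; do
        if [ ! -x "$path" ]; then
            echo "blocked at $path"
            return 1
        fi
        path="$(dirname "$path")"
    done
    echo "traversable"
}
```

If this reports a blocked ancestor for the ollama user, that directory is where a chmod o+x (or an ACL) is needed.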

In my case, I want the models in /home, as this is on a different (larger) partition, so I created a new /home/ollama folder and gave ownership to the ollama user:

```shell
sudo chown -R ollama:ollama /home/ollama
```

Added the model path to the service file:

```shell
sudo systemctl edit --full ollama.service
```

Add under [Service]:

```
Environment="OLLAMA_MODELS=/home/ollama/Ollama"
```

And restarted the service:

```shell
sudo systemctl restart ollama.service
```

As the ollama user account has full control over the (sub)folders, no errors should be given:

```shell
sudo systemctl status ollama.service
```

@saratonite

I just created a symbolic link:

```shell
ln -s /path/you/want ~/.ollama
```
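For the symlink route, the link has to be in place before the server recreates ~/.ollama, so with ollama stopped, the usual sequence is to move the existing store first and then link it (a sketch; the function name is made up and the target path is whatever you choose):

```shell
# Move the existing ~/.ollama store to a new location and leave a
# symlink behind so ollama keeps finding it at the old path.
relocate_ollama_store() {
    target="$1"                      # e.g. /mnt/bigdisk/ollama
    mv "$HOME/.ollama" "$target"
    ln -s "$target" "$HOME/.ollama"
}
```

Note the NTFS caveat from earlier in the thread still applies: the blobs keep their `sha256:`-prefixed names, so the link target must be on a filesystem that allows `:` in file names.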
