ollama / ollama

Get up and running with Llama 3.2, Mistral, Gemma 2, and other large language models.

Home Page: https://ollama.com


`OLLAMA_MODELS` environment variable is not respected

JLCarveth opened this issue

What is the issue?

I have followed the steps here to change where Ollama stores the downloaded models.

I made sure to run `systemctl daemon-reload` and to restart the ollama service, yet it is still storing the model blobs in /usr/share/ollama/... instead of the location specified in `OLLAMA_MODELS`.

I expect Ollama to download the models to the specified location. I have insufficient space left on my root partition, which is why I am trying to download the models to my home directory instead.

`sudo systemctl edit ollama.service`:

### Anything between here and the comment below will become the contents of the drop-in file

Environment="OLLAMA_MODELS=/home/jlcarveth/.ollama/models"

OS

Linux

GPU

AMD

CPU

Intel

Ollama version

0.1.32

Your drop-in file is missing a `[Service]` section header, so systemd doesn't know which section the configuration belongs to.

This should be the full contents of the configuration file:

[Service]
Environment="OLLAMA_MODELS=/home/jlcarveth/.ollama/models"

The FAQ example uses `OLLAMA_HOST`, but the idea is the same.
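A quick way to catch this mistake is to check that every `Environment=` line in a drop-in actually falls under a `[Service]` header, since systemd silently ignores directives placed outside any section. A minimal sketch (the `check_dropin` helper name is hypothetical, not part of systemd):

```shell
# Hypothetical helper: flag Environment= lines that sit outside a
# [Service] section of a unit drop-in file.
check_dropin() {
  awk '
    /^\[Service\]/                  { in_service = 1; next }  # entering [Service]
    /^\[/                           { in_service = 0 }        # any other section
    /^Environment=/ && !in_service  {
      print "WARNING: outside [Service]: " $0
      bad = 1
    }
    END { exit bad }
  ' "$1"
}
```

Run against the drop-in shown above, this would print a warning for the bare `Environment=` line; with the `[Service]` header in place it exits cleanly.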