ollama / ollama

Get up and running with Llama 3, Mistral, Gemma, and other large language models.

Home Page: https://ollama.com


Is it possible to enable the OpenAI API in the Docker image?

Tomichi opened this issue · comments

Hi,
I want to ask whether it is possible to enable OpenAI API compatibility in the official Ollama Docker image. The feature works well in the desktop app (https://ollama.com/blog/openai-compatibility), but it seems to be missing in the Docker image.

Thank you to anybody who can help.

Hi @Tomichi, yes, it should work out of the box. Make certain you have the newest version of ollama by running docker pull ollama/ollama.

You can then start ollama in docker with:

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

Follow the directions in the documentation if you want to use the GPU, and make sure there isn't a port conflict with any locally running copy of ollama (it defaults to port 11434).
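For the GPU case, a minimal sketch of the NVIDIA variant, assuming the NVIDIA Container Toolkit is installed on the host (the `--gpus=all` flag is Docker's standard GPU option; see the ollama Docker documentation for AMD and other setups):

```shell
# Run ollama in Docker with all NVIDIA GPUs exposed to the container.
# Requires the NVIDIA Container Toolkit on the host; otherwise the flags
# are the same as the CPU-only command above.
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```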

You can then pull a model to ollama and use it:

% ollama pull orca-mini
pulling manifest
pulling 66002b78c70a... 100% ▕███████████████████████████████████████████████████████████
pulling dd90d0f2b7ee... 100% ▕███████████████████████████████████████████████████████████
pulling 93ca9b3d83dc... 100% ▕███████████████████████████████████████████████████████████
pulling 33eb43a1488d... 100% ▕███████████████████████████████████████████████████████████
pulling fd52b10ee3ee... 100% ▕███████████████████████████████████████████████████████████
verifying sha256 digest
writing manifest
removing any unused layers
success
% curl localhost:11434/v1/chat/completions -H "Content-Type: application/json" \
-d '{
        "model": "orca-mini",
        "messages": [
            {
                "role": "system",
                "content": "You are a helpful assistant."
            },
            {
                "role": "user",
                "content": "Hello!"
            }
        ]
    }' | jq
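Since the endpoint speaks the OpenAI chat-completions format, the same request can be driven from code. A minimal Python sketch using only the standard library: it builds the request body from the curl example above and shows where the reply sits in a chat-completions response (the sample response here is illustrative, not real server output):

```python
import json

# The same OpenAI-compatible request body the curl example sends to
# localhost:11434/v1/chat/completions.
payload = {
    "model": "orca-mini",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}
body = json.dumps(payload).encode("utf-8")

# Parsing a response: in the OpenAI chat-completions format, the assistant's
# reply is at choices[0].message.content. (Illustrative sample, not real output.)
sample = json.loads('{"choices": [{"message": {"role": "assistant", "content": "Hi!"}}]}')
print(sample["choices"][0]["message"]["content"])  # → Hi!
```

With a running container, POSTing `body` to http://localhost:11434/v1/chat/completions with a Content-Type: application/json header should return a response of this shape; the official openai Python client can also be pointed at the same base URL.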

Hopefully this helps! I'll close out the issue.

Thank you, yes, it was a schoolboy mistake on my part: I didn't have the newest version of the Docker image, and that's why I got a 404 error when sending requests to ollama in Docker. It works well now. Thank you @pdevine.