Unable to access Ollama models
arsaboo opened this issue
I managed to get everything running, but it is not able to access the local models. Even after I enable local models in the UI, nothing is returned. I also don't see any activity in the Ollama logs. Here's my .env:
OPENAI_API_KEY=KEY
TAVILY_API_KEY=KEY
ENABLE_LOCAL_MODELS=True
OLLAMA_HOST=http://host.docker.internal:11434
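If it helps to rule out basic connectivity, here's a minimal sketch of a check that can be run from inside the backend container (it only assumes Python is available there and that OLLAMA_HOST is set as above); the endpoints are the standard Ollama HTTP API:

# Minimal connectivity check against Ollama; run it where the backend runs
# so it uses the same network path as the container. OLLAMA_HOST falls back
# to the value used in the .env above.
import json
import os
import urllib.request

host = os.environ.get("OLLAMA_HOST", "http://host.docker.internal:11434").rstrip("/")

# The root endpoint returns the plain-text banner "Ollama is running".
with urllib.request.urlopen(f"{host}/", timeout=5) as resp:
    print(resp.read().decode())

# /api/tags lists the models that have been pulled locally.
with urllib.request.urlopen(f"{host}/api/tags", timeout=5) as resp:
    models = json.load(resp).get("models", [])
    print("models:", [m["name"] for m in models])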
I don't see anything else in the logs. In the browser console, I see:
Here's my compose file (had to update the ports):
services:
  backend:
    build:
      context: .
      dockerfile: ./src/backend/Dockerfile.dev
    ports:
      - "8003:8000"
    environment:
      - OLLAMA_HOST=${OLLAMA_HOST:-http://host.docker.internal:11434}
      - TAVILY_API_KEY=${TAVILY_API_KEY}
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - GROQ_API_KEY=${GROQ_API_KEY}
      - ENABLE_LOCAL_MODELS=${ENABLE_LOCAL_MODELS:-True}
    env_file:
      - .env
    develop:
      watch:
        - action: sync
          path: ./src/backend
          target: /workspace/src/backend
    extra_hosts:
      - "host.docker.internal:host-gateway"

  frontend:
    depends_on:
      - backend
    build:
      context: .
      dockerfile: ./src/frontend/Dockerfile.dev
    environment:
      - NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL:-http://localhost:8003}
      - NEXT_PUBLIC_LOCAL_MODE_ENABLED=${NEXT_PUBLIC_LOCAL_MODE_ENABLED:-true}
    ports:
      - "3013:3000"
    develop:
      watch:
        - action: sync
          path: ./src/frontend
          target: /app
          ignore:
            - node_modules/
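Incidentally, a rough way to confirm the remapped ports are actually answering from the docker host (any HTTP response, even a 404, means the container is reachable); this is just a sketch assuming the defaults above:

# Check that the remapped ports answer from the docker host
# (backend on 8003, frontend on 3013, matching the compose file above).
import urllib.error
import urllib.request

for name, url in [("backend", "http://localhost:8003"), ("frontend", "http://localhost:3013")]:
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            print(f"{name}: HTTP {resp.status}")
    except urllib.error.HTTPError as exc:
        print(f"{name}: HTTP {exc.code} (reachable)")
    except OSError as exc:
        print(f"{name}: not reachable ({exc})")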
Hey Alok, when you visit http://localhost:11434/, do you see "Ollama is running"?
Figured it out:
NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL:-http://localhost:8003}
is binding to localhost. I changed it to the private IP of the device:

NEXT_PUBLIC_API_URL=${NEXT_PUBLIC_API_URL:-http://192.168.2.162:8003}
and it works now.
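For anyone else who hits this: NEXT_PUBLIC_* variables in Next.js are exposed to the browser, so a localhost API URL points at whatever machine the browser runs on rather than the backend container. A rough check that the URL you pick is reachable from the browser's machine (the IP below is just the example value from the comment above, swap in your own):

# Run this on the machine whose browser you use. Any HTTP response, even a
# 404, means the backend answered; only connection errors mean the browser
# machine can't reach it.
import urllib.error
import urllib.request

api_url = "http://192.168.2.162:8003"  # the private-IP value from the fix above
try:
    with urllib.request.urlopen(api_url, timeout=5) as resp:
        print(f"backend reachable: HTTP {resp.status}")
except urllib.error.HTTPError as exc:
    print(f"backend reachable: HTTP {exc.code}")
except OSError as exc:
    print(f"backend NOT reachable: {exc}")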