[Bug]: Local Ollama models do not work in Windows when OLLAMA_HOST is set to 0.0.0.0
ksylvan opened this issue
Kayvan Sylvan commented
What happened?
I set OLLAMA_HOST to 0.0.0.0, so when the Ollama server starts it binds to all interfaces, including localhost and the WSL virtual network interface.
This works correctly, and I can reach the Ollama server from Windows:
PS C:\Users\kayvan> Invoke-WebRequest -Uri http://localhost:11434
StatusCode : 200
StatusDescription : OK
Content : Ollama is running
RawContent : HTTP/1.1 200 OK
Date: Sun, 31 Mar 2024 03:00:54 GMT
Content-Type: text/plain; charset=utf-8
Content-Length: 17
Ollama is running
And in WSL:
kayvan@SHAKTI:/mnt/c/Users/kayvan$ ip route
default via 172.23.128.1 dev eth0
172.23.128.0/20 dev eth0 proto kernel scope link src 172.23.128.141
kayvan@SHAKTI:/mnt/c/Users/kayvan$ curl http://172.23.128.1:11434; echo ""
Ollama is running
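From inside WSL, the Windows host is reachable at the default gateway of the WSL virtual network, which is why the curl above uses 172.23.128.1. A minimal sketch of deriving that endpoint programmatically (the helper name windows_host_url is hypothetical, and the ip route output format is the one shown above):

```python
def windows_host_url(ip_route_output: str, port: int = 11434) -> str:
    """Derive the Windows-host Ollama URL from WSL's `ip route` output.

    On WSL, the default gateway of the virtual network is the Windows
    host, which is where Ollama listens when OLLAMA_HOST=0.0.0.0.
    """
    for line in ip_route_output.splitlines():
        parts = line.split()
        if parts[:2] == ["default", "via"]:
            return f"http://{parts[2]}:{port}"
    raise ValueError("no default route found")

# Example using the route table from this issue:
routes = """default via 172.23.128.1 dev eth0
172.23.128.0/20 dev eth0 proto kernel scope link src 172.23.128.141"""
print(windows_host_url(routes))  # -> http://172.23.128.1:11434
```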
However, running fabric --listmodels fails to list the local Ollama models.
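The likely root cause is a client that hardcodes the Ollama endpoint instead of honoring the OLLAMA_HOST environment variable. A sketch of how a client could resolve the base URL, assuming the hypothetical helper name ollama_base_url (fabric's actual fix is in the linked PR):

```python
import os

DEFAULT_OLLAMA_URL = "http://localhost:11434"

def ollama_base_url() -> str:
    """Resolve the Ollama base URL, honoring OLLAMA_HOST when set.

    OLLAMA_HOST may be a bare host ("0.0.0.0"), a host:port pair, or a
    full URL; normalize it to an http URL, adding the default port when
    one is missing.  (Hypothetical helper, not fabric's real code.)
    """
    host = os.environ.get("OLLAMA_HOST", "").strip()
    if not host:
        return DEFAULT_OLLAMA_URL
    if not host.startswith(("http://", "https://")):
        host = "http://" + host
    # Append the default Ollama port if none follows the scheme.
    scheme, _, rest = host.partition("://")
    if ":" not in rest:
        host = f"{scheme}://{rest}:11434"
    return host

os.environ["OLLAMA_HOST"] = "172.23.128.1"
print(ollama_base_url())  # -> http://172.23.128.1:11434
```

With this resolution in place, a WSL user can point fabric at the Windows host by exporting OLLAMA_HOST=172.23.128.1 before running it.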
See the discussion in #271
Version check
- Yes, I was running the latest version.
Relevant log output
No response
Relevant screenshots (optional)
No response
Kayvan Sylvan commented
I have a quick PR. Please assign this to me @danielmiessler
Kayvan Sylvan commented
The above PR fixes this.
Daniel Miessler commented
Added, thank you!
Kayvan Sylvan commented
Hi @danielmiessler, this issue shouldn't be closed until PR #315 is merged.