danielmiessler / fabric

fabric is an open-source framework for augmenting humans using AI. It provides a modular system for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.

Home Page: https://danielmiessler.com/p/fabric-origin-story

[Bug]: Local Ollama models do not work in Windows when OLLAMA_HOST is set to 0.0.0.0

ksylvan opened this issue

What happened?

I set OLLAMA_HOST to 0.0.0.0 so that when the Ollama server starts, it binds to both localhost and the WSL interface.

This works correctly, and I can reach the Ollama server:

PS C:\Users\kayvan> Invoke-WebRequest -Uri http://localhost:11434

StatusCode        : 200
StatusDescription : OK
Content           : Ollama is running
RawContent        : HTTP/1.1 200 OK
                    Date: Sun, 31 Mar 2024 03:00:54 GMT
                    Content-Type: text/plain; charset=utf-8
                    Content-Length: 17

                    Ollama is running

And in WSL:

kayvan@SHAKTI:/mnt/c/Users/kayvan$ ip route
default via 172.23.128.1 dev eth0
172.23.128.0/20 dev eth0 proto kernel scope link src 172.23.128.141
kayvan@SHAKTI:/mnt/c/Users/kayvan$ curl http://172.23.128.1:11434; echo ""
Ollama is running
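The same reachability check can be scripted. Below is a minimal sketch in Python; the requests dependency is an assumption, and the 172.23.128.1 gateway address is taken from the ip route output above, so adjust both for your setup:

import requests

# Probe the Ollama endpoints shown above: localhost from the Windows side,
# and the WSL gateway address from inside WSL.
for url in ("http://localhost:11434", "http://172.23.128.1:11434"):
    try:
        reply = requests.get(url, timeout=3)
        print(url, "->", reply.status_code, reply.text.strip())
    except requests.RequestException as exc:
        print(url, "-> unreachable:", exc)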

However, fabric --listmodels fails to list the local models.

See the discussion in #271.
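For context, a plausible explanation is that the client builds its connection URL directly from OLLAMA_HOST, and 0.0.0.0 is a bind address rather than a reachable destination on Windows, so the connection fails. A minimal sketch of how a client could resolve a connectable URL and list the local models follows; the helper names are hypothetical, the requests dependency is an assumption, and fabric's actual code may differ:

import os
import requests  # assumption: fabric's real HTTP client may differ

def ollama_base_url() -> str:
    # OLLAMA_HOST may be unset, "0.0.0.0", "0.0.0.0:11434", or a full URL.
    host = os.environ.get("OLLAMA_HOST", "127.0.0.1:11434")
    if "://" not in host:
        host = "http://" + host
    if host.count(":") < 2:  # no explicit port after the scheme
        host += ":11434"
    # 0.0.0.0 tells the server to listen on all interfaces; it is not a
    # destination you can connect to on Windows, so fall back to loopback.
    return host.replace("0.0.0.0", "127.0.0.1")

def list_local_models() -> list[str]:
    # Ollama exposes the installed models at /api/tags.
    resp = requests.get(ollama_base_url() + "/api/tags", timeout=5)
    resp.raise_for_status()
    return [m["name"] for m in resp.json().get("models", [])]

print("\n".join(list_local_models()))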

Version check

  • Yes I was.

Relevant log output

No response

Relevant screenshots (optional)

No response

I have a quick PR. Please assign this to me, @danielmiessler.

The above PR fixes this.

Added, thank you!

Hi @danielmiessler, this issue isn't closed until PR #315 is merged.