Support Ollama's API for the local provider option
rsolvang opened this issue
Is your feature request related to a problem? Please describe.
No.
Describe the solution you'd like
I would like to connect to my local Ollama API to interface with the models I have downloaded.
Describe alternatives you've considered
I use the oterm TUI at the moment, but it would be nice to have a native GNOME app that can interface with Ollama.
Ollama supports the OpenAI API format, so you can already use it with the local provider option: https://github.com/ollama/ollama/blob/main/docs/openai.md
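For reference, a minimal sketch of what that looks like with the `openai` Python client pointed at Ollama's OpenAI-compatible endpoint. This assumes Ollama is running locally on its default port (11434) and that a model such as `llama3` has already been pulled with `ollama pull llama3`:

```python
from openai import OpenAI

# Ollama exposes an OpenAI-compatible API under /v1 on its default port.
client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # required by the client library, but ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3",  # any model already downloaded locally (assumption)
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

So an app that already supports custom OpenAI-compatible base URLs should work with a local Ollama instance without Ollama-specific code.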