Bavarder

Chit-chat with an AI

Home Page: https://bavarder.codeberg.page

Support Ollama's API for the local provider option

rsolvang opened this issue · comments

Is your feature request related to a problem? Please describe.
No.

Describe the solution you'd like
I would like to connect to my local Ollama API to interface with the models I have downloaded.

Describe alternatives you've considered
I use oterm TUI at the moment, but it would be nice to have a native GNOME app to interface with Ollama.

https://github.com/ollama/ollama/blob/main/docs/openai.md — Ollama supports the OpenAI-compatible API, so you can already use it with the local provider option by pointing it at your Ollama server.
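For reference, the linked doc describes Ollama serving an OpenAI-compatible API at `http://localhost:11434/v1` by default. A minimal sketch of what a chat request against a local Ollama instance looks like, using only the Python standard library — the model name `llama3` is just an example, and the server must be running locally for the request to actually succeed:

```python
import json
import urllib.request

# Default base URL of Ollama's OpenAI-compatible API (see docs/openai.md).
OLLAMA_BASE = "http://localhost:11434/v1"

def build_chat_request(model, messages):
    """Build an OpenAI-style chat completion request for a local Ollama server."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_BASE}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            # Ollama ignores the API key, but OpenAI-style clients expect one.
            "Authorization": "Bearer ollama",
        },
        method="POST",
    )

req = build_chat_request("llama3", [{"role": "user", "content": "Hello!"}])
print(req.full_url)

# To actually send it (requires a running Ollama instance):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Since the request shape matches OpenAI's, an app that already talks to OpenAI-compatible providers should only need a configurable base URL to work with Ollama.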