[FEAT] Support Ollama
airtonix opened this issue
Is your feature request related to a problem? Please describe.
I would like to be able to use Ollama instead of OpenAI.
We're unlikely to support Ollama. You can, however, use local LLMs with any inference server that provides an OpenAI-compatible API, such as LocalAI.
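
For example, since the client only needs an OpenAI-compatible endpoint, you can usually point the official OpenAI SDK at the local server's base URL. A minimal sketch, assuming a LocalAI-style server on `http://localhost:8080/v1` and a model named `gpt-4` (both the port and the model name are assumptions; adjust them to your deployment):

```python
from openai import OpenAI

# Point the SDK at the local OpenAI-compatible server instead of api.openai.com.
# The base URL and model name below are assumptions; match them to your setup.
client = OpenAI(
    base_url="http://localhost:8080/v1",
    api_key="sk-no-key-needed",  # many local servers accept any placeholder key
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello from a local LLM!"}],
)
print(response.choices[0].message.content)
```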