Feature Request: Ollama Support
orkutmuratyilmaz opened this issue
Hello and thanks for this beautiful repository,
Do you have plans to support open source LLMs with the tools like Ollama?
Best,
Orkut
We are considering integrating more open-source models into OS-Copilot, and Ollama is a great tool that can help us achieve this goal quickly. Thank you for your suggestion!
Is Ollama support planned at some point? Or is this just that it's a good idea in general? Would you accept a PR?
Hi @jdonaldson A PR is definitely appreciated! You can join the Discord and create a thread to discuss the PR.
At the moment, we have only managed to support llama3 through Ollama, in a separate manner (i.e., using Ollama to serve the model in the backend). See #22
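For anyone looking to try this backend setup themselves, here is a minimal sketch of querying a llama3 model served locally by Ollama. It uses Ollama's documented `/api/generate` HTTP endpoint at its default address (`localhost:11434`); this is an illustration of the general approach, not OS-Copilot's actual integration code:

```python
# Sketch: call a llama3 model served by a local Ollama instance.
# Assumes Ollama is running at its default endpoint and that
# `ollama pull llama3` has been done beforehand.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default generate endpoint


def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a non-streaming generate request for the Ollama HTTP API."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def generate(prompt: str, model: str = "llama3") -> str:
    """Send the request and return the model's response text."""
    with urllib.request.urlopen(build_request(prompt, model)) as resp:
        return json.load(resp)["response"]


# Example usage (requires a running Ollama server):
#   print(generate("Say hello in one word."))
```

Because Ollama also exposes an OpenAI-compatible endpoint, a similar effect can often be achieved by pointing an existing OpenAI client at the local server, which is one reason it is attractive for quickly supporting open-source models.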