[Question] Is it possible to use Ollama as a backend?
curldivergence opened this issue
Hi, first of all, thank you for an awesome project :)
I have a question, though: the docs say that when the llama.cpp backend is used, the LSP links directly to it. Is it also possible to use an Ollama instance serving on the local network as a backend?
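For reference, here is roughly what I have in mind. Ollama exposes a plain HTTP API (on port 11434 by default), so the LSP would only need to send requests like the sketch below; the LAN host address and model name are placeholders:

```python
# Minimal sketch: querying an Ollama instance over the local network
# via its HTTP generate endpoint. The host and model are placeholders.
import json
import urllib.request

OLLAMA_URL = "http://192.168.1.50:11434/api/generate"  # hypothetical LAN host

payload = {
    "model": "codellama:7b",   # any model already pulled on the Ollama host
    "prompt": "def fibonacci(n):",
    "stream": False,           # ask for a single JSON response, not a stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```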
Thanks!
This sounds like it would be a great addition. I will add Ollama support today. Stay tuned!
Added Ollama here: #11. Pushing out a new release now!