Local AI providers
Azarattum opened this issue · comments
It would be nice if GitLens's AI features could integrate with LLMs running locally, for example via Ollama. Not everybody can use the cloud, for one reason or another.
That would definitely be a key feature. Lots of companies do not want to send code off to a cloud LLM. I use Twinny in VS Code to interface with a local Ollama instance.
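For context, Ollama exposes a local HTTP API (by default on `http://localhost:11434`) that an extension could call instead of a cloud endpoint. A minimal sketch in Python, assuming a locally running Ollama server; the model name and prompt are purely illustrative:

```python
import json
import urllib.request

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    # stream=False asks for a single JSON response instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}

def generate_local(payload: dict, host: str = "http://localhost:11434") -> str:
    """Send the payload to a locally running Ollama server and return its text."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Illustrative payload; "codellama" and the prompt text are assumptions.
payload = build_request("codellama", "Summarize this diff:\ndiff --git ...")
# generate_local(payload)  # requires Ollama running locally
```

No code ever leaves the machine: the request goes to localhost, which is exactly the appeal for companies that cannot send source to a cloud LLM.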