Support local LLM
sandangel opened this issue
Is it possible to add support for local LLMs, either through an OpenAI-API-compatible server or Ollama?
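For context, Ollama already exposes an OpenAI-compatible endpoint on its default port, so in principle this mostly comes down to making the base URL configurable. A minimal sketch of what that looks like from a client's point of view (Python with the official `openai` package; the model name `llama3` is just an example of a model pulled locally):

```python
# Minimal sketch: talking to a local Ollama server through its
# OpenAI-compatible endpoint. Assumes Ollama is running on its
# default port (11434) and that a model (here "llama3", as an
# example) has already been pulled with `ollama pull llama3`.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    api_key="ollama",  # required by the client library but ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Hello from a local model!"}],
)
print(response.choices[0].message.content)
```

The same request shape works against any other OpenAI-compatible server, so supporting a configurable base URL would cover both options.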
I'll look into this. I'm in the middle of a large rewrite of how we handle chat. I know the VS Code team has been exploring this as a possibility, so it does seem like something we could support (someday; sorry, no timelines at the moment).
That is great to hear. I really love it, thank you so much. I also watch you on YouTube sometimes; thanks for making great content about Neovim. :)