Can I use vim-ai chat with local Ollama models?
DantesLin opened this issue
Hi, can I use vim-ai chat with local Ollama models? If so, how do I configure it?
Hi, check my config file here.
I have added instructions for Ollama in the wiki: https://github.com/madox2/vim-ai/wiki/Custom-APIs#ollama
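For reference, a minimal sketch of what such a config can look like, assuming Ollama is running locally on its default port (11434) and exposing its OpenAI-compatible chat endpoint; the model name `llama3` is a placeholder for whatever model you have pulled with `ollama pull`. See the wiki page above for the authoritative instructions.

```vim
" Point vim-ai's chat command at a local Ollama server instead of OpenAI.
" Assumes Ollama's OpenAI-compatible endpoint at localhost:11434.
let g:vim_ai_chat = {
\  "options": {
\    "model": "llama3",
\    "endpoint_url": "http://localhost:11434/v1/chat/completions",
\    "enable_auth": 0,
\  },
\}
```

With `enable_auth` disabled, vim-ai should not require an OpenAI API key, so the plugin talks only to the local server.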