madox2 / vim-ai

AI-powered code assistant for Vim. OpenAI and ChatGPT plugin for Vim and Neovim.

Custom API llama-cpp-python complains about missing OpenAI API Key

oxr463 opened this issue · comments

I followed the instructions here, https://github.com/abetlen/llama-cpp-python/tree/main/docker#open-llama-in-a-box, and I can see it running on port 8000, but when I tried following these steps, https://github.com/madox2/vim-ai/wiki/Custom-APIs#llama-cpp-python, I got the following error:

Missing OpenAI API key
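(As a sanity check, not part of the original report: one way to confirm the llama-cpp-python server itself accepts unauthenticated requests is to call the endpoint directly. The URL below assumes the default docker setup on port 8000; a JSON completion in response, rather than an auth error, would mean the problem is on the plugin-configuration side.)

```shell
# Minimal chat request with no Authorization header.
curl -s http://localhost:8000/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{"messages": [{"role": "user", "content": "hello"}]}'
```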

Hi, a new config option, enable_auth, has been added recently. I believe that setting it to 0 would resolve your issue

That config option is already set in my configuration.

Well, that option is supposed to disable the API key check; if it doesn't, then it's a bug. Could you share your configuration?

It's literally just this:

cat ~/.vimrc
let g:vim_ai_chat = {
\  "options": {
\    "endpoint_url": "http://localhost:8000/v1/chat/completions",
\    "enable_auth": 0,
\  },
\}

I have checked once again and it is working as expected in my environment. Note that your configuration applies only to the AIChat command; the AI and AIEdit commands need to be configured separately. Please also verify that you are using the latest version of the plugin
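(For completeness, a sketch of what configuring the other two commands the same way might look like. This mirrors the chat example above and assumes the same local llama-cpp-python server, which also serves a /v1/completions route; verify the exact option names against the vim-ai wiki.)

```vim
" Hypothetical extension of the ~/.vimrc above: give :AI and :AIEdit
" the same custom endpoint and disable the API key check for them too.
let g:vim_ai_complete = {
\  "options": {
\    "endpoint_url": "http://localhost:8000/v1/completions",
\    "enable_auth": 0,
\  },
\}
let g:vim_ai_edit = {
\  "options": {
\    "endpoint_url": "http://localhost:8000/v1/completions",
\    "enable_auth": 0,
\  },
\}
```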

Do vim_ai_complete and vim_ai_edit work with those options for a custom endpoint too? I got AIChat to work with ollama, but not the other two... (maybe ollama is missing some endpoints, but I can't tell from the docs which ones you require).


If ollama does not support the completions endpoint, you can configure AI and AIEdit to use the chat engine
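(A sketch of that workaround: vim-ai accepts an "engine" key alongside "options", and setting it to "chat" routes the command through the chat completions endpoint. The endpoint URL below is an assumption for a local server; apply the same pattern to g:vim_ai_edit.)

```vim
" Run :AI through the chat endpoint instead of the completions endpoint.
let g:vim_ai_complete = {
\  "engine": "chat",
\  "options": {
\    "endpoint_url": "http://localhost:8000/v1/chat/completions",
\    "enable_auth": 0,
\  },
\}
```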

Closing as this works as intended