olimorris / codecompanion.nvim

✨ A Copilot Chat experience in Neovim. Supports Anthropic, Ollama and OpenAI LLMs

[Bug]: ollama adapter not working with new config

cleong14 opened this issue · comments

commented

Your minimal.lua config

local root = vim.fn.fnamemodify("./.repro", ":p")

-- set stdpaths to use .repro
for _, name in ipairs({ "config", "data", "state", "cache" }) do
  vim.env[("XDG_%s_HOME"):format(name:upper())] = root .. "/" .. name
end

-- bootstrap lazy
local lazypath = root .. "/plugins/lazy.nvim"
if not vim.loop.fs_stat(lazypath) then
  vim.fn.system({
    "git",
    "clone",
    "--filter=blob:none",
    "--single-branch",
    "https://github.com/folke/lazy.nvim.git",
    lazypath,
  })
end
vim.opt.runtimepath:prepend(lazypath)

-- install plugins
local plugins = {
  {
    "olimorris/codecompanion.nvim",
    dependencies = {
      { "nvim-treesitter/nvim-treesitter", build = ":TSUpdate"},
      "nvim-lua/plenary.nvim",
      {
        "stevearc/dressing.nvim", -- Optional: Improves the default Neovim UI
        opts = {},
      },
    },
    config = true
  }
}

require("lazy").setup(plugins, {
  root = root .. "/plugins",
})

-- setup treesitter
local ok, treesitter = pcall(require, "nvim-treesitter.configs")
if ok then
  treesitter.setup({
    ensure_installed = "all",
    ignore_install = { "phpdoc" }, -- list of parser which cause issues or crashes
    highlight = { enable = true },
  })
end

Error messages

Error: Error malformed json: Expected value but found T_END at character 1
Error: Error malformed json: Expected value but found invalid token at character 1
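
For what it's worth, "Expected value but found T_END at character 1" is what Lua's cjson-based decoder reports when handed an empty string, so one hypothesis is that no response body is coming back from Ollama at all. A quick sanity check of the endpoint, assuming a default local install listening on port 11434 and the model discussed below:

curl http://localhost:11434/api/chat -d '{"model": "deepseek-coder:6.7b", "messages": [{"role": "user", "content": "hello"}]}'

A healthy server streams back newline-delimited JSON objects; an empty reply or connection error would point at the server rather than the adapter.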

Health check output

codecompanion: require("codecompanion.health").check()

codecompanion.nvim report
- Log file: /Users/chazleong/.config/nvim/.repro/state/nvim/codecompanion.log
- OK nvim-treesitter installed
- OK plenary.nvim installed
- OK dressing.nvim installed
- WARNING edgy.nvim not found
- OK curl installed

Log output

<EMPTY>

Describe the bug

Expected Results

Ollama adapter throws no error and CodeCompanion works as usual.

Actual Results

Ollama adapter throws error and CodeCompanion + Ollama is currently broken.

Reproduce the bug

Steps to Reproduce

  1. Use latest version of codecompanion.nvim
  2. Use minimal.lua config above
  3. Run nvim --clean -u minimal.lua
  4. Run :CodeCompanionChat ollama
  5. Observe error

Final checks

  • I have made sure this issue exists in the latest version of the plugin
  • I have tested with the minimal.lua config file above and still get the issue

commented

@olimorris Apologies, I tried to figure out the issue/fix on my own, but I couldn't quite pinpoint what was wrong with the Ollama adapter. I don't use the OpenAI or Anthropic adapters, so I have no good point of reference for working vs. non-working code.

commented

Hey @cleong14 you might need to reference #9 (comment).

I'd advise subscribing to that issue as I expect the APIs to change quite a lot.

commented

I tried to adopt the new config but the ollama adapter still didn't work for me.

My config looked something like this:

-- default config
require("codecompanion").setup({
  adapters = {
    anthropic = require("codecompanion.adapters").use("anthropic"),
    ollama = require("codecompanion.adapters").use("ollama"),
    openai = require("codecompanion.adapters").use("openai"),
  },
  strategies = {
    chat = "ollama",
    inline = "ollama",
  },
})

Running :CodeCompanionChat ollama brings up the chat buffer, but when I try to save the buffer (e.g. <C-s>) to submit, it errors out.

commented

Just to check: have you run ollama serve, and do you have the deepseek-coder:6.7b model installed (or are you using a different model)?
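
As a quick check on the Ollama side, the installed models can be listed and the model pulled with the standard CLI commands:

ollama list
ollama pull deepseek-coder:6.7b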

commented

Confirmed that I ran ollama serve beforehand (the instance is currently running), and confirmed that I hit the error when using deepseek-coder:6.7b. FWIW, I did try some of the other models I have and they all error too.

Ugh. I just retested everything and the minimal.lua works now, but my configs are breaking for some reason. I'm not sure where, but I guess it's something I'm doing on my end.

One last question before closing this (as it's not a bug): do the new configs affect the ollama adapter and the ability to add additional default parameters?

For context, this is the error I get when I run :CodeCompanionChat ollama with my configs:

E5108: Error executing lua: vim/_editor.lua:0: nvim_exec2()..BufWriteCmd Autocommands for "<buffer=73>": Vim(append):Error executing lua callback: ...codecompanion.nvim/lua/codecompanion/strategies/chat.lua:539: attempt to call method 'set_params' (a nil value)
stack traceback:
	...codecompanion.nvim/lua/codecompanion/strategies/chat.lua:539: in function 'submit'
	...codecompanion.nvim/lua/codecompanion/strategies/chat.lua:201: in function <...codecompanion.nvim/lua/codecompanion/strategies/chat.lua:196>
	[C]: in function 'nvim_exec2'
	vim/_editor.lua: in function 'cmd'
	...im/lazy/codecompanion.nvim/lua/codecompanion/keymaps.lua:9: in function 'rhs'
	...y/codecompanion.nvim/lua/codecompanion/utils/keymaps.lua:43: in function <...y/codecompanion.nvim/lua/codecompanion/utils/keymaps.lua:42>
stack traceback:
	[C]: in function 'nvim_exec2'
	vim/_editor.lua: in function 'cmd'
	...im/lazy/codecompanion.nvim/lua/codecompanion/keymaps.lua:9: in function 'rhs'
	...y/codecompanion.nvim/lua/codecompanion/utils/keymaps.lua:43: in function <...y/codecompanion.nvim/lua/codecompanion/utils/keymaps.lua:42>
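
A hedged reading of that traceback: the chat strategy calls set_params as a method on the configured adapter, so the error suggests the adapter reaching chat.lua is a plain table (e.g. hand-copied from an older config or a fork) rather than an object built by the adapters module, and so carries no methods. A minimal sketch of the difference; the plain table below is illustrative, not the plugin's real adapter shape:

-- a hand-rolled adapter table has no methods attached:
local adapter = { name = "ollama", url = "http://localhost:11434/api/chat" }
adapter:set_params({}) -- errors: attempt to call method 'set_params' (a nil value)

-- whereas one created through the adapters module does:
local ollama_adapter = require("codecompanion.adapters").use("ollama")
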
commented

Your minimal config should be amended to specifically use ollama. I suspect you've done that, but I wanted to be explicit.

local plugins = {
  {
    "olimorris/codecompanion.nvim",
    dependencies = {
      { "nvim-treesitter/nvim-treesitter", build = ":TSUpdate"},
      "nvim-lua/plenary.nvim",
      {
        "stevearc/dressing.nvim", -- Optional: Improves the default Neovim UI
        opts = {},
      },
    },
    config = function()
      require("codecompanion").setup({
        strategies = {
          chat = "ollama",
          inline = "ollama",
        }
      })
    end,
  }
}

One last question before closing this (as it's not a bug): do the new configs affect the ollama adapter and the ability to add additional default parameters?

They shouldn't do... but I haven't tested fully since the update. This may need to be a separate issue.
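
In case it helps, here is a sketch of what extra default parameters might look like under the new config, assuming use() accepts an override table; the schema keys below are illustrative and not confirmed against the plugin:

require("codecompanion").setup({
  adapters = {
    ollama = require("codecompanion.adapters").use("ollama", {
      schema = {
        model = { default = "deepseek-coder:6.7b" }, -- hypothetical override
        temperature = { default = 0.2 }, -- hypothetical override
      },
    }),
  },
  strategies = {
    chat = "ollama",
    inline = "ollama",
  },
})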

commented

Retested with the above minimal.lua and can confirm that it works on my end.

Hmm. Definitely something on my end and related to my fork. The only big difference I can think of is that I'm adding additional default params.

Anyway, I appreciate your help and time in troubleshooting the issue with me. I'll go ahead and close the issue now as it is definitely an issue on my end.

Thanks again, @olimorris !

commented

My pleasure. If the additional default params aren't working as intended, let me know by opening a new issue.

commented

You might have been on to something yesterday. I noticed that the OpenAI and Anthropic adapters weren't working well with custom overrides.

Let me know if you notice anything else.