huggingface / llm.nvim

LLM powered development for Neovim

`[LLM] Model bigcode/starcoderbase is currently loading`

bogdan-the-great opened this issue · comments

I'm getting `[LLM] Model bigcode/starcoderbase is currently loading` when using bigcode/starcoderbase with this config on lazy.nvim:

"huggingface/llm.nvim",
event = "VeryLazy",
opts = {
    api_token = "<key>",
    {
      tokens_to_clear = { "<|endoftext|>" },
      fim = {
        enabled = true,
        prefix = "<fim_prefix>",
        middle = "<fim_middle>",
        suffix = "<fim_suffix>",
      },
      model = "bigcode/starcoder",
      context_window = 8192,
      tokenizer = {
        repository = "bigcode/starcoder",
      }
    }
}

This can happen when the model has not been queried in a long time. If you really want starcoderbase and can't get it to run on the Inference API, I would suggest hosting the model yourself.
Otherwise, you can switch to bigcode/starcoder, which should give good results regardless of the language.
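For reference, the minimal change is just swapping the model (and tokenizer repository) in the same spec. A sketch of the adjusted lazy.nvim config, assuming the option layout from the llm.nvim README; the FIM special tokens stay the same, since starcoder and starcoderbase use the same tokenizer:

```lua
{
  "huggingface/llm.nvim",
  event = "VeryLazy",
  opts = {
    api_token = "<key>",                  -- Hugging Face API token
    model = "bigcode/starcoder",          -- switched from bigcode/starcoderbase
    tokens_to_clear = { "<|endoftext|>" },
    fim = {                               -- same FIM tokens as starcoderbase
      enabled = true,
      prefix = "<fim_prefix>",
      middle = "<fim_middle>",
      suffix = "<fim_suffix>",
    },
    context_window = 8192,
    tokenizer = {
      repository = "bigcode/starcoder",   -- fetch the tokenizer from the new repo as well
    },
  },
}
```

If you go the self-hosting route instead, recent versions of llm.nvim can also target a locally hosted endpoint (e.g. a text-generation-inference server); check the README for the exact `backend`/`url` fields for your version.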

I changed the model and it works, thanks.

Given all the models, is there a list of the best ones per language?

I haven't curated such a list sadly, but it'd be a good idea to make one!