`[LLM] Model bigcode/starcoderbase is currently loading`
bogdan-the-great opened this issue · comments
bogdan-the-great commented
I'm getting `[LLM] Model bigcode/starcoderbase is currently loading`
when using `bigcode/starcoderbase`
with this config on lazy.nvim:
"huggingface/llm.nvim",
event = "VeryLazy",
opts = {
api_token = "<key>",
{
tokens_to_clear = { "<|endoftext|>" },
fim = {
enabled = true,
prefix = "<fim_prefix>",
middle = "<fim_middle>",
suffix = "<fim_suffix>",
},
model = "bigcode/starcoder",
context_window = 8192,
tokenizer = {
repository = "bigcode/starcoder",
}
}
}
Luc Georges commented
This can happen when the model has not been queried in a long time. If you really want starcoderbase
and can't get it to run on the Inference API, then I would suggest hosting the model yourself.
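For reference, the Inference API reports this "currently loading" state while it warms the model up. A hedged sketch (the endpoint and the `wait_for_model` option are from Hugging Face's Inference API documentation; `$HF_TOKEN` is a placeholder for your token) that asks the API to block until the model is loaded instead of returning the loading message:

```shell
# Query the Inference API directly and wait for the model to finish
# loading rather than getting the "currently loading" response back.
curl https://api-inference.huggingface.co/models/bigcode/starcoderbase \
  -H "Authorization: Bearer $HF_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"inputs": "def fibonacci(", "options": {"wait_for_model": true}}'
```

The first such request can take a while, since it only returns once the model is warm.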
Otherwise, you can switch to `bigcode/starcoder`,
which should give good results regardless of the language.
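If you do go the self-hosting route, a minimal sketch of pointing the plugin at a local text-generation-inference server — note that the `backend` and `url` options come from newer llm.nvim versions and may not exist in the version you have installed:

```lua
{
  "huggingface/llm.nvim",
  event = "VeryLazy",
  opts = {
    -- Assumes a text-generation-inference server serving starcoderbase
    -- is already running locally; adjust the URL to your deployment.
    backend = "tgi",               -- assumption: newer llm.nvim option
    url = "http://localhost:8080", -- your self-hosted endpoint
    model = "bigcode/starcoderbase",
    tokens_to_clear = { "<|endoftext|>" },
    fim = {
      enabled = true,
      prefix = "<fim_prefix>",
      middle = "<fim_middle>",
      suffix = "<fim_suffix>",
    },
    context_window = 8192,
    tokenizer = {
      repository = "bigcode/starcoderbase",
    },
  },
}
```

With a self-hosted server there is no cold-start queue, so the "currently loading" message goes away entirely.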
bogdan-the-great commented
I changed the model and it works, thanks.
Andres Monge commented
Given all the models, is there a list of the best ones per language?
Luc Georges commented
Sadly I haven't curated such a list, but it would be a good idea to make one!