Installing llm-mistral but not setting an API key breaks llm models list
recollir opened this issue · comments
Installing llm-mistral without setting an API key for it breaks llm models list. I'm not sure if this is due to how llm-mistral works, or if it should be reported to the llm repository instead. A workaround could be to ignore Mistral when no API key is set and just issue a warning when listing models.
❯ llm keys
No keys found
❯ llm plugins
[]
❯ llm models
OpenAI Chat: gpt-3.5-turbo (aliases: 3.5, chatgpt)
OpenAI Chat: gpt-3.5-turbo-16k (aliases: chatgpt-16k, 3.5-16k)
OpenAI Chat: gpt-4 (aliases: 4, gpt4)
OpenAI Chat: gpt-4-32k (aliases: 4-32k)
OpenAI Chat: gpt-4-1106-preview
OpenAI Chat: gpt-4-0125-preview
OpenAI Chat: gpt-4-turbo-preview (aliases: gpt-4-turbo, 4-turbo, 4t)
OpenAI Completion: gpt-3.5-turbo-instruct (aliases: 3.5-instruct, chatgpt-instruct)
❯ llm install llm-mistral
Collecting llm-mistral
...
...
Using cached llm_mistral-0.3-py3-none-any.whl (9.5 kB)
Installing collected packages: llm-mistral
Successfully installed llm-mistral-0.3
❯ llm models
Error: You must set the 'mistral' key or the LLM_MISTRAL_KEY environment variable.
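The workaround suggested above (warn and skip instead of erroring) could look something like this minimal sketch. The function and model names here are assumptions for illustration, not the plugin's actual API:

```python
import os
import warnings

# Hypothetical fallback list; the real plugin fetches and caches
# the model list from the Mistral API.
DEFAULT_MODELS = ["mistral-tiny", "mistral-small", "mistral-medium"]


def models_to_register(stored_key=None):
    """Return Mistral model IDs to register.

    If no API key is configured (neither stored nor in the
    LLM_MISTRAL_KEY environment variable), warn and fall back to a
    default list instead of raising, so `llm models` keeps working.
    """
    key = stored_key or os.environ.get("LLM_MISTRAL_KEY")
    if not key:
        warnings.warn(
            "No 'mistral' key set; listing default Mistral models only"
        )
        return DEFAULT_MODELS
    # With a key available, the cached model list could be refreshed
    # from the API here; we just return the defaults in this sketch.
    return DEFAULT_MODELS
```

With this guard in place, `llm models` would still list the Mistral models (with a warning) rather than aborting with an error.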
Thanks, yeah this is a bug.

I tested this manually by deleting ~/Library/Application\ Support/io.datasette.llm/mistral_models.json and then removing my mistral key from the keys.json file.
🙏 Thank you.