Add support for Mistral-7B in the Completions API
yixu34 opened this issue
Feature Request
What is the problem you're currently running into?
Add Mistral-7B to LLM Engine.
Why do you want this feature?
https://mistral.ai/news/announcing-mistral-7b/
Describe the solution you'd like
Mistral-7B should appear in the model zoo.
Describe alternatives you've considered
N/A
Additional context
Prioritization
Does this feature block you from using the project?
- Yes
- No
How many users will benefit from this feature?
- Just me
- Few people might benefit
- Many users will love it!
Complexity
- I believe it's a simple feature to implement
- It might require some effort to implement
- It's probably complex and might take significant effort
Thank you for your contribution to llm-engine. Please ensure you've given the feature considerable thought before submitting it. Once your feature request is accepted, and you're interested in building it, please mention it so that the maintainers can guide you!
Just need a vLLM version bump, see vllm-project/vllm#1199
This is now done. Users can send Completion calls to mistral-7b and mistral-7b-instruct. See #307 for the implementation!
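For anyone landing here, a minimal sketch of what a Completions call against the new model might look like. The endpoint URL, query parameter, and payload field names below are assumptions chosen to illustrate the shape of a typical completions request, not the exact llm-engine schema; check the project's client docs for the real interface.

```python
import json
import os
import urllib.request

# Hypothetical model endpoint name; mistral-7b-instruct is one of the
# two models the comment above says are now supported.
MODEL = "mistral-7b-instruct"

# Assumed payload fields (prompt / max_new_tokens / temperature) for
# illustration only; consult the llm-engine docs for the real schema.
payload = {
    "prompt": "Summarize the Mistral-7B announcement in one sentence.",
    "max_new_tokens": 128,
    "temperature": 0.2,
}
body = json.dumps(payload).encode()

# Only send the request when credentials are configured; the base URL
# below is a placeholder, not the real llm-engine endpoint.
api_key = os.environ.get("LLM_ENGINE_API_KEY")
if api_key:
    req = urllib.request.Request(
        "https://api.example.com/v1/llm/completions-sync"
        f"?model_endpoint_name={MODEL}",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))
```

The guard on the API key keeps the sketch runnable offline; swap in the real base URL and field names from the llm-engine documentation before using it.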