mistralai / mistral-inference

Official inference library for Mistral models

Home Page: https://mistral.ai/

TinyMistral? small llm for phones and computers with no gpu?

agonzalezm opened this issue · comments

Hi, are there any plans to release a small, well-performing 1B/2B/3B model, like TinyLlama, phi-2, etc.?

Many people want to run open-source LLMs locally for specific tasks but have no GPU. Small models that can run inference fast enough on low-resource hardware (smartphones, machines without a GPU, etc.) are in high demand and can also be fine-tuned for specific tasks.

Thanks