Support ollama model API
acwwt opened this issue
I hope you can support models running locally via ollama, so that we can run everything on our own machines. A sketch of what calling a local Ollama server could look like is below.
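For context: Ollama exposes an OpenAI-compatible HTTP API on `http://localhost:11434/v1`, so in principle any OpenAI-style client can already talk to a local model. A minimal sketch (the model name `llama3` is just an example; use whatever you have pulled locally):

```python
# Minimal sketch: calling a local Ollama server through its
# OpenAI-compatible endpoint. Assumes `ollama serve` is running
# and a model (here "llama3", an example name) has been pulled.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    api_key="ollama",  # the client requires a key, but Ollama ignores it
)

response = client.chat.completions.create(
    model="llama3",  # example model name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Supporting this in the project might be as simple as letting users override the API base URL.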
Thanks for the suggestion! We are planning to support local models via llama.cpp; please stay tuned!
If I don't have a +86 phone number, I can't join https://www.modelscope.cn/home (something like a Discord group), right?