bentoml / OpenLLM

Run any open-source LLM, such as Llama 2 or Mistral, as an OpenAI-compatible API endpoint in the cloud.

Home Page: https://bentoml.com

How to update the prompt template without changing the openllm-core config

hahmad2008 opened this issue · comments

Describe the bug

I need to change the system message of the prompt template for one of the models in OpenLLM. I see that there is a doc to follow: https://github.com/bentoml/OpenLLM/blob/main/ADDING_NEW_MODEL.md

However, I need to update it via the openllm command, since I installed OpenLLM with pip install rather than from a local repo.

Could you please help me, or let me know how to install OpenLLM from a local repo?

Thanks
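For the second part of the question, a minimal sketch of installing from a local clone with pip's editable mode (a standard pip workflow, not OpenLLM-specific; the exact package directory within the repo is an assumption, so check the repo's contributing docs if the root install fails):

```shell
# Clone the repo and install it in editable mode, so local edits
# (e.g. to a model's prompt template in openllm-core) take effect
# without reinstalling.
git clone https://github.com/bentoml/OpenLLM.git
cd OpenLLM
pip install -e .
```

After this, edits to the cloned source, such as the prompt-template changes described in ADDING_NEW_MODEL.md, are picked up by the installed `openllm` command directly.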

To reproduce

No response

Logs

No response

Environment

System information (Optional)

No response