bentoml / OpenLLM

Run any open-source LLM, such as Llama 2 or Mistral, as an OpenAI-compatible API endpoint in the cloud.

Home Page: https://bentoml.com


How to modify the port number?

fawpcmhgung162 opened this issue

After running openllm start, the server listens on http://localhost:3000 by default. How can I change the port?
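
A minimal sketch of one possible approach, assuming the openllm start command forwards a --port option to the underlying BentoML server (the model name facebook/opt-1.3b below is only a placeholder example, not from this issue):

    # start the server on port 8080 instead of the default 3000
    openllm start facebook/opt-1.3b --port 8080

If that flag is supported in your OpenLLM version, the endpoint should then be reachable at http://localhost:8080; check openllm start --help to confirm the exact option name for your installed release.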