A simple deployment package for running a vLLM inference server on UbiOps