vLLM completions/chat API connectivity [Roadmap]
Playerrrrr opened this issue
Why
I like running local LLMs on a localhost port via vLLM, and it would be awesome to have a UI to interact with them in big-AGI and continue my work as an LLM dev, instead of running my own custom Python scripts. If there is a way to connect them, please enlighten me. Thank you in advance!
Can you explain the workflow a bit more? I don't fully understand it yet. Have you tried using the generic "OpenAI" model vendor, with the advanced options to set the host and port?
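For reference, vLLM ships an OpenAI-compatible API server, so the generic "OpenAI" vendor route should work once the endpoint is pointed at the local host/port. A minimal sketch (the model name and port are just examples, substitute your own):

```shell
# Start vLLM's OpenAI-compatible server on localhost:8000
# (example model; replace with whatever you run locally)
python -m vllm.entrypoints.openai.api_server \
  --model mistralai/Mistral-7B-Instruct-v0.2 \
  --host 127.0.0.1 --port 8000

# Then point any OpenAI-style client (e.g. big-AGI's "OpenAI" vendor
# with a custom host in the advanced options) at http://127.0.0.1:8000/v1
# and sanity-check the endpoint with curl:
curl http://127.0.0.1:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "mistralai/Mistral-7B-Instruct-v0.2",
       "messages": [{"role": "user", "content": "Hello"}]}'
```

If the curl call returns a chat completion, big-AGI should be able to talk to the same endpoint.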