enricoros / big-AGI

Generative AI suite powered by state-of-the-art models and providing advanced AI/AGI functions. It features AI personas, AGI functions, multi-model chats, text-to-image, voice, response streaming, code highlighting and execution, PDF import, presets for developers, much more. Deploy on-prem or in the cloud.

Home Page: https://big-agi.com


vLLM completions/chat API connectivity [Roadmap]

Playerrrrr opened this issue · comments


Why
I like running local LLMs on a localhost port via vLLM, and it would be awesome to have a UI to interact with them and continue my work as an LLM dev in big-AGI, instead of running my own custom Python scripts. If there is a way to connect them, please enlighten me. Thank you in advance!

Can you explain a bit more? I don't understand the workflow. Have you tried using the generic "OpenAI" model vendor, with the advanced options to set a custom host and port?
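For context on why the generic "OpenAI" vendor can work here: vLLM ships an OpenAI-compatible API server, so a client only needs to point the base URL at the local host and port. Below is a minimal sketch of the request shape such a client would send; the host, port, and model name are assumptions for illustration, not values from this thread.

```python
import json

def build_chat_request(host: str, port: int, model: str, prompt: str):
    """Build the endpoint URL and JSON body for an OpenAI-style chat
    completion request, the format vLLM's OpenAI-compatible server accepts."""
    url = f"http://{host}:{port}/v1/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # streaming, since big-AGI streams responses
    })
    return url, body

# Example values only: port 8000 is vLLM's common default, model is illustrative.
url, body = build_chat_request("localhost", 8000, "my-local-model", "Hello")
```

Sending `body` as a POST to `url` (with `Content-Type: application/json`) is all a UI needs to do, which is why pointing an OpenAI-style vendor at the vLLM host/port should be enough to connect the two.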