EricLBuehler / candle-vllm

Efficient platform for inference and serving of local LLMs, including an OpenAI-compatible API server.
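Because the server speaks the OpenAI chat-completions protocol, any standard client can talk to it. The sketch below builds a minimal OpenAI-style request body; the endpoint URL, port, and model name are illustrative assumptions, not documented defaults — check the candle-vllm README for the actual serving address and supported model IDs.

```python
import json

# Assumed serving address; candle-vllm's real default host/port may differ.
ENDPOINT = "http://localhost:2000/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "llama-7b") -> str:
    """Serialize an OpenAI-compatible chat-completion request body.

    The model name here is a placeholder; use whichever model the
    server was launched with.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }
    return json.dumps(payload)

body = build_chat_request("Hello!")
print(body)
```

The resulting JSON can be POSTed with any HTTP client (e.g. `curl -H 'Content-Type: application/json' -d "$body" $ENDPOINT`), the same way one would call the hosted OpenAI API.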
