wangcx18 / llm-vscode-inference-server

An endpoint server for efficiently serving quantized open-source LLMs for code.
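For illustration, here is a minimal client sketch of how such an endpoint server might be queried over HTTP. The URL path, payload fields, and response shape are assumptions styled after the Hugging Face Inference API format that the llm-vscode extension typically targets, not the repository's confirmed schema.

```python
# Hypothetical client sketch: the endpoint path ("/api/generate/") and the
# request/response fields ("inputs", "parameters", "generated_text") are
# assumptions; check the repository's README for the actual API schema.
import requests

SERVER_URL = "http://localhost:8000/api/generate/"  # assumed host, port, path

payload = {
    "inputs": "def fibonacci(n):",       # code prompt to complete
    "parameters": {
        "max_new_tokens": 64,            # cap on generated tokens
        "temperature": 0.2,              # low temperature suits code completion
    },
}

response = requests.post(SERVER_URL, json=payload, timeout=30)
response.raise_for_status()

# Assumed response shape: {"generated_text": "..."}
print(response.json().get("generated_text"))
```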
