jllllll/exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs