oobabooga / exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs

Repository from GitHub: https://github.com/oobabooga/exllamav2
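
As a quick orientation, below is a minimal sketch of loading a quantized model and generating text, following the loading-and-generation pattern shown in the exllamav2 README. The model directory path is a hypothetical placeholder, and exact class and method names may vary between library versions.

```python
# Minimal sketch: load a local EXL2-quantized model and generate text.
# Based on the usage pattern in the exllamav2 README; the model directory
# below is a hypothetical placeholder.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

model_dir = "/path/to/exl2-model"  # placeholder: directory with a local EXL2 model

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)     # cache is allocated as the model loads
model.load_autosplit(cache, progress=True)   # split layers across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
output = generator.generate(prompt="Once upon a time,", max_new_tokens=128)
print(output)
```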
