mit-han-lab / llm-awq

[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration

Repository from GitHub: https://github.com/mit-han-lab/llm-awq
