NetEase-FuXi/EETQ
Easy and Efficient Quantization for Transformers
Stargazers: 163
Watchers: 6
Issues: 17
Forks: 13
NetEase-FuXi/EETQ Issues
- Repetition with Llama3-70b and EETQ (updated 2 months ago, 1 comment)
- How to handle bfloat16? (closed 7 months ago, 7 comments)
- Does it support Vision Transformers? (updated 2 months ago, 1 comment)
- Integration with Hugging Face transformers library (closed 2 months ago, 2 comments)
- Support CPU quantization (updated 2 months ago, 3 comments)
- License (closed 2 months ago, 1 comment)
- QLoRA with EETQ is quite slow (updated 3 months ago, 3 comments)
- How to dequantize an EETQ model? (closed 3 months ago, 4 comments)
- Quantization takes a very long time (updated 4 months ago, 3 comments)
- Supports H100 (updated 4 months ago, 1 comment)
- Understanding EETQ and 8-bit quantization (closed 7 months ago, 3 comments)
- Question on outlier handling (closed 7 months ago, 1 comment)
- Why does EETQ take up all VRAM (closed 8 months ago, 2 comments)
- Installation error: "ERROR: Could not build wheels for EETQ, which is required to install pyproject.toml-based projects" (closed 10 months ago, 5 comments)