Wang, Chang's repositories
bigcode-evaluation-harness
A framework for the evaluation of autoregressive code generation language models.
CS-Awesome-Courses
A curated list of excellent computer science courses
CUDA-Programming-Guide-in-Chinese
This is a Chinese translation of the CUDA programming guide
intel-extension-for-transformers
Extending Hugging Face transformers APIs for Transformer-based models and improving the productivity of inference deployment. With extremely compressed models, the toolkit can greatly improve inference efficiency on Intel platforms.
lm-evaluation-harness
A framework for few-shot evaluation of autoregressive language models.
optimum
🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools
optimum-intel
Accelerate inference of 🤗 Transformers with Intel optimization tools
transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch, TensorFlow, and JAX.
lpot
Intel® Low Precision Optimization Tool, which provides a unified low-precision inference interface across different deep learning frameworks and supports auto-tuning against a specified accuracy criterion to find the best quantized model.