There are 46 repositories under the knowledge-distillation topic.
A treasure chest for visual classification and recognition powered by PaddlePaddle
Awesome Knowledge Distillation
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
"Effective Whole-body Pose Estimation with Two-stages Distillation" (ICCV 2023, CV4Metaverse Workshop)
Collection of AWESOME vision-language models for vision tasks
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
PyTorch implementation of various Knowledge Distillation (KD) methods.
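Most of the KD implementations listed here build on the same core idea from Hinton et al. (2015): train the student on a weighted mix of the teacher's temperature-softened outputs and the ground-truth labels. A minimal sketch of that vanilla KD loss in PyTorch (the function name, temperature, and weighting defaults below are illustrative, not taken from any specific repository):

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Vanilla knowledge distillation loss: a weighted sum of the
    soft-target KL term and the hard-label cross-entropy term.
    T (temperature) and alpha (soft/hard weighting) are illustrative defaults."""
    # Soften both distributions with temperature T.
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    # Scale the KL term by T^2 so its gradient magnitude stays
    # comparable to the hard-label term as T varies.
    soft_loss = F.kl_div(soft_student, soft_teacher,
                         reduction="batchmean") * (T * T)
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

Frameworks in this list differ mainly in what they distill (logits, intermediate features, attention maps) and how the terms are weighted, but this logit-matching objective is the common baseline.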
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
OpenMMLab Model Compression Toolbox and Benchmark.
A complete PyTorch pipeline for image classification: training, prediction, test-time augmentation (TTA), model ensembling, model deployment, CNN feature extraction, classification with SVM or random forest, and model distillation.
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
Efficient computing methods developed by Huawei Noah's Ark Lab
Collection of recent methods on (deep) neural network compression and acceleration.
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
A curated list for Efficient Large Language Models
The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf
knowledge distillation papers
Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization
Code and resources on scalable and efficient Graph Neural Networks
Segmind Distilled diffusion
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
Official PyTorch code for CVPR2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419
[ICCV 2023] MI-GAN: A Simple Baseline for Image Inpainting on Mobile Devices
Infrastructures™ for Machine Learning Training/Inference in Production.
PyTorch code for CVPR2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419