There are 31 repositories under the knowledge-distillation topic.
A treasure chest for visual classification and recognition powered by PaddlePaddle
Awesome Knowledge Distillation
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
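For context on the attention-transfer entry above: the loss encourages the student to match the teacher's spatial attention maps, formed from channel-wise squared activations and L2-normalized. A minimal PyTorch sketch with illustrative names (not taken from the repository):

```python
import torch
import torch.nn.functional as F

def attention_map(feat: torch.Tensor) -> torch.Tensor:
    # feat: (B, C, H, W) activations from one layer.
    # Mean of squared channel activations, flattened and L2-normalized per sample.
    am = feat.pow(2).mean(dim=1).flatten(1)  # (B, H*W)
    return F.normalize(am, p=2, dim=1)

def attention_transfer_loss(student_feat: torch.Tensor, teacher_feat: torch.Tensor) -> torch.Tensor:
    # Assumes matching spatial sizes; interpolate one feature map first if they differ.
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```

This term is typically added with a small weight to the student's ordinary cross-entropy loss at several teacher/student layer pairs.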
PyTorch implementation of various Knowledge Distillation (KD) methods.
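Most of the KD collections listed here include, at minimum, the classic soft-target loss of Hinton et al. (2015). A self-contained PyTorch sketch of that baseline, with hyperparameter values chosen purely for illustration:

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T: float = 4.0, alpha: float = 0.9):
    """Soft-target distillation: KL divergence between temperature-softened
    teacher and student distributions, blended with cross-entropy on labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 keeps the soft-target gradient scale comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

The teacher runs in eval mode with gradients disabled; only the student's parameters are updated.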
A complete PyTorch codebase for image classification: training, prediction, TTA, model ensembling, model deployment, CNN feature extraction, classification with SVM or random forest, and model distillation.
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool), which aims to provide unified APIs for network compression techniques such as low-precision quantization, sparsity, pruning, and knowledge distillation across different deep learning frameworks, in pursuit of optimal inference performance.
OpenMMLab Model Compression Toolbox and Benchmark.
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 20 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
Collection of recent methods on (deep) neural network compression and acceleration.
Knowledge distillation papers
Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization
Code and resources on scalable and efficient Graph Neural Networks
The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679
A PyTorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
Official PyTorch code for the CVPR2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419
(MLSys '21) An Acceleration System for Large-scale Unsupervised Heterogeneous Outlier Detection (Anomaly Detection)
PyTorch code for the CVPR2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419
An Extensible (General) Continual Learning Framework based on PyTorch - official codebase of Dark Experience for General Continual Learning
Infrastructures™ for Machine Learning Training/Inference in Production.
2DPASS: 2D Priors Assisted Semantic Segmentation on LiDAR Point Clouds (ECCV 2022) :fire:
Knowledge distillation methods implemented with TensorFlow (there are currently 11 (+1) methods, and more will be added).
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
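The teacher-assistant approach above distills in stages: a large teacher first teaches one or more intermediate-sized assistants, and the final student learns from the last assistant rather than directly from the teacher. A rough sketch of that driver loop; `train_with_kd` is a hypothetical helper that runs ordinary KD training (e.g. with a loss like `kd_loss` above) and returns the trained model:

```python
def distill_with_assistants(teacher, assistants, student, loader, train_with_kd):
    # `assistants` is ordered from largest to smallest intermediate model.
    current_teacher = teacher
    for assistant in assistants:
        current_teacher = train_with_kd(assistant, current_teacher, loader)
    # The final student is distilled from the last (smallest) assistant.
    return train_with_kd(student, current_teacher, loader)
```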
Training & evaluation library for text-based neural re-ranking and dense retrieval models built with PyTorch
A large scale study of Knowledge Distillation.
FasterAI: Prune and Distill your models with FastAI and PyTorch