There are 26 repositories under the knowledge-distillation topic.
A treasure chest for visual recognition powered by PaddlePaddle
Awesome Knowledge Distillation
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
PyTorch implementation of various Knowledge Distillation (KD) methods. (A minimal sketch of the classic KD loss that most of these methods build on appears after this list.)
PaddleSlim is an open-source library for deep model compression and architecture search.
A complete PyTorch codebase for image classification, covering training, prediction, TTA, model ensembling, model deployment, CNN feature extraction, classification with SVM or random forests, and model distillation.
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
Collection of recent methods on (deep) neural network compression and acceleration.
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 20 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
OpenMMLab Model Compression Toolbox and Benchmark.
knowledge distillation papers
Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization
A PyTorch knowledge distillation library for benchmarking and extending works in the domains of knowledge distillation, pruning, and quantization.
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool) aims to provide unified APIs for network compression technologies, such as low-precision quantization, sparsity, pruning, and knowledge distillation, across different deep learning frameworks in pursuit of optimal inference performance.
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
Official PyTorch code for the CVPR 2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419
(MLSys '21) An acceleration system for large-scale unsupervised heterogeneous outlier detection (anomaly detection)
PyTorch code for the CVPR 2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419
Infrastructures™ for Machine Learning Training/Inference in Production.
Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added).
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
A large scale study of Knowledge Distillation.
A Knowledge Distillation Toolbox. The official implementation of https://arxiv.org/abs/2203.08679
An extendible (general) continual learning framework based on PyTorch; the official codebase of Dark Experience for General Continual Learning.
Training & evaluation library for text-based neural re-ranking and dense retrieval models built with PyTorch
[CVPR'20] PyTorch code for our paper "Collaborative Distillation for Ultra-Resolution Universal Style Transfer"
FasterAI: Prune and Distill your models with FastAI and PyTorch
Paper Lists, Notes and Slides, Focus on NLP. For summarization, please refer to https://github.com/xcfcode/Summarization-Papers
An object detection knowledge distillation framework powered by PyTorch, currently supporting SSD and YOLOv5.
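Many of the repositories above implement variants of the classic knowledge distillation objective (Hinton et al., 2015): a KL-divergence term between temperature-softened teacher and student logits, mixed with the ordinary cross-entropy on hard labels. The snippet below is a minimal, hedged sketch of that loss in PyTorch; the function name, temperature T, and mixing weight alpha are illustrative choices, not values taken from any specific repository listed here.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Minimal sketch of the classic KD loss (soft KL + hard cross-entropy).

    T and alpha are illustrative defaults, not settings from any repo above.
    """
    # KL divergence between temperature-softened distributions;
    # the T*T factor rescales gradients to the magnitude of the hard loss.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In a typical training loop, the teacher runs in eval mode under torch.no_grad() to produce teacher_logits, and only the student's parameters are updated with this combined loss; feature-based and attention-based methods in the repositories above replace or augment the soft term with losses on intermediate representations.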