There are 55 repositories under the knowledge-distillation topic.
A treasure chest for visual classification and recognition powered by PaddlePaddle
Awesome Knowledge Distillation
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Collection of AWESOME vision-language models for vision tasks
"Effective Whole-body Pose Estimation with Two-stages Distillation" (ICCV 2023, CV4Metaverse Workshop)
SOTA low-bit LLM quantization (INT8/FP8/MXFP8/INT4/MXFP4/NVFP4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
A curated list for Efficient Large Language Models
PyTorch implementation of various Knowledge Distillation (KD) methods (a minimal soft-target KD loss sketch appears after this list).
OpenMMLab Model Compression Toolbox and Benchmark.
A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆 26 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
Improving Convolutional Networks via Attention Transfer (ICLR 2017); a sketch of the attention-transfer loss follows this list.
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
A complete PyTorch image-classification codebase: training, prediction, TTA, model ensembling, model deployment, CNN feature extraction, classification with SVM or random forest, and model distillation.
AI Powered Knowledge Graph Generator
Efficient computing methods developed by Huawei Noah's Ark Lab
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
Collection of recent methods on (deep) neural network compression and acceleration.
The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
knowledge distillation papers
An Extensible (General) Continual Learning Framework based on PyTorch - official codebase of Dark Experience for General Continual Learning
A PyTorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Segmind Distilled diffusion
[ICCV 2023] MI-GAN: A Simple Baseline for Image Inpainting on Mobile Devices
Knowledge Distillation: CVPR 2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization
Code and resources on scalable and efficient Graph Neural Networks (TNNLS 2023)
2DPASS: 2D Priors Assisted Semantic Segmentation on LiDAR Point Clouds (ECCV 2022) :fire:
Infrastructures™ for Machine Learning Training/Inference in Production.
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
Official PyTorch code for the CVPR 2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419
(MLSys '21) An Acceleration System for Large-scale Unsupervised Heterogeneous Outlier Detection (Anomaly Detection)
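Many of the repositories above implement variants of the classic soft-target distillation objective. As a point of reference, here is a minimal sketch of that loss (Hinton-style KD): the student is trained on a weighted sum of cross-entropy against hard labels and a temperature-softened KL term against the teacher's logits. Names, the temperature `T`, and the mixing weight `alpha` are illustrative assumptions, not taken from any specific repository above.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Minimal soft-target knowledge distillation loss (sketch).

    Combines cross-entropy on hard labels with a KL term that pushes the
    student's temperature-softened distribution toward the teacher's.
    The T**2 factor keeps soft-target gradients on a scale comparable
    to the hard-label term. T and alpha are assumed hyperparameters.
    """
    ce = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * soft + (1.0 - alpha) * ce
```

In practice the teacher's logits are computed under `torch.no_grad()` so that only the student receives gradients.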
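For the attention-transfer entry (ICLR 2017), the idea is to match spatial attention maps derived from intermediate feature maps of teacher and student. The sketch below assumes both feature tensors have shape (N, C, H, W) with matching spatial sizes; it is an illustrative implementation of the activation-based attention loss, not code from the listed repository.

```python
import torch
import torch.nn.functional as F

def attention_map(feat):
    """Collapse channels into a spatial attention map and L2-normalize it.

    feat: (N, C, H, W) activation tensor; returns (N, H*W).
    """
    am = feat.pow(2).mean(dim=1).flatten(1)  # channel-wise squared mean
    return F.normalize(am, p=2, dim=1)

def at_loss(student_feat, teacher_feat):
    """Mean squared distance between normalized attention maps (assumes
    matching spatial resolutions between student and teacher features)."""
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```

This term is typically summed over several teacher/student layer pairs and added to the ordinary classification loss with a small weighting coefficient.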