There are 27 repositories under the distillation topic.
Awesome Knowledge Distillation
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014-2021).
A PyTorch-based knowledge distillation toolkit for natural language processing
PaddleSlim is an open-source library for deep model compression and architecture search.
PyTorch implementation of various Knowledge Distillation (KD) methods.
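For context, most methods in toolkits like this build on the classic soft-target objective of Hinton et al. (2015). Below is a minimal PyTorch sketch of that loss; the temperature `T` and weight `alpha` are illustrative defaults, not values taken from any of the listed repos.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    # Soft-target term: KL divergence between the softened teacher and
    # student distributions, scaled by T^2 as in Hinton et al. (2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard
```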
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [TPAMI'23] "ViTPose++: Vision Transformer for Generic Body Pose Estimation"
A collection of classic and cutting-edge industry papers in the fields of recommendation, advertising, and search.
Pruning and distillation for mobilev2-yolov5s, with support for ncnn and TensorRT deployment. Ultra-light but with better performance!
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and specialized similarity models.
Segmind Distilled diffusion
irresponsible innovation. Try now at https://chat.dev/
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
Papers and books to look at when starting AGI 📚
Insightface Keras implementation
A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).
Knowledge distillation in text classification with PyTorch. Chinese text classification; teacher models: BERT and XLNet; student model: BiLSTM.
Yolov5 distillation training | YOLOv5 knowledge distillation training; supports training on your own data.
A Compressed Stable Diffusion for Efficient Text-to-Image Generation [ICCV'23 Demo] [ICML'23 Workshop]
[ECCV 2022] R2L: Distilling Neural Radiance Field to Neural Light Field for Efficient Novel View Synthesis
Distillation of KoBERT from SKTBrain (Lightweight KoBERT)
The Biorefinery Simulation and Techno-Economic Analysis Modules; Life Cycle Assessment; Chemical Process Simulation Under Uncertainty
(ICCV 2021, Oral) RL and distillation in CARLA using a factorized world model
Filter Grafting for Deep Neural Networks (CVPR 2020)
This repository contains implementations of three adversarial example attack methods (FGSM, I-FGSM, MI-FGSM) and of defensive distillation as a defense against all of them, using the MNIST dataset.
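As a rough illustration of the simplest of those attacks, here is a minimal FGSM sketch in PyTorch; the `epsilon` value and the [0, 1] clamp are assumptions for normalized image inputs, not details taken from the repo.

```python
import torch

def fgsm_attack(model, loss_fn, images, labels, epsilon=0.1):
    # Fast Gradient Sign Method: perturb the input in the direction of the
    # sign of the loss gradient with respect to the input.
    images = images.clone().detach().requires_grad_(True)
    loss = loss_fn(model(images), labels)
    loss.backward()
    adv = images + epsilon * images.grad.sign()
    # Assumes inputs are normalized to [0, 1]; clamp to stay in the valid range.
    return adv.clamp(0.0, 1.0).detach()
```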
PyTorch implementation of the ACCV18 paper "Revisiting Distillation and Incremental Classifier Learning."
A list of papers, docs, and code about efficient AIGC. This repo aims to provide information for efficient AIGC research, covering both language and vision, and is continuously being improved. PRs adding works (papers, repositories) that the repo has missed are welcome.