RepDistiller

Annotations for the RepDistiller knowledge distillation library


Included in PytorchNetHub

Notes

  • 2021.2: added annotations for KD and FSP (a KD loss sketch follows this list)
  • Annotations only, intended for easy porting. To run training or view distillation benchmarks, please visit the official repository.
  • Source entry point: train_student.py
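For orientation, below is a minimal sketch of the vanilla KD loss (Hinton et al.), written independently of this repository's code; the function name and the temperature value are illustrative assumptions.

```python
import torch.nn.functional as F

def kd_loss(logits_s, logits_t, T=4.0):
    """KL divergence between temperature-softened student and teacher distributions."""
    log_p_s = F.log_softmax(logits_s / T, dim=1)  # student log-probabilities
    p_t = F.softmax(logits_t / T, dim=1)          # teacher probabilities
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (T * T)
```

In practice this term is combined with the ordinary cross-entropy loss on the ground-truth labels, weighted by a hyperparameter.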
| Method | Paper | Notes | Annotated |
| --- | --- | --- | --- |
| KD | Distilling the Knowledge in a Neural Network | | ✓ |
| FitNet | FitNets: Hints for Thin Deep Nets | | |
| AT | Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer | | |
| SP | Similarity-Preserving Knowledge Distillation | | |
| CC | Correlation Congruence for Knowledge Distillation | | |
| VID | Variational Information Distillation for Knowledge Transfer | | |
| RKD | Relational Knowledge Distillation | | |
| PKT | Probabilistic Knowledge Transfer for Deep Representation Learning | | |
| AB | Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons | | |
| FT | Paraphrasing Complex Network: Network Compression via Factor Transfer | | |
| FSP | A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning | The corresponding intermediate features of the teacher and student networks must have the same number of channels (see the FSP sketch below) | ✓ |
| NST | Like What You Like: Knowledge Distill via Neuron Selectivity Transfer | | |
| CRD | Contrastive Representation Distillation | ICLR 2020 | |
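As a companion to the FSP note above, here is a minimal sketch of the FSP (flow of solution procedure) matrix and its matching loss; the tensor and function names are illustrative assumptions, not taken from this repository's code.

```python
import torch
import torch.nn.functional as F

def fsp_matrix(feat_a, feat_b):
    """Channel-wise inner product of two feature maps from one network; result shape (N, C_a, C_b)."""
    # Align spatial sizes before taking the inner product.
    if feat_a.shape[2:] != feat_b.shape[2:]:
        feat_b = F.adaptive_avg_pool2d(feat_b, feat_a.shape[2:])
    n, c_a, h, w = feat_a.shape
    a = feat_a.reshape(n, c_a, h * w)
    b = feat_b.reshape(n, feat_b.shape[1], h * w)
    return torch.bmm(a, b.transpose(1, 2)) / (h * w)

def fsp_loss(fsp_t, fsp_s):
    """MSE between the teacher's and the student's FSP matrices."""
    return F.mse_loss(fsp_s, fsp_t)
```

Because the loss compares the two FSP matrices element-wise, the teacher's and the student's matrices must have identical shapes, which is why the corresponding intermediate features need matching channel counts.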

References

RepDistiller (the official repository)

License: BSD 2-Clause "Simplified" License

