Knowledge_Distillation_Papers

This is a collection of knowledge distillation papers.

CVPR2023

  1. DaFKD: Domain-aware Federated Knowledge Distillation
  2. Generalization Matters: Loss Minima Flattening via Parameter Hybridization for Efficient Online Knowledge Distillation
  3. DisWOT: Student Architecture Search for Distillation WithOut Training
  4. Generic-to-Specific Distillation of Masked Autoencoders
  5. Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
  6. Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation

Big Model

  1. Lion: Adversarial Distillation of Closed-Source Large Language Model
  2. Distilling Step-by-Step! Outperforming Larger Language Models with Less Training Data and Smaller Model Sizes

Diffusion Model

  1. On Distillation of Guided Diffusion Models

Others

  1. VanillaKD: Revisit the Power of Vanilla Knowledge Distillation from Small Scale to Large Scale
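For readers new to the topic, below is a minimal sketch of the "vanilla" knowledge distillation loss (Hinton-style soft targets plus hard labels) that papers such as VanillaKD revisit. It is a generic illustration, not code from any listed paper; the function name and the default values for `temperature` and `alpha` are arbitrary choices for the example.

```python
import torch
import torch.nn.functional as F

def vanilla_kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.9):
    """Illustrative Hinton-style KD loss: soft-target KL term + hard-label CE term."""
    # Soft-target term: KL divergence between temperature-softened teacher and
    # student distributions, scaled by T^2 to keep gradient magnitudes comparable.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard-label term: ordinary cross-entropy with the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```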
