skepsl / DistillationTraining

Knowledge Distillation for training a Multi-Exit Model


DistillationTraining

We use knowledge distillation to train a Multi-Exit (ME) ResNet50 model. Our ME model has four early-exit gates, one after each residual block. With this scenario, we reach 82.5%, 85%, 89%, and 92% accuracy at gates 1 through 4, respectively. To reproduce the results, run the main.py script. A sketch of the training setup is shown below.
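The following is a minimal sketch of what distillation training for a multi-exit ResNet50 can look like. The class and function names (`MultiExitResNet50`, `distillation_loss`), the hyperparameters (`temperature`, `alpha`), and the exit-head design are illustrative assumptions, not the exact implementation in main.py.

```python
# Hypothetical sketch of distillation training for a multi-exit ResNet50.
# Names and hyperparameters are assumptions, not the repository's API.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet50


class MultiExitResNet50(nn.Module):
    """ResNet50 with an early-exit classifier after each residual stage."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        backbone = resnet50(weights=None)
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1,
                                  backbone.relu, backbone.maxpool)
        self.stages = nn.ModuleList([backbone.layer1, backbone.layer2,
                                     backbone.layer3, backbone.layer4])
        # One exit head per residual stage (gates 1-4); output channels of
        # ResNet50 stages are 256, 512, 1024, 2048.
        channels = [256, 512, 1024, 2048]
        self.exit_heads = nn.ModuleList([
            nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                          nn.Linear(c, num_classes))
            for c in channels
        ])

    def forward(self, x):
        x = self.stem(x)
        logits = []
        for stage, head in zip(self.stages, self.exit_heads):
            x = stage(x)
            logits.append(head(x))
        return logits  # [gate1, gate2, gate3, gate4 (final exit)]


def distillation_loss(all_logits, targets, temperature=3.0, alpha=0.5):
    """Cross-entropy at every exit plus temperature-scaled KL distillation
    from the deepest exit (teacher) to each earlier exit (student)."""
    teacher = all_logits[-1].detach()
    loss = sum(F.cross_entropy(logits, targets) for logits in all_logits)
    for logits in all_logits[:-1]:
        kd = F.kl_div(F.log_softmax(logits / temperature, dim=1),
                      F.softmax(teacher / temperature, dim=1),
                      reduction="batchmean") * temperature ** 2
        loss = loss + alpha * kd
    return loss
```

In a training loop, `distillation_loss(model(images), labels)` would replace a plain cross-entropy loss; the deepest exit acts as an in-network teacher so the earlier gates learn from its softened predictions as well as from the ground-truth labels.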


Reference:

Mary Phuong and Christoph Lampert, "Distillation-Based Training for Multi-Exit Architectures," ICCV 2019.


