ArchipLab-LinfengZhang / Task-Oriented-Feature-Distillation

This is the implementation of the paper "Task-Oriented Feature Distillation".


Task-Oriented Feature Distillation

This is the implementation of Task-Oriented Feature Distillation, published at NeurIPS 2020.

Experiments on CIFAR-100

Step 1. Install the required packages.

pip install torch torchvision

Step 2. Train a student model.

python distill.py --model=resnet18

You can choose ResNet, SENet, and PreActResNet models as the student; each family is available in five depths: 18, 34, 50, 101, and 152 (resnet18 in the command above selects the 18-layer ResNet). A simplified sketch of the distillation objective follows below.
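For context on what distill.py optimizes: the paper's core idea is to attach auxiliary classifiers to the student's intermediate stages, so that distilled features are supervised by the task itself in addition to being matched to the teacher's features. The snippet below is only a minimal sketch of that objective, not the repository's actual code; the function name tofd_style_loss, its weights, and the assumption that student and teacher features are already projected to matching shapes are hypothetical, and the full method in the paper includes further terms (such as an orthogonality regularizer on the feature projections) that are omitted here.

import torch
import torch.nn.functional as F

def tofd_style_loss(student_feats, teacher_feats, aux_logits, labels,
                    task_weight=1.0, feat_weight=1.0):
    """Simplified task-oriented feature distillation objective (sketch).

    student_feats / teacher_feats: lists of intermediate feature maps,
    assumed already projected to matching shapes. aux_logits: outputs of
    auxiliary classifiers attached to each student stage.
    """
    # Task-oriented term: every distilled stage must stay useful for the
    # classification task through its auxiliary classifier.
    task_loss = sum(F.cross_entropy(logits, labels) for logits in aux_logits)
    # Feature-matching term: pull student features toward the frozen
    # teacher features at each stage.
    feat_loss = sum(F.mse_loss(s, t.detach())
                    for s, t in zip(student_feats, teacher_feats))
    return task_weight * task_loss + feat_weight * feat_loss

# Toy shapes for a CIFAR-style setup: 3 stages, batch of 8, 100 classes.
if __name__ == "__main__":
    feats_s = [torch.randn(8, 64, 16, 16) for _ in range(3)]
    feats_t = [torch.randn(8, 64, 16, 16) for _ in range(3)]
    logits = [torch.randn(8, 100) for _ in range(3)]
    labels = torch.randint(0, 100, (8,))
    print(tofd_style_loss(feats_s, feats_t, logits, labels))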


License: MIT License

