TinyKD: Exploring Effective Knowledge Distillation for Tiny Object Detection

Environment

conda create -n tinykd python=3.7 -y
conda activate tinykd

conda install pytorch==1.10.0 torchvision torchaudio cudatoolkit=11.1 -c pytorch -c conda-forge -y
pip install mmcv-full==1.5.0 -f https://download.openmmlab.com/mmcv/dist/cu111/torch1.10.0/index.html

cd TinyKD/
pip install -r requirements.txt
pip install -v -e .
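The install steps above can be sanity-checked with a short snippet (a sketch, not part of the repo; the `check_env` helper and the package list are assumptions based on the install commands, with mmdet assumed to come from `pip install -v -e .`):

```shell
# Hypothetical sanity check: confirm the packages installed above are
# importable and print their versions; reports MISSING for any that are not.
check_env() {
  for pkg in torch torchvision mmcv mmdet; do
    python3 -c "import ${pkg}; print('${pkg}', ${pkg}.__version__)" 2>/dev/null \
      || echo "MISSING: ${pkg}"
  done
}
check_env
```

Each of the four packages prints either its version or a MISSING line, so a clean run shows four version lines.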

Training

  • TinyPerson (batch size: 4)
# teacher
bash tools/dist_train.sh configs_tiny/tinyperson/faster_rcnn_hr48_fpn_1x.py 2
# student
bash tools/dist_train.sh configs_tiny/tinyperson/faster_rcnn_r50_fpn_1x.py 2
# student + TinyKD
bash tools/dist_train.sh configs_tiny/tinyperson_kd/faster_rcnn_r50_fpn_1x_tea_hr48_kd.py 2
  • AI-TOD (batch size: 2)
# teacher
bash tools/dist_train.sh configs_tiny/aitod/aitod_faster_rcnn_hr48_1x.py 1
# student
bash tools/dist_train.sh configs_tiny/aitod/aitod_faster_rcnn_r50_1x.py 1
# student + TinyKD
bash tools/dist_train.sh configs_tiny/aitod_kd/aitod_faster_r50_1x_tea_hr48_kd.py 1
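After training, a checkpoint can presumably be evaluated with the standard MMDetection-style tools/dist_test.sh script. This is a sketch under assumptions: the script name, the work_dirs checkpoint path, and the --eval flag are taken from MMDetection conventions, not confirmed for this repo; the echo makes it a dry run.

```shell
# Evaluation sketch (paths are hypothetical; adjust to your run).
CONFIG=configs_tiny/tinyperson_kd/faster_rcnn_r50_fpn_1x_tea_hr48_kd.py
CHECKPOINT=work_dirs/faster_rcnn_r50_fpn_1x_tea_hr48_kd/latest.pth
GPUS=2
# Dry run: prints the command; drop the echo to actually evaluate.
echo bash tools/dist_test.sh ${CONFIG} ${CHECKPOINT} ${GPUS} --eval bbox
```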

Results and models

(TODO) Pre-trained models will be released soon.

Citation

If you find this code useful in your research, please consider citing:

@inproceedings{liu2023tinykd,
  title={Exploring Effective Knowledge Distillation for Tiny Object Detection},
  author={Liu, Haotian and Liu, Qing and Liu, Yang and Liang, Yixiong and Zhao, Guoying},
  booktitle={IEEE International Conference on Image Processing (ICIP)},
  pages={770--774},
  year={2023},
  organization={IEEE}
}

License

This project is released under the MIT License.
