GuangyanZhang / structure_knowledge_distillation

The official model zoo for the paper 'Structured Knowledge Distillation for Semantic Segmentation'. (CVPR 2019 ORAL)


Structured Knowledge Distillation for Semantic Segmentation

This repository contains the source code of our paper, Structured Knowledge Distillation for Semantic Segmentation (accepted for publication in CVPR'19).

Sample results

Demo video for the student net (ESPNet) on CamVid:

After distillation (mIoU 65.1): [image]

Before distillation (mIoU 57.8): [image]

Structure of this repository

This repository is organized as:

  • config: the settings.
  • dataset: the dataloaders for different datasets.
  • network: a model zoo of different segmentation models.
  • utils: APIs for computing the distillation losses and evaluating the results (a loss sketch follows this list).
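
The heart of the method is the set of distillation losses computed in utils: a pixel-wise term that aligns the student's per-pixel class distributions with the teacher's, and a pair-wise term that aligns pairwise similarities between feature-map locations. Below is a minimal PyTorch sketch of these two terms, not the repository's actual API; the function names and the temperature argument are illustrative.

    import torch
    import torch.nn.functional as F

    def pixel_wise_loss(student_logits, teacher_logits, T=1.0):
        """Pixel-wise distillation: KL divergence between the teacher's and
        the student's per-pixel class distributions. Logits are (N, C, H, W)."""
        s = F.log_softmax(student_logits / T, dim=1)
        t = F.softmax(teacher_logits / T, dim=1)
        # Sum the KL term over classes, average over pixels and batch.
        return F.kl_div(s, t, reduction='none').sum(dim=1).mean()

    def pair_wise_loss(student_feat, teacher_feat):
        """Pair-wise distillation: match the (HW x HW) cosine-similarity maps
        of student and teacher feature maps (N, C, H, W). For large inputs,
        features are typically pooled first to keep HW x HW tractable."""
        def similarity(feat):
            n, c = feat.size(0), feat.size(1)
            hw = feat.size(2) * feat.size(3)
            feat = feat.reshape(n, c, hw)
            feat = F.normalize(feat, p=2, dim=1)          # unit norm per location
            return torch.bmm(feat.transpose(1, 2), feat)  # (N, HW, HW)
        return (similarity(student_feat) - similarity(teacher_feat)).pow(2).mean()

The paper also uses a third, holistic term implemented with a conditional GAN over segmentation maps; it is omitted here for brevity.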

Performance on the CamVid dataset

We apply the distillation method to train ESPNet and achieve an mIoU of 65.1 on the CamVid test set. We used the train/val/test splits provided here and trained the models at a resolution of 480x360.

Note: we use 2,000 additional unlabeled images, as described in our paper; see the training-step sketch after the table.

Model                           mIoU
ESPNet_base                     57.8
ESPNet_ours                     61.4
ESPNet_ours + unlabeled data    65.1
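
Because the distillation terms compare the student only to the teacher, they need no ground-truth labels, which is how the extra unlabeled images can enter training. A hedged sketch of one training step under this assumption, reusing pixel_wise_loss from the sketch above (the function name, the loss weight, and the ignore label are illustrative, not the repository's actual code):

    def training_step(student, teacher, images, labels=None, lambda_pi=10.0):
        """Cross-entropy on labeled batches plus pixel-wise distillation on
        every batch; pass labels=None for the unlabeled images."""
        with torch.no_grad():
            teacher_logits = teacher(images)   # the teacher is frozen
        student_logits = student(images)
        loss = lambda_pi * pixel_wise_loss(student_logits, teacher_logits)
        if labels is not None:
            # 255 is assumed here as the ignore label.
            loss = loss + F.cross_entropy(student_logits, labels, ignore_index=255)
        return loss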

Requirements

  • Python 3.5
  • PyTorch 0.4.1
  • ninja
  • numpy
  • cv2 (OpenCV)
  • Pillow

You can also use this docker image. We recommend using Anaconda. We have tested our code on Ubuntu 16.04.

Quick start to evaluate the model

  1. Download the CamVid dataset.
  2. Run the evaluation script:

     python eval_esp.py --method student_esp_d --dataset camvid_light \
         --data_list $PATH_OF_THE_TEST_LIST --data_dir $PATH_OF_THE_TEST_DATA \
         --num_classes 11 --restore-from $PATH_OF_THE_PRETRAIN_MODEL \
         --store-output False
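
If you want to sanity-check the reported numbers, mIoU is conventionally computed from a confusion matrix accumulated over the whole test set (11 classes for CamVid). A minimal numpy sketch, independent of the repository's eval_esp.py; the ignore label of 255 is an assumption:

    import numpy as np

    def update_confusion(conf, pred, label, num_classes=11, ignore_index=255):
        """Accumulate a (num_classes x num_classes) confusion matrix from
        flattened prediction/label arrays; 255 is assumed as the ignore label."""
        mask = label != ignore_index
        idx = num_classes * label[mask].astype(int) + pred[mask].astype(int)
        conf += np.bincount(idx, minlength=num_classes ** 2).reshape(num_classes, num_classes)
        return conf

    def mean_iou(conf):
        """Per-class IoU = TP / (TP + FP + FN), averaged over classes."""
        tp = np.diag(conf)
        iou = tp / (conf.sum(axis=1) + conf.sum(axis=0) - tp + 1e-10)
        return iou.mean()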

Model Zoo

Pretrained models can be found in the checkpoint folder.

Citation

If this code is useful for your research, please cite our paper:

@inproceedings{liu2019structured,
  title={Structured Knowledge Distillation for Semantic Segmentation},
  author={Liu, Yifan and Chen, Ke and Liu, Chris and Qin, Zengchang and Luo, Zhenbo and Wang, Jingdong},
  booktitle={CVPR},
  year={2019}
}

Train script

Coming soon

License

MIT License

