rexnxiaobai / margindistillation

MarginDistillation: distillation for margin-based softmax


This repository contains implementations of the distillation methods compared in the paper. Using the code from this repository, you can train a lightweight face recognition network suitable for embedded devices. The repository contains the code for the methods listed under Training below.

Data preparation

  1. Download a dataset from the InsightFace Dataset Zoo: https://github.com/deepinsight/insightface/wiki/Dataset-Zoo
  2. Extract images using: data_prepare/bin_get_images.ipynb
  3. Save embedding vectors from the ResNet100 teacher using: data_prepare/save_embedings.ipynb
  4. Prepare a list for conversion to a .bin file using: data_prepare/save_lst.ipynb
  5. Convert to a .bin file using: insightface/blob/master/src/data/dir2rec.py
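The steps above pair the training images with precomputed teacher embeddings. As a minimal sketch of step 3 (NumPy only; `save_teacher_embeddings` and the `.npy` layout are illustrative assumptions, not the repository's actual format), the teacher outputs are L2-normalized before being stored, since margin-based softmax losses operate on unit-length embeddings:

```python
import numpy as np

def l2_normalize(embeddings: np.ndarray, eps: float = 1e-10) -> np.ndarray:
    """L2-normalize each row so embeddings lie on the unit hypersphere,
    as margin-based softmax losses assume."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    return embeddings / np.maximum(norms, eps)

def save_teacher_embeddings(embeddings: np.ndarray, path: str) -> None:
    # Hypothetical helper: store normalized ResNet100 embeddings for reuse
    # as soft targets while training the student.
    np.save(path, l2_normalize(embeddings))
```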

Training

  • ResNet100 (teacher network):

Download the pretrained model from Google Drive, or train ResNet100 with ArcFace:

$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network r100 --loss arcface --dataset emore
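For reference, ArcFace is the margin-based softmax the teacher (and the student baseline below) is trained with: an additive angular margin m is applied to the target-class angle before scaling by s. A NumPy sketch (s=64, m=0.5 are the common defaults, assumed here, not read from the repository's config):

```python
import numpy as np

def arcface_logits(embeddings, weights, labels, s=64.0, m=0.5):
    """ArcFace logits: add an additive angular margin m to the target-class
    angle, then scale all cosines by s before softmax.
    embeddings: (N, d) L2-normalized; weights: (C, d) L2-normalized centers."""
    cos = np.clip(embeddings @ weights.T, -1.0, 1.0)   # (N, C) cosines
    theta = np.arccos(cos)                             # angles to class centers
    target = np.zeros_like(cos, dtype=bool)
    target[np.arange(len(labels)), labels] = True
    # Margin is applied only to each sample's ground-truth class.
    return s * np.where(target, np.cos(theta + m), cos)

def softmax_ce(logits, labels):
    logits = logits - logits.max(axis=1, keepdims=True)
    logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(labels)), labels].mean()
```

With m = 0 this reduces to a plain scaled softmax; the margin makes the target class strictly harder to satisfy, which tightens intra-class clusters.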

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| ------ | ------ | -------- | -------- |
| 99.76% | 98.38% | 98.25% | 98.35% |
  • ArcFace (student baseline):

Download the pretrained model from Google Drive, or train MobileFaceNet with ArcFace:

$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss arcface --dataset emore

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| ------ | ------ | -------- | -------- |
| 99.51% | 92.68% | 96.13% | 90.62% |
  • Angular distillation:

Download the pretrained model from Google Drive, or train MobileFaceNet with angular distillation:

$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss angular_distillation --dataset emore_soft
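Angular distillation penalizes the angle between the student's embedding and the teacher's saved embedding for the same image. A minimal NumPy sketch of such a term, assuming a simple 1 − cos formulation (the repository's exact loss may differ in weighting):

```python
import numpy as np

def angular_distillation_loss(student, teacher, eps=1e-10):
    """Sketch of an angular distillation term: 1 - cos(student, teacher),
    averaged over the batch. Only the embedding direction is matched;
    magnitudes are normalized away."""
    s = student / np.maximum(np.linalg.norm(student, axis=1, keepdims=True), eps)
    t = teacher / np.maximum(np.linalg.norm(teacher, axis=1, keepdims=True), eps)
    return float(np.mean(1.0 - np.sum(s * t, axis=1)))
```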

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| ------ | ------ | -------- | -------- |
| 99.55% | 91.90% | 96.01% | 90.73% |
  • Triplet distillation L2:

Download the pretrained model from Google Drive, or fine-tune MobileFaceNet with triplet distillation (L2):

$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss triplet_distillation_L2 --dataset emore_soft --pretrained ./models/y1-arcface-emore/model
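The idea behind triplet distillation is that the triplet margin is not a fixed hyperparameter: each triplet's margin is derived from how far apart the teacher places the negative versus the positive. A NumPy sketch under that reading (the clipping bounds m_min, m_max are illustrative, not the repository's values):

```python
import numpy as np

def triplet_distillation_l2(s_a, s_p, s_n, t_a, t_p, t_n,
                            m_min=0.2, m_max=0.5):
    """Sketch of triplet distillation with L2 distance: the teacher's
    distance gap d(a, n) - d(a, p) sets a per-triplet margin, clipped to
    [m_min, m_max]; the student is trained with a standard triplet hinge
    using that margin."""
    def d(x, y):
        return np.linalg.norm(x - y, axis=1)
    margin = np.clip(d(t_a, t_n) - d(t_a, t_p), m_min, m_max)
    return float(np.mean(np.maximum(d(s_a, s_p) - d(s_a, s_n) + margin, 0.0)))
```

The cos variant below follows the same pattern with cosine distance in place of L2.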

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| ------ | ------ | -------- | -------- |
| 99.56% | 93.30% | 96.23% | 89.10% |
  • Triplet distillation cos:

Download the pretrained model from Google Drive, or fine-tune MobileFaceNet with triplet distillation (cos):

$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss triplet_distillation_cos --dataset emore_soft --pretrained ./models/y1-arcface-emore/model

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| ------ | ------ | -------- | -------- |
| 99.55% | 93.30% | 95.60% | 86.52% |
  • Margin-based with temperature T:

Download the pretrained model from Google Drive, or train MobileFaceNet with margin-based distillation with temperature T:

$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss margin_base_with_T --dataset emore_soft
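The temperature component here is the classic soft-label distillation term: a KL divergence between teacher and student class distributions, both softened by a temperature T. How exactly it is combined with the margin-based softmax is specified in the paper; the sketch below shows only the generic temperature term (T = 4 is illustrative):

```python
import numpy as np

def softened_kl(student_logits, teacher_logits, T=4.0):
    """Generic temperature-softened distillation term: KL(teacher || student)
    over class distributions at temperature T, scaled by T^2 so gradient
    magnitudes stay comparable across temperatures."""
    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)
    p = softmax(teacher_logits / T)
    q = softmax(student_logits / T)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=1)
    return float(np.mean(kl) * T * T)
```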

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| ------ | ------ | -------- | -------- |
| 99.41% | 92.40% | 96.01% | 90.77% |
  • MarginDistillation:

Download the pretrained model from Google Drive, or train MobileFaceNet with MarginDistillation:

$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss margin_distillation --dataset emore_soft
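The core idea of MarginDistillation is to replace the single global ArcFace margin with a per-sample margin derived from the teacher: samples the teacher places confidently near their class center get a different margin than hard samples. The sketch below illustrates that idea with one possible monotone schedule; the mapping from teacher cosine to margin is a guess here, so consult the paper for the actual formula:

```python
import numpy as np

def per_sample_margins(teacher_emb, class_centers, labels,
                       m_base=0.5, eps=1e-10):
    """Illustrative MarginDistillation-style margins: measure the teacher's
    cosine to each sample's class center and map it monotonically into
    [0, m_base]. Easy samples (high teacher cosine) get larger margins.
    The linear mapping below is an assumption, not the paper's schedule."""
    t = teacher_emb / np.maximum(
        np.linalg.norm(teacher_emb, axis=1, keepdims=True), eps)
    c = class_centers / np.maximum(
        np.linalg.norm(class_centers, axis=1, keepdims=True), eps)
    cos_t = np.clip(np.sum(t * c[labels], axis=1), -1.0, 1.0)
    return m_base * (1.0 + cos_t) / 2.0
```

These per-sample margins would then replace the fixed m inside the ArcFace target-class term cos(θ + m) when training the student.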

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| ------ | ------ | -------- | -------- |
| 99.61% | 92.01% | 96.55% | 91.70% |



Languages

Python 89.2%, Jupyter Notebook 10.8%