# MarginDistillation: distillation for margin-based softmax

This repository contains implementations of the distillation methods compared in the MarginDistillation paper. Using the code from this repository, you can train a lightweight face recognition network suitable for embedded devices. The repository contains code for each of the methods listed in the Training section.

## Data preparation

  1. Download the dataset from https://github.com/deepinsight/insightface/wiki/Dataset-Zoo.
  2. Extract the images using `data_prepare/bin_get_images.ipynb`.
  3. Save embedding vectors from ResNet100 using `data_prepare/save_embedings.ipynb`.
  4. Prepare the list for conversion to a `.bin` file using `data_prepare/save_lst.ipynb`.
  5. Create a file named `property` in the folder with the images, containing `85742,112,112` (number of classes, image width, image height).
  6. Replace `face2rec2.py` in `insightface/src/data/` with `data_prepare/face2rec2.py` from this repository, and replace `face_preprocess.py` in `insightface/src/common/` with `data_prepare/face_preprocess.py` from this repository.
  7. Convert to a `.bin` file: `face2rec2.py "path to folder with 'train.lst' and 'property'"`
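Step 5 can be scripted; a minimal sketch that writes the `property` file with the class count and image size given above:

```python
# Write the 'property' file that face2rec2.py expects (step 5):
# 85742 classes, 112x112 images, as stated above.
num_classes, width, height = 85742, 112, 112
with open("property", "w") as f:
    f.write(f"{num_classes},{width},{height}")
```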

## Training

### ResNet100 (teacher network)

Download from Google Drive, or train ResNet100 with ArcFace:

```shell
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network r100 --loss arcface --dataset emore
```

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| --- | --- | --- | --- |
| 99.76% | 98.38% | 98.25% | 98.35% |
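For reference, ArcFace adds a fixed angular margin to the target-class logit before the softmax. A minimal numpy sketch (the scale `s=64` and margin `m=0.5` are the commonly used ArcFace defaults, assumed here, not taken from this repository's config):

```python
import numpy as np

def arcface_logits(cos_theta, labels, s=64.0, m=0.5):
    """Additive angular margin: replace cos(theta) with cos(theta + m)
    for each sample's target class, then scale by s before the softmax.

    cos_theta: (batch, num_classes) cosines between the L2-normalized
               embeddings and the L2-normalized class weights.
    """
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    logits = cos_theta.copy()
    rows = np.arange(len(labels))
    logits[rows, labels] = np.cos(theta[rows, labels] + m)  # penalize target
    return s * logits
```

The margin only touches the target-class entry, which forces the network to pull embeddings closer to their class center than plain softmax would.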
### ArcFace (student baseline)

Download from Google Drive, or train MobileFaceNet with ArcFace:

```shell
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss arcface --dataset emore
```

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| --- | --- | --- | --- |
| 99.51% | 92.68% | 96.13% | 90.62% |
### Angular distillation

Download from Google Drive, or train MobileFaceNet with angular distillation:

```shell
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss angular_distillation --dataset emore_soft
```

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| --- | --- | --- | --- |
| 99.55% | 91.90% | 96.01% | 90.73% |
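Angular distillation encourages the student's embedding to point in the same direction as the teacher's saved embedding for the same image. A minimal numpy sketch of such a loss (illustrative; the exact formulation used in this repository may differ):

```python
import numpy as np

def angular_distillation_loss(student, teacher):
    """Mean (1 - cosine similarity) between L2-normalized student and
    teacher embeddings: 0 when directions match, 2 when opposite."""
    s = student / np.linalg.norm(student, axis=1, keepdims=True)
    t = teacher / np.linalg.norm(teacher, axis=1, keepdims=True)
    return float(np.mean(1.0 - np.sum(s * t, axis=1)))
```

Because only the direction matters, the teacher embeddings saved in the data-preparation step can be used directly as targets without matching the student's embedding norm.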
### Triplet distillation L2

Download from Google Drive, or fine-tune MobileFaceNet with triplet distillation (L2):

```shell
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss triplet_distillation_L2 --dataset emore_soft --pretrained ./models/y1-arcface-emore/model
```

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| --- | --- | --- | --- |
| 99.56% | 93.30% | 96.23% | 89.10% |
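Triplet distillation replaces the fixed triplet margin with one derived from the teacher's distances, so triplets the teacher finds hard (small negative-positive gap) get a smaller margin. A hedged numpy sketch of the L2 variant (the clipping bounds `m_min`/`m_max` are illustrative placeholders, not values from this repository):

```python
import numpy as np

def triplet_distillation_l2(sa, sp, sn, ta, tp, tn, m_min=0.2, m_max=0.5):
    """Triplet loss on student embeddings (anchor sa, positive sp,
    negative sn) whose margin is the teacher's L2 distance gap
    d(ta, tn) - d(ta, tp), clipped to [m_min, m_max]."""
    d = lambda x, y: np.linalg.norm(x - y, axis=-1)
    margin = np.clip(d(ta, tn) - d(ta, tp), m_min, m_max)
    return np.maximum(d(sa, sp) - d(sa, sn) + margin, 0.0)
```

The cosine variant below follows the same scheme with cosine distance in place of L2.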
### Triplet distillation cos

Download from Google Drive, or fine-tune MobileFaceNet with triplet distillation (cosine):

```shell
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss triplet_distillation_cos --dataset emore_soft --pretrained ./models/y1-arcface-emore/model
```

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| --- | --- | --- | --- |
| 99.55% | 93.30% | 95.60% | 86.52% |
### Margin-based distillation with T

Download from Google Drive, or train MobileFaceNet with margin-based distillation with temperature T:

```shell
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss margin_base_with_T --dataset emore_soft
```

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| --- | --- | --- | --- |
| 99.41% | 92.40% | 96.01% | 90.77% |
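In the "with T" variant, the teacher's outputs are softened with a temperature T before the student is trained to match them (Hinton-style distillation). A minimal sketch of temperature softening (`T=4` is assumed for illustration, not taken from this repository's config):

```python
import numpy as np

def softened_targets(teacher_logits, T=4.0):
    """Softmax of logits divided by temperature T: larger T yields a
    flatter target distribution that exposes the teacher's relative
    rankings of the non-target classes."""
    z = teacher_logits / T
    e = np.exp(z - z.max(axis=1, keepdims=True))  # stable softmax
    return e / e.sum(axis=1, keepdims=True)
```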
### MarginDistillation

Download from Google Drive, or train MobileFaceNet with MarginDistillation:

```shell
$ CUDA_VISIBLE_DEVICES='0,1' python3 -u train.py --network y1 --loss margin_distillation --dataset emore_soft
```

Performance:

| LFW | CFP-FP | AgeDB-30 | MegaFace |
| --- | --- | --- | --- |
| 99.61% | 92.01% | 96.55% | 91.70% |
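MarginDistillation's central idea is that the angular margin is dictated by the teacher rather than fixed at a single global value. A sketch of applying teacher-supplied margins in an ArcFace-style logit (how the margins are derived from the teacher is described in the paper; here they are simply an input, and `s=64` is the usual ArcFace default):

```python
import numpy as np

def margin_distillation_logits(cos_theta, labels, margins, s=64.0):
    """ArcFace-style logits with a per-sample margin (one entry of
    `margins` per row, supplied by the teacher) instead of a single
    fixed m."""
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    out = cos_theta.copy()
    rows = np.arange(len(labels))
    out[rows, labels] = np.cos(theta[rows, labels] + margins)
    return s * out
```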
