Code and models for the winning team 'trojans' in the ICCV19 Lightweight Face Recognition Challenge.

'Trojans' Face Recognizer

This repository contains the models and inference code for the paper 'Towards Flops-constrained Face Recognition', which won 1st place in the ICCV19 Lightweight Face Recognition Challenge, large video track. This repo covers only the two 30-GFLOPs tracks that we participated in: deepglint-large and iQiyi-large. For more details such as network design, training strategy, and bag of tricks, please refer to our workshop paper.

Download code and models

  1. Use this link to download the EfficientPolyFace and QAN++ models.
  2. Clone this repo and overwrite the files with the same names.

Efficient PolyFace for image face recognition (deepglint-large)

  • Use this example to generate the representation (feature vector) for a single image:

```shell
python --test_img 00000f9f87210c8eb9f5fb488b1171d7.jpg --save_path ./
```
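Once representations have been extracted for two images, they can be compared directly, typically with cosine similarity. A minimal sketch, assuming the script saves each feature as a NumPy vector (the 512-d size and the feature values here are placeholders, not outputs of the actual script):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors, in [-1, 1]."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return float(np.dot(a, b))

# Hypothetical features as the inference script might save them.
feat1 = np.random.randn(512)  # feature of image A
feat2 = np.random.randn(512)  # feature of image B

score = cosine_similarity(feat1, feat2)  # higher means more likely the same identity
```

A verification decision is then just a threshold on `score`, with the threshold tuned on a validation set.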

Enhanced quality aware network (QAN++) for video face recognition (iQiyi-large)

  1. Use this example to generate the representation and the quality score for a single image. This produces two files, which store the representation and the quality, respectively:

```shell
python --test_img 000005_99.jpg
```

Note that to generate the quality for a set/sequence/video, you need to generate representations and quality scores for all frames in this step in a loop.

  2. Use this script to generate representations for a video.
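Conceptually, a quality aware network combines per-frame representations into one video-level representation, weighting each frame by its predicted quality. A sketch of that aggregation under assumed shapes (10 frames, 256-d features); the actual script's pooling may differ in detail:

```python
import numpy as np

def aggregate(frame_feats, qualities):
    """Quality-weighted average of per-frame features, L2-normalized."""
    w = np.asarray(qualities, dtype=np.float64)
    w = w / w.sum()                                   # normalize quality weights
    pooled = (np.asarray(frame_feats) * w[:, None]).sum(axis=0)
    return pooled / np.linalg.norm(pooled)            # unit-norm video feature

feats = np.random.randn(10, 256)   # per-frame features (shapes assumed)
quals = np.random.rand(10) + 0.1   # positive per-frame quality scores

video_feat = aggregate(feats, quals)  # single representation for the video
```

Low-quality frames (blur, occlusion, extreme pose) thus contribute less to the final representation, which is the motivation behind the quality branch.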


Please cite our paper if this project helps your research.

```
@inproceedings{liu2019flops,
  title={Towards Flops-constrained Face Recognition},
  author={Liu, Yu and Song, Guanglu and Zhang, Manyuan and Liu, Jihao and Zhou, Yucong and Yan, Junjie},
  booktitle={Proceedings of the ICCV Workshop},
  year={2019}
}
```