Knowledge distillation using Mobilenet on MS1M dataset #30
leondgarse started this conversation in Show and tell
- Create the distillation dataset using `data_distiller.py` and an MXNet `r100` pretrained model `subcenter-arcface-logs/r100-arcface-msfdrop75/model,0`. Set `use_fp16` to save embeddings as `float16`.
  ```sh
  CUDA_VISIBLE_DEVICES='0' ./data_distiller.py -M ./mxnet_models/subcenter-arcface-logs/r100-arcface-msfdrop75/model,0 -D /datasets/ms1m-retinaface-t1_112x112_folders/ --use_fp16 -b 64
  ```
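The distillation-dataset step above can be sketched as: run every aligned face crop through the teacher, collect the embeddings, and store them as `float16` next to the labels. This is a minimal illustration, not the actual `data_distiller.py`; the `teacher_embed` function here is a hypothetical stand-in for the MXNet `r100` forward pass, and the output path is arbitrary.

```python
import numpy as np

def teacher_embed(batch, rng):
    # Hypothetical stand-in for the MXNet r100 teacher: a real run would
    # forward each aligned 112x112 face crop and return its embedding.
    return rng.standard_normal((batch.shape[0], 512)).astype(np.float32)

def build_distill_dataset(images, labels, out_path, use_fp16=True, batch_size=64):
    rng = np.random.default_rng(0)
    embs = []
    # Mirror the `-b 64` batching of the command line above.
    for start in range(0, len(images), batch_size):
        embs.append(teacher_embed(images[start:start + batch_size], rng))
    embs = np.concatenate(embs, axis=0)
    if use_fp16:
        # `--use_fp16` halves the on-disk size of the embedding array.
        embs = embs.astype(np.float16)
    np.savez(out_path, embeddings=embs, labels=np.asarray(labels))
    return embs

# Tiny dummy dataset: 4 aligned 112x112 face crops with identity labels.
images = np.zeros((4, 112, 112, 3), dtype=np.uint8)
embs = build_distill_dataset(images, [0, 0, 1, 1], "/tmp/distill_sketch.npz")
```

The student is then trained to regress these cached teacher embeddings instead of re-running the heavy `r100` model every epoch.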
- Train `Mobilenet` with `add_pointwise_conv=True`. In some experiments `add_pointwise_conv` performs better, in others it does not.
- Result:
![Selection_362](https://user-images.githubusercontent.com/5744524/119779792-ab017400-befb-11eb-9ea0-bf15e0e6d0c8.png)
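For readers unfamiliar with the `add_pointwise_conv` option: a pointwise (1x1) convolution is just a per-position linear map across channels, typically used here to widen the backbone output before the embedding layer. The sketch below shows this equivalence in plain NumPy; the shapes are illustrative assumptions, not the repo's actual layer sizes.

```python
import numpy as np

def pointwise_conv(feature_map, weights):
    # A 1x1 convolution applies the same channel-mixing matrix at every
    # spatial position: (H, W, C_in) @ (C_in, C_out) -> (H, W, C_out).
    return feature_map @ weights

rng = np.random.default_rng(0)
fmap = rng.standard_normal((4, 4, 256))  # assumed backbone output block
w = rng.standard_normal((256, 512))      # assumed channel expansion
out = pointwise_conv(fmap, w)
```

No spatial mixing happens, so the layer adds capacity on the channel axis at very low cost, which is why it can help the small student match the teacher's 512-d embedding space in some runs and make no difference in others.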