MRT Guide

MRT Pipeline

1. Download (or customize) the dataset

The source files required for MRT validation on the currently mainstream datasets, and the corresponding dataset label files (under path/to/cvm-runtime/docs/mrt), are:

| Dataset Name | Source Files | Dataset Labels |
| --- | --- | --- |
| coco | val2017.zip | |
| voc | VOCtest_06-Nov-2007.tar | voc_labels.txt |
| imagenet | rec/val.rec, rec/val.idx | imagenet_labels.txt |
| cifar10 | cifar-10-binary.tar.gz | |
| quickdraw | quickdraw_X.npy, quickdraw_y.npy (is_train=True) or quickdraw_X_test.npy, quickdraw_y_test.npy (is_train=False) | |
| mnist | t10k-images-idx3-ubyte.gz, t10k-labels-idx1-ubyte.gz, train-images-idx3-ubyte.gz, train-labels-idx1-ubyte.gz | |
| trec | TREC.train.pk, TREC.test.pk | |

Alternatively, download another custom dataset if needed.
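Before running MRT, it can be useful to check that the expected source files are in place. A minimal sketch, assuming the files are laid out under a single dataset root as named in the table above (the `missing_files` helper and the directory layout are illustrative, not a cvm-runtime API):

```python
import os

# Expected source files per dataset, taken from the table above
# (datasets without ambiguity omitted for brevity).
REQUIRED_FILES = {
    "voc": ["VOCtest_06-Nov-2007.tar"],
    "imagenet": ["rec/val.rec", "rec/val.idx"],
    "mnist": [
        "t10k-images-idx3-ubyte.gz",
        "t10k-labels-idx1-ubyte.gz",
        "train-images-idx3-ubyte.gz",
        "train-labels-idx1-ubyte.gz",
    ],
}

def missing_files(dataset, root):
    """Return the required files for `dataset` that are absent under `root`."""
    return [f for f in REQUIRED_FILES.get(dataset, [])
            if not os.path.isfile(os.path.join(root, f))]
```

Calling `missing_files("imagenet", "/data/imagenet")` returns an empty list when the record files are in place, and the list of absent files otherwise.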

2. Pre-process datasets

Please refer to https://gluon-cv.mxnet.io/build/examples_datasets/index.html for details on dataset preparation.

For example, record files are needed for the imagenet dataset: run im2rec.py as described in https://gluon-cv.mxnet.io/build/examples_datasets/recordio.html#sphx-glr-build-examples-datasets-recordio-py.
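im2rec.py consumes a `.lst` index file whose lines are tab-separated: an integer index, the label, and the image path relative to the dataset root. A small sketch that writes such a file (the `write_lst` helper and the example paths are illustrative assumptions, not part of MXNet):

```python
def write_lst(entries, lst_path):
    """Write an MXNet im2rec-style .lst file.

    `entries` is a sequence of (relative_image_path, label) pairs; each
    output line is tab-separated: index, label (as float), relative path.
    """
    with open(lst_path, "w") as f:
        for idx, (path, label) in enumerate(entries):
            f.write("%d\t%f\t%s\n" % (idx, float(label), path))

# Hypothetical usage: two validation images with class labels 0 and 1.
# write_lst([("val/cat/1.jpg", 0), ("val/dog/2.jpg", 1)], "val.lst")
```

The resulting `val.lst` can then be passed to im2rec.py to produce the `val.rec`/`val.idx` pair listed for imagenet above.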

3. Predefined models

Download predefined Gluon model zoo models; see https://gluon-cv.mxnet.io/model_zoo/index.html for the full list.

The following models have been successfully tested in MRT:

resnet50_v1
resnet50_v2
resnet18_v1
resnet18v1_b_0.89
qd10_resnetv1_20
densenet161
alexnet
cifar_resnet20_v1
mobilenet1_0
mobilenetv2_1.0
shufflenet_v1
squeezenet1.0
vgg19
trec
mnist
yolo3_darknet53_voc
yolo3_mobilenet1.0_voc
ssd_512_resnet50_v1_voc
ssd_512_mobilenet1.0_voc

Alternatively, convert other (customized) models into MXNet models if needed.

4. Configure the model

Create <your_model_name>.ini under path/to/cvm-runtime/python/mrt/model_zoo; see https://github.com/CortexFoundation/cvm-runtime/blob/ryt_tune/python/mrt/model_zoo/config.example.ini for a sample configuration.
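As a rough sketch of the shape such a file takes (the section and option names below are illustrative assumptions only; consult config.example.ini for the actual keys and their spellings):

```ini
; Illustrative only -- the real option names live in config.example.ini.
[DEFAULT]
; hypothetical keys: which model to quantize and which dataset to use
; for calibration and evaluation
Model_name = resnet18_v1
Dataset = imagenet
```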

5. Run MRT

MRT consists of pre-processing, quantization, and post-processing stages: preparation, model splitting, calibration, quantization, model merging, evaluation, and compilation. The main work done in each stage is:

| Stage | Main Work |
| --- | --- |
| prepare | Model initialization: duplicate-name check, attach input shape, fuse multiple inputs, replace input names, validate operator attributes, fuse multiple outputs, fuse constants, fuse transpose, apply equivalent operator transformations, and collect unique parameters. |
| split model | Split the given model with respect to the given operator names (detection models only). |
| calibration | Calibrate the current model after setting MRT data. See https://github.com/CortexFoundation/cvm-runtime/blob/ryt_tune/docs/mrt/mrt.md for related APIs. |
| quantization | Quantize the current model after calibration. See https://github.com/CortexFoundation/cvm-runtime/blob/ryt_tune/docs/mrt/mrt.md for related APIs. |
| merge model | Merge the split models with respect to the given operator names (detection models only). |
| evaluation | (Optional stage) Compare the results of the predefined model and the quantized model on the given dataset. See https://github.com/CortexFoundation/cvm-runtime/blob/ryt_tune/docs/mrt/mrt.md for how model accuracy is tested. |
| compilation | (Optional stage) Compile the quantized graph into a CVM graph and dump the graph definition, parameters, pre-quantized model extension, and unit-batch test data. |
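The stage ordering above can be sketched as follows. This is illustrative only: the actual driver lives in python/mrt/main2.py, and the `plan_stages` function and its flags are assumptions introduced here, not the cvm-runtime API:

```python
def plan_stages(is_detection=False, evaluate=False, compile_graph=False):
    """Return the ordered list of MRT stages for a given configuration.

    Splitting/merging applies to detection models only; evaluation and
    compilation are optional stages.
    """
    stages = ["prepare"]
    if is_detection:
        stages.append("split model")
    stages += ["calibration", "quantization"]
    if is_detection:
        stages.append("merge model")
    if evaluate:
        stages.append("evaluation")
    if compile_graph:
        stages.append("compilation")
    return stages
```

For a plain classification model, this yields prepare, calibration, and quantization; a detection model additionally splits before calibration and merges after quantization.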

Please run the following command under path/to/cvm-runtime/ to execute MRT:

```shell
python python/mrt/main2.py python/mrt/model_zoo/<your_model_name>.ini
```

Reference

- https://pypi.org/project/conda/#files
- https://mxnet.apache.org/versions/master/install/index.html?platform=Linux&language=Python&processor=GPU