[Feature] Regression test for mmdeploy (open-mmlab#302)
* Add regression test script

* Add doc

* Add test yaml for mmdet

* evaluate_outputs Add return result

(cherry picked from commit c8c9cd75df7916aa4d80a7b8bfb8a78e04446cad)

* object_detection return metric after eval

(cherry picked from commit 1b8dcaa39ed14f016bf51d1aee65c2c416cd7c33)

* move `deploy_config_dir` to `global_info` in test yaml

* fix path error

* Improve test yaml structure

* Add test env for saving regression report

* Fix crash when generating the SDK test report

* Get SDK FPS

* Add mmcls regression test yaml

* Using CMD to test the backend result

* Get metric from log file

* Improve coding

* Improve coding

* Restructure test yaml

* Restructure the test yaml and code, using pipeline style

* Fixed not saving into the report when `backend_test` and `sdk_config` cannot be found

* set `metric_info` in test yaml

* improve test yaml

* Fixed getting a blank checkpoint file name

* Fix lint

* Fix yaml

* Add common in test yaml

* Restructure mmcls test yaml

* Restructure mmcls test yaml

* Improve mmcls test yaml

* mmcls test success

* Improve test yaml fields

* Add `--test-img` only when `test_img_path` is not None

* Add `precision_type` in report

* Not saving pkl result file any more

* Add 'x' instead of '-' when the script crashes

* Fix some fields in mmcls test yaml

* Add mmseg test yaml

* Add unknown backend final file name

* Improve backend file dict

* Add mmseg success

* Unify the checkpoint path to a relative path

* Unify the checkpoint path to a relative path

* Add mmpose, need to test

* Support backend file list to `--model` when test the backend

* Fix lint

* Add some comments

* FPS parsed from the log always takes lines 1:10

* Add dataset in report when test backend

* Get dataset type from model config file

* Replace pipeline.json topk

* SDK report add backend name

* Add txt report that saves each test

* update mmcls config

* Add `calib-dataset-cfg` to the cmd when it exists in the test yaml

* make model path shorter by cutting the work_dir_root

* Add `task_name` in test yaml

* Add `task_name` in report

* Improve test yaml

* Add mmocr test yaml

* Get mmocr fps metric success

* Add `dataset` field in test yaml

* Skip the report entry when the dataset name is not in the test yaml

* Add dbnet in mmocr test yaml and success get metric

* Add mmedit test yaml

* Improve some comments

* Add mmedit success

* Fix lint

* Fix isort lint

* Fix yapf lint

* Undo some changes in `evaluate_outputs`

* Undo some changes in `evaluate_outputs`

* Improve test requirement.txt

* Undo some changes in `evaluate_outputs`

* Improve doc

* Improve mmedit test yaml

* Using `--device`

* Fix lint

* Using `--performance` instead of `--test-type`

* Fix lint

* Fix page link

* Fix backend name

* Using `logger` instead of `print`

* Fix lint

* Add TorchScript in the doc

* Add type hint for all the funcs

* Fix docformatter lint

* Fix path in report have the root of work dir

* mmdet add other backends in test yaml

* mmdet add other backends in test yaml

* mmocr add other backends in test yaml

* mmedit add other backends in test yaml

* mmpose add other backends in test yaml

* Delete field `codebase_model_config_dir` in test yaml

* Using `Config` in metafile.yml instead of `Name` from all model config files

* Fix yapf lint

* update mmpose mmseg config

* Fix lint

* Improve mmcls test yaml

* Improve mmedit test yaml

* Improve mmedit test yaml

* Improve mmseg test yaml

* update mmdet yml

* Not using the pth checkpoint after a successful conversion in convert-only mode

* Using the metafile dataset when `model_cfg.dataset_type` cannot be obtained

* Fixed incorrect `model_name` in some codebases

* Improve mmcls test yaml image

* Improve mmedit test yaml image

* Improve mmocr test yaml image

* Improve mmseg test yaml image

* Fix test yaml bug

* Support overwriting `metric_tolerance`

* Add `metric_tolerance` in mmcls

* Fixed yaml bug

* mmcls: add all already-supported models to the test yaml

* Fix report not replacing paths with ${WORK_DIR}

* Add metric tolerance in mmcls test yaml

* Modified mmcls global metric tolerance

* remove `metric_tolerance` in each pipeline

* Improve mmcls test yaml

* mmcls add TODO

* Improve test yaml: `ppl` -> `pplnn`

* mmdet: add all already-supported models to the test yaml

* mmedit: add all already-supported models to the test yaml

* mmocr: add all already-supported models to the test yaml

* mmpose: add all already-supported models to the test yaml

* Replace `sdk_xxx_fp32` with `sdk_xxx`

* mmseg: add all already-supported models to the test yaml

* Improve the backend order of the test yaml

* Fix mmseg test yaml pplnn static config

* Add notes for Windows user in the doc

* Fix getting the metric when the metric name differs but the dataset name matches

* Fix mmedit dataset incorrect

* Fix test yaml of mmedit

* Fix lint

* Fix mmpose can't find metric

* Improve mmseg trt testing config

* Add dataset in mmdet test yaml

* Add logs when `continue` is hit in the code

* Fix mmpose get-metric log error

* Fix mmdet not getting the metric

* Add merging of reports into one

* Fix lint

* Add version in the report

* log2file for each backend

* Fix lint

* Improve report merge

* update mmseg yml to shape 1024x2048

* Fix dependencies for merging reports

* Improve mmcls yaml, add `ShuffleNetV1` and `ShuffleNetV2`

* Fix crash during conversion when a model name in the test yaml contains a space

* Add comments for test yaml metric tolerance

* Add mmdet seg detail config in test yaml

* Improve mmdet test yaml

* Fix mmdet mskrcnn metric

Co-authored-by: maningsheng <mnsheng@yeah.net>
PeterH0323 and RunningLeon committed Apr 28, 2022
1 parent 2c2d1e5 commit 2265217
Showing 11 changed files with 2,718 additions and 1 deletion.
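Several commits above revolve around the metric tolerance check: a backend's measured metric must stay within the ±n band declared per metric in the test yaml (`tolerance: 1`). A minimal sketch of such a pass/fail check, with a hypothetical helper name and illustrative accuracy numbers (not the actual mmdeploy implementation):

```python
def within_tolerance(backend_value: float, baseline_value: float,
                     tolerance: float) -> bool:
    """Return True if the backend metric lies within +/-tolerance
    (absolute metric units) of the PyTorch baseline."""
    return abs(backend_value - baseline_value) <= tolerance

# Illustrative: Top-1 of 69.55 vs. a 69.90 baseline passes with tolerance 1
assert within_tolerance(69.55, 69.90, 1.0)
# A 2-point drop would be flagged as a regression
assert not within_tolerance(67.90, 69.90, 1.0)
```

Whether the tolerance is interpreted as absolute points or as a relative percentage is a design choice; the sketch assumes absolute points.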
190 changes: 190 additions & 0 deletions configs/mmcls/mmcls_regression_test.yaml
@@ -0,0 +1,190 @@
globals:
  codebase_name: mmcls
  codebase_dir: ../mmclassification
  checkpoint_force_download: False
  checkpoint_dir: ../mmdeploy_checkpoints
  images:
    img_snake: &img_snake ../mmclassification/demo/demo.JPEG
    img_bird: &img_bird ../mmclassification/demo/bird.JPEG
    img_cat_dog: &img_cat_dog ../mmclassification/demo/cat-dog.png
    img_dog: &img_dog ../mmclassification/demo/dog.jpg
    img_color_cat: &img_color_cat ../mmclassification/tests/data/color.jpg
    img_gray_cat: &img_gray_cat ../mmclassification/tests/data/gray.jpg

  metric_info: &metric_info
    Top 1 Accuracy: # named after metafile.Results.Metrics
      eval_name: accuracy # test.py --metrics args
      metric_key: accuracy_top-1 # eval Dict key name
      tolerance: 1 # metric ±n%
      task_name: Image Classification # metafile.Results.Task
      dataset: ImageNet-1k # metafile.Results.Dataset
    Top 5 Accuracy:
      eval_name: accuracy
      metric_key: accuracy_top-5
      tolerance: 1 # metric ±n%
      task_name: Image Classification
      dataset: ImageNet-1k
  convert_image: &convert_image
    input_img: *img_snake
    test_img: *img_color_cat
  backend_test: &default_backend_test True
  sdk:
    sdk_dynamic: &sdk_dynamic configs/mmcls/classification_sdk_dynamic.py


onnxruntime:
  pipeline_ort_static_fp32: &pipeline_ort_static_fp32
    convert_image: *convert_image
    backend_test: *default_backend_test
    sdk_config: *sdk_dynamic
    deploy_config: configs/mmcls/classification_onnxruntime_static.py

  pipeline_ort_dynamic_fp32: &pipeline_ort_dynamic_fp32
    convert_image: *convert_image
    backend_test: False
    deploy_config: configs/mmcls/classification_onnxruntime_dynamic.py


tensorrt:
  pipeline_trt_static_fp32: &pipeline_trt_static_fp32
    convert_image: *convert_image
    backend_test: *default_backend_test
    deploy_config: configs/mmcls/classification_tensorrt_static-224x224.py

  pipeline_trt_static_fp16: &pipeline_trt_static_fp16
    convert_image: *convert_image
    backend_test: *default_backend_test
    deploy_config: configs/mmcls/classification_tensorrt-fp16_static-224x224.py

  pipeline_trt_static_int8: &pipeline_trt_static_int8
    convert_image: *convert_image
    backend_test: *default_backend_test
    deploy_config: configs/mmcls/classification_tensorrt-int8_static-224x224.py

  pipeline_trt_dynamic_fp32: &pipeline_trt_dynamic_fp32
    convert_image: *convert_image
    backend_test: *default_backend_test
    sdk_config: *sdk_dynamic
    deploy_config: configs/mmcls/classification_tensorrt_dynamic-224x224-224x224.py

  pipeline_trt_dynamic_fp16: &pipeline_trt_dynamic_fp16
    convert_image: *convert_image
    backend_test: *default_backend_test
    sdk_config: *sdk_dynamic
    deploy_config: configs/mmcls/classification_tensorrt-fp16_dynamic-224x224-224x224.py

  pipeline_trt_dynamic_int8: &pipeline_trt_dynamic_int8
    convert_image: *convert_image
    calib_dataset_cfg:
    backend_test: *default_backend_test
    sdk_config: *sdk_dynamic
    deploy_config: configs/mmcls/classification_tensorrt-int8_dynamic-224x224-224x224.py


openvino:
  pipeline_openvino_dynamic_fp32: &pipeline_openvino_dynamic_fp32
    convert_image: *convert_image
    backend_test: False
    deploy_config: configs/mmcls/classification_openvino_dynamic-224x224.py


ncnn:
  pipeline_ncnn_static_fp32: &pipeline_ncnn_static_fp32
    convert_image: *convert_image
    backend_test: False
    deploy_config: configs/mmcls/classification_ncnn_static.py

  pipeline_ncnn_dynamic_fp32: &pipeline_ncnn_dynamic_fp32
    convert_image: *convert_image
    backend_test: False
    deploy_config: configs/mmcls/classification_ncnn_dynamic.py


pplnn:
  pipeline_pplnn_dynamic_fp32: &pipeline_pplnn_dynamic_fp32
    convert_image: *convert_image
    backend_test: False
    deploy_config: configs/mmcls/classification_pplnn_dynamic-224x224.py


torchscript:
  pipeline_ts_fp32: &pipeline_ts_fp32
    convert_image: *convert_image
    backend_test: False
    deploy_config: configs/mmcls/classification_torchscript.py


models:
  - name: ResNet
    metafile: configs/resnet/metafile.yml
    model_configs:
      - configs/resnet/resnet18_8xb32_in1k.py # TODO Not benchmark config
    pipelines:
      - *pipeline_ts_fp32
      - *pipeline_ort_dynamic_fp32
      # - *pipeline_trt_dynamic_fp32
      - *pipeline_trt_dynamic_fp16
      # - *pipeline_trt_dynamic_int8
      - *pipeline_ncnn_static_fp32
      - *pipeline_pplnn_dynamic_fp32
      - *pipeline_openvino_dynamic_fp32

  - name: ResNeXt
    metafile: configs/resnext/metafile.yml
    model_configs:
      - configs/resnext/resnext50-32x4d_8xb32_in1k.py # TODO Not benchmark config
    pipelines:
      - *pipeline_ts_fp32
      - *pipeline_ort_dynamic_fp32
      - *pipeline_trt_dynamic_fp16
      - *pipeline_ncnn_static_fp32
      - *pipeline_pplnn_dynamic_fp32
      - *pipeline_openvino_dynamic_fp32

  - name: SE-ResNet
    metafile: configs/seresnet/metafile.yml
    model_configs:
      - configs/seresnet/seresnet50_8xb32_in1k.py # TODO Not benchmark config
    pipelines:
      - *pipeline_ts_fp32
      - *pipeline_ort_dynamic_fp32
      - *pipeline_trt_dynamic_fp16
      - *pipeline_ncnn_static_fp32
      - *pipeline_pplnn_dynamic_fp32
      - *pipeline_openvino_dynamic_fp32

  - name: MobileNetV2
    metafile: configs/mobilenet_v2/metafile.yml
    model_configs:
      - configs/mobilenet_v2/mobilenet-v2_8xb32_in1k.py
    pipelines:
      - *pipeline_ts_fp32
      - *pipeline_ort_dynamic_fp32
      - *pipeline_trt_dynamic_fp16
      - *pipeline_ncnn_static_fp32
      - *pipeline_pplnn_dynamic_fp32
      - *pipeline_openvino_dynamic_fp32

  - name: ShuffleNetV1
    metafile: configs/shufflenet_v1/metafile.yml
    model_configs:
      - configs/shufflenet_v1/shufflenet-v1-1x_16xb64_in1k.py
    pipelines:
      - *pipeline_ts_fp32
      # - *pipeline_ort_static_fp32
      - *pipeline_trt_static_fp16
      - *pipeline_ncnn_static_fp32
      # - *pipeline_pplnn_dynamic_fp32
      # - *pipeline_openvino_dynamic_fp32

  - name: ShuffleNetV2
    metafile: configs/shufflenet_v2/metafile.yml
    model_configs:
      - configs/shufflenet_v2/shufflenet-v2-1x_16xb64_in1k.py
    pipelines:
      - *pipeline_ts_fp32
      # - *pipeline_ort_static_fp32
      - *pipeline_trt_static_fp16
      - *pipeline_ncnn_static_fp32
      # - *pipeline_pplnn_dynamic_fp32
      # - *pipeline_openvino_dynamic_fp32
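Once a YAML parser resolves the `&anchor`/`*alias` references above, each model entry carries fully expanded pipeline dicts. A hypothetical sketch of how a regression driver could expand the models × pipelines matrix into individual jobs — the field names mirror the yaml, but the loop and the inline data are illustrative, not the actual mmdeploy test script:

```python
# Minimal stand-in for a slice of the parsed yaml after anchor/alias resolution.
models = [
    {
        "name": "ResNet",
        "metafile": "configs/resnet/metafile.yml",
        "model_configs": ["configs/resnet/resnet18_8xb32_in1k.py"],
        "pipelines": [
            {"deploy_config": "configs/mmcls/classification_torchscript.py",
             "backend_test": False},
            {"deploy_config": "configs/mmcls/"
                              "classification_tensorrt-fp16_dynamic-224x224-224x224.py",
             "backend_test": True},
        ],
    },
]

# One conversion (plus optional backend test) per model config x pipeline.
jobs = [
    (model["name"], model_cfg,
     pipeline["deploy_config"], pipeline["backend_test"])
    for model in models
    for model_cfg in model["model_configs"]
    for pipeline in model["pipelines"]
]

for name, model_cfg, deploy_cfg, run_backend_test in jobs:
    print(f"{name}: convert {model_cfg} with {deploy_cfg} "
          f"(backend test: {run_backend_test})")
```

For the ResNet entry above this yields two jobs: a TorchScript conversion with no backend test and a TensorRT fp16 conversion that is also benchmarked.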
@@ -1 +1 @@
-_base_ = ['../_base_/base_openvino_dynamic.py']
+_base_ = ['../_base_/base_openvino_dynamic-300x300.py']
