Add a tool named get_static_model_from_algorithm and deploy guide for l1-norm #477

Merged 3 commits on Mar 16, 2023
19 changes: 19 additions & 0 deletions configs/pruning/mmcls/dmcp/metafile.yml
@@ -0,0 +1,19 @@
Models:
- Name: dmcp_resnet50_subnet_32xb64
In Collection: DMCP
Config: configs/pruning/mmcls/dmcp/dmcp_resnet50_subnet_32xb64.py
Weights: https://download.openmmlab.com/mmrazor/v1/pruning/dmcp/resnet50/2G/DMCP_R50_2G.pth
Results:
- Task: Image Classification
Dataset: ImageNet-1k
Metrics:
Top 1 Accuracy: 76.11
- Name: dmcp_mbv2_subnet_32xb64
In Collection: DMCP
Config: configs/pruning/mmcls/dmcp/dmcp_mbv2_subnet_32xb64.py
Weights: https://download.openmmlab.com/mmrazor/v1/pruning/dmcp/mobilenetv2/100M/DMCP_MBV2_100M.pth
Results:
- Task: Image Classification
Dataset: ImageNet-1k
Metrics:
Top 1 Accuracy: 67.22
19 changes: 19 additions & 0 deletions configs/pruning/mmcls/group_fisher/mobilenet/metafile.yml
@@ -0,0 +1,19 @@
Models:
- Name: group_fisher_act_finetune_mobilenet-v2_8xb32_in1k
In Collection: GroupFisher
Config: configs/pruning/mmcls/group_fisher/mobilenet/group_fisher_act_finetune_mobilenet-v2_8xb32_in1k.py
Weights: https://download.openmmlab.com/mmrazor/v1/pruning/group_fisher/mobilenet/act/group_fisher_act_finetune_mobilenet-v2_8xb32_in1k.pth
Results:
- Task: Image Classification
Dataset: ImageNet-1k
Metrics:
Top 1 Accuracy: 70.82
- Name: group_fisher_flops_finetune_mobilenet-v2_8xb32_in1k
In Collection: GroupFisher
Config: configs/pruning/mmcls/group_fisher/mobilenet/group_fisher_flops_finetune_mobilenet-v2_8xb32_in1k.py
Weights: https://download.openmmlab.com/mmrazor/v1/pruning/group_fisher/mobilenet/flop/group_fisher_flops_finetune_mobilenet-v2_8xb32_in1k.pth
Results:
- Task: Image Classification
Dataset: ImageNet-1k
Metrics:
Top 1 Accuracy: 70.87
19 changes: 19 additions & 0 deletions configs/pruning/mmcls/group_fisher/resnet50/metafile.yml
@@ -0,0 +1,19 @@
Models:
- Name: group_fisher_act_finetune_resnet50_8xb32_in1k
In Collection: GroupFisher
Config: configs/pruning/mmcls/group_fisher/resnet50/group_fisher_act_finetune_resnet50_8xb32_in1k.py
Weights: https://download.openmmlab.com/mmrazor/v1/pruning/group_fisher/resnet50/act/group_fisher_act_finetune_resnet50_8xb32_in1k.pth
Results:
- Task: Image Classification
Dataset: ImageNet-1k
Metrics:
Top 1 Accuracy: 75.22
- Name: group_fisher_flops_finetune_resnet50_8xb32_in1k
In Collection: GroupFisher
Config: configs/pruning/mmcls/group_fisher/resnet50/group_fisher_flops_finetune_resnet50_8xb32_in1k.py
Weights: https://download.openmmlab.com/mmrazor/v1/pruning/group_fisher/resnet50/flops/group_fisher_flops_finetune_resnet50_8xb32_in1k.pth
Results:
- Task: Image Classification
Dataset: ImageNet-1k
Metrics:
Top 1 Accuracy: 75.61
41 changes: 41 additions & 0 deletions configs/pruning/mmcls/l1-norm/README.md
@@ -18,3 +18,44 @@ We use ItePruneAlgorithm and L1MutableChannelUnit to implement l1-norm pruning.
| ResNet34_Pruned_C | 73.89 | +0.27 | 3.40 | 7.6% | 2.02 | 7.3% | [config](./l1-norm_resnet34_8xb32_in1k_c.py) | [model](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmrazor/v1/pruning/l1-norm/l1-norm_resnet34_8xb32_in1k_c.pth) \| [log](https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmrazor/v1/pruning/l1-norm/l1-norm_resnet34_8xb32_in1k_c.json) |

**Note:** Our implementation differs from the original paper: we prune the layers related to the shortcut with a shared pruning decision, while the original paper prunes them separately in *Pruned C*. This may be why our *Pruned C* outperforms *Pruned A* and *Pruned B*, whereas *Pruned C* is the worst of the three in the original paper.
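
The shared decision can be sketched in a few lines (a hypothetical NumPy illustration, not MMRazor's actual implementation): layers whose outputs are summed by a residual connection must keep identical channel indices, so their per-channel L1 scores are pooled into one ranking.

```python
import numpy as np

def l1_channel_scores(weight: np.ndarray) -> np.ndarray:
    # Per-output-channel L1 norm; weight shape: (out_ch, in_ch, kh, kw).
    return np.abs(weight).sum(axis=(1, 2, 3))

rng = np.random.default_rng(0)
conv_last = rng.standard_normal((64, 64, 3, 3))  # a block's last conv
conv_down = rng.standard_normal((64, 64, 1, 1))  # downsample conv on the shortcut

# Both feed the same residual add, so pool their scores and make ONE decision.
shared = l1_channel_scores(conv_last) + l1_channel_scores(conv_down)
keep = np.sort(np.argsort(shared)[-int(round(0.8 * 64)):])  # keep top 80%
```

Pruning the two layers separately would instead yield two different index sets, which is how the original paper handles *Pruned C*.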

## Getting Started

### Prune

```bash
CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 PORT=29500 ./tools/dist_train.sh \
{prune_config_path}.py 8 --work-dir $WORK_DIR
```

After the pruning process, you can find a checkpoint file in the work dir. This checkpoint contains all parameters of the original model. In the next step, we will use it to export a pruned checkpoint.

### Get the pruned model

```bash
python ./tools/pruning/get_static_model_from_algorithm.py \
{prune_config_path}.py \
{checkpoint_file}.pth \
    -o {output_folder}
```

This step exports a pruned checkpoint and a json file that records the pruning structure. These two files are used to deploy the pruned model.
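
The exported json stores the pruning structure in the same unit-name-to-ratio form as the `fix_subnet` dicts in the deploy configs (a hypothetical example; the exact keys depend on your model):

```python
import json
import os
import tempfile

# Hypothetical pruning structure: channel-unit name -> kept-channel ratio.
structure = {
    'backbone.layer1.0.conv1_(0, 64)_64': 0.7,    # keep 70% of 64 channels
    'backbone.layer2.0.conv2_(0, 128)_128': 1.0,  # shortcut layer, unpruned
}

path = os.path.join(tempfile.mkdtemp(), 'fix_subnet.json')
with open(path, 'w') as f:
    json.dump(structure, f, indent=2)

# A deploy config may set fix_subnet to such a file path instead of an inline dict.
with open(path) as f:
    loaded = json.load(f)
```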

### Deploy

For a pruned model, you only need to use the pruning deploy config instead of the pretrain config to deploy the pruned version of your model. If you are not familiar with MMDeploy, please refer to [mmdeploy](https://github.com/open-mmlab/mmdeploy/tree/1.x).

```bash
python {mmdeploy}/tools/deploy.py \
{mmdeploy}/{mmdeploy_config}.py \
{pruning_deploy_config}.py \
{pruned_checkpoint}.pth \
{mmdeploy}/tests/data/tiger.jpeg
```

### Get the Flops and Parameters of a Pruned Model

```bash
python ./tools/pruning/get_flops.py \
{pruning_deploy_config}.py
```
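
The reductions reported in the table above can be sanity-checked with simple arithmetic, assuming the unpruned ResNet34 baseline of roughly 3.68 GFLOPs and 21.8 M parameters implied by the table's percentages:

```python
# Assumed ResNet34 baseline (GFLOPs, M params) and the Pruned C row above
# (the table lists params as 2.02 x 10, i.e. 20.2 M).
orig_flops, orig_params = 3.68, 21.8
pruned_flops, pruned_params = 3.40, 20.2

flops_cut = 1 - pruned_flops / orig_flops
params_cut = 1 - pruned_params / orig_params
print(f'{flops_cut:.1%} FLOPs pruned, {params_cut:.1%} params pruned')
# -> 7.6% FLOPs pruned, 7.3% params pruned
```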
@@ -0,0 +1,57 @@
#############################################################################
"""You have to fill these args.

_base_(str): The path to your pretrain config file.
fix_subnet (Union[dict,str]): The dict store the pruning structure or the
json file including it.
divisor (int): The divisor the make the channel number divisible.
"""

_base_ = ['mmcls::resnet/resnet34_8xb32_in1k.py']
un_prune = 1.0
stage_ratio_1 = 0.7
stage_ratio_2 = 0.7
stage_ratio_3 = 0.7
stage_ratio_4 = un_prune

# the config template of target_pruning_ratio can be obtained by
# python ./tools/get_channel_units.py {config_file} --choice
fix_subnet = {
# stage 1
'backbone.conv1_(0, 64)_64': un_prune, # short cut layers
'backbone.layer1.0.conv1_(0, 64)_64': stage_ratio_1,
'backbone.layer1.1.conv1_(0, 64)_64': stage_ratio_1,
'backbone.layer1.2.conv1_(0, 64)_64': un_prune,
# stage 2
'backbone.layer2.0.conv1_(0, 128)_128': un_prune,
'backbone.layer2.0.conv2_(0, 128)_128': un_prune, # short cut layers
'backbone.layer2.1.conv1_(0, 128)_128': stage_ratio_2,
'backbone.layer2.2.conv1_(0, 128)_128': stage_ratio_2,
'backbone.layer2.3.conv1_(0, 128)_128': un_prune,
# stage 3
'backbone.layer3.0.conv1_(0, 256)_256': un_prune,
'backbone.layer3.0.conv2_(0, 256)_256': un_prune, # short cut layers
'backbone.layer3.1.conv1_(0, 256)_256': stage_ratio_3,
'backbone.layer3.2.conv1_(0, 256)_256': stage_ratio_3,
'backbone.layer3.3.conv1_(0, 256)_256': stage_ratio_3,
'backbone.layer3.4.conv1_(0, 256)_256': stage_ratio_3,
'backbone.layer3.5.conv1_(0, 256)_256': un_prune,
# stage 4
'backbone.layer4.0.conv1_(0, 512)_512': stage_ratio_4,
'backbone.layer4.0.conv2_(0, 512)_512': un_prune, # short cut layers
'backbone.layer4.1.conv1_(0, 512)_512': stage_ratio_4,
'backbone.layer4.2.conv1_(0, 512)_512': stage_ratio_4
}
divisor = 8
##############################################################################

architecture = _base_.model

model = dict(
_delete_=True,
_scope_='mmrazor',
type='GroupFisherDeploySubModel',
architecture=architecture,
fix_subnet=fix_subnet,
divisor=divisor,
)
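
How `divisor` interacts with the ratios above can be sketched as follows (an assumed rounding rule for illustration, not necessarily MMRazor's exact behavior): each kept-channel count is rounded to a multiple of `divisor` so the resulting layers stay hardware friendly.

```python
def kept_channels(num_channels: int, ratio: float, divisor: int = 8) -> int:
    """Round ratio * num_channels to a multiple of `divisor` (keep at least one)."""
    kept = int(round(num_channels * ratio / divisor)) * divisor
    return max(divisor, kept)

print(kept_channels(64, 0.7))   # stage_ratio_1 on a 64-channel unit -> 48
print(kept_channels(512, 1.0))  # un_prune keeps all 512 channels
```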
@@ -0,0 +1,57 @@
#############################################################################
"""You have to fill these args.

_base_(str): The path to your pretrain config file.
fix_subnet (Union[dict,str]): The dict store the pruning structure or the
json file including it.
divisor (int): The divisor the make the channel number divisible.
"""

_base_ = ['mmcls::resnet/resnet34_8xb32_in1k.py']

un_prune = 1.0
stage_ratio_1 = 0.5
stage_ratio_2 = 0.4
stage_ratio_3 = 0.6
stage_ratio_4 = un_prune

fix_subnet = {
# stage 1
'backbone.conv1_(0, 64)_64': un_prune, # short cut layers
'backbone.layer1.0.conv1_(0, 64)_64': stage_ratio_1,
'backbone.layer1.1.conv1_(0, 64)_64': stage_ratio_1,
'backbone.layer1.2.conv1_(0, 64)_64': un_prune,
# stage 2
'backbone.layer2.0.conv1_(0, 128)_128': un_prune,
'backbone.layer2.0.conv2_(0, 128)_128': un_prune, # short cut layers
'backbone.layer2.1.conv1_(0, 128)_128': stage_ratio_2,
'backbone.layer2.2.conv1_(0, 128)_128': stage_ratio_2,
'backbone.layer2.3.conv1_(0, 128)_128': un_prune,
# stage 3
'backbone.layer3.0.conv1_(0, 256)_256': un_prune,
'backbone.layer3.0.conv2_(0, 256)_256': un_prune, # short cut layers
'backbone.layer3.1.conv1_(0, 256)_256': stage_ratio_3,
'backbone.layer3.2.conv1_(0, 256)_256': stage_ratio_3,
'backbone.layer3.3.conv1_(0, 256)_256': stage_ratio_3,
'backbone.layer3.4.conv1_(0, 256)_256': stage_ratio_3,
'backbone.layer3.5.conv1_(0, 256)_256': un_prune,
# stage 4
'backbone.layer4.0.conv1_(0, 512)_512': stage_ratio_4,
'backbone.layer4.0.conv2_(0, 512)_512': un_prune, # short cut layers
'backbone.layer4.1.conv1_(0, 512)_512': stage_ratio_4,
'backbone.layer4.2.conv1_(0, 512)_512': stage_ratio_4
}

divisor = 8
##############################################################################

architecture = _base_.model

model = dict(
_delete_=True,
_scope_='mmrazor',
type='GroupFisherDeploySubModel',
architecture=architecture,
fix_subnet=fix_subnet,
divisor=divisor,
)
@@ -0,0 +1,54 @@
#############################################################################
"""You have to fill these args.

_base_(str): The path to your pretrain config file.
fix_subnet (Union[dict,str]): The dict store the pruning structure or the
json file including it.
divisor (int): The divisor the make the channel number divisible.
"""

_base_ = ['mmcls::resnet/resnet34_8xb32_in1k.py']
un_prune = 1.0

# the config template of target_pruning_ratio can be obtained by
# python ./tools/get_channel_units.py {config_file} --choice
fix_subnet = {
# stage 1
'backbone.conv1_(0, 64)_64': un_prune, # short cut layers
'backbone.layer1.0.conv1_(0, 64)_64': un_prune,
'backbone.layer1.1.conv1_(0, 64)_64': un_prune,
'backbone.layer1.2.conv1_(0, 64)_64': un_prune,
# stage 2
'backbone.layer2.0.conv1_(0, 128)_128': un_prune,
'backbone.layer2.0.conv2_(0, 128)_128': un_prune, # short cut layers
'backbone.layer2.1.conv1_(0, 128)_128': un_prune,
'backbone.layer2.2.conv1_(0, 128)_128': un_prune,
'backbone.layer2.3.conv1_(0, 128)_128': un_prune,
# stage 3
'backbone.layer3.0.conv1_(0, 256)_256': un_prune,
'backbone.layer3.0.conv2_(0, 256)_256': 0.8, # short cut layers
'backbone.layer3.1.conv1_(0, 256)_256': un_prune,
'backbone.layer3.2.conv1_(0, 256)_256': un_prune,
'backbone.layer3.3.conv1_(0, 256)_256': un_prune,
'backbone.layer3.4.conv1_(0, 256)_256': un_prune,
'backbone.layer3.5.conv1_(0, 256)_256': un_prune,
# stage 4
'backbone.layer4.0.conv1_(0, 512)_512': un_prune,
'backbone.layer4.0.conv2_(0, 512)_512': un_prune, # short cut layers
'backbone.layer4.1.conv1_(0, 512)_512': un_prune,
'backbone.layer4.2.conv1_(0, 512)_512': un_prune
}

divisor = 8
##############################################################################

architecture = _base_.model

model = dict(
_delete_=True,
_scope_='mmrazor',
type='GroupFisherDeploySubModel',
architecture=architecture,
fix_subnet=fix_subnet,
divisor=divisor,
)
28 changes: 28 additions & 0 deletions configs/pruning/mmcls/l1-norm/metafile.yml
@@ -0,0 +1,28 @@
Models:
- Name: l1-norm_resnet34_8xb32_in1k_a
In Collection: L1-norm
Config: configs/pruning/mmcls/l1-norm/l1-norm_resnet34_8xb32_in1k_a.py
Weights: https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmrazor/v1/pruning/l1-norm/l1-norm_resnet34_8xb32_in1k_a.pth
Results:
- Task: Image Classification
Dataset: ImageNet-1k
Metrics:
Top 1 Accuracy: 73.61
- Name: l1-norm_resnet34_8xb32_in1k_b
In Collection: L1-norm
Config: configs/pruning/mmcls/l1-norm/l1-norm_resnet34_8xb32_in1k_b.py
Weights: https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmrazor/v1/pruning/l1-norm/l1-norm_resnet34_8xb32_in1k_b.pth
Results:
- Task: Image Classification
Dataset: ImageNet-1k
Metrics:
Top 1 Accuracy: 73.20
- Name: l1-norm_resnet34_8xb32_in1k_c
In Collection: L1-norm
Config: configs/pruning/mmcls/l1-norm/l1-norm_resnet34_8xb32_in1k_c.py
Weights: https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmrazor/v1/pruning/l1-norm/l1-norm_resnet34_8xb32_in1k_c.pth
Results:
- Task: Image Classification
Dataset: ImageNet-1k
Metrics:
Top 1 Accuracy: 73.89
25 changes: 25 additions & 0 deletions configs/pruning/mmcls/l1-norm/script.sh
@@ -0,0 +1,25 @@

# export pruned checkpoint example

python ./tools/pruning/get_static_model_from_algorithm.py configs/pruning/mmcls/l1-norm/l1-norm_resnet34_8xb32_in1k_a.py https://openmmlab-share.oss-cn-hangzhou.aliyuncs.com/mmrazor/v1/pruning/l1-norm/l1-norm_resnet34_8xb32_in1k_a.pth -o ./work_dirs/norm_resnet34_8xb32_in1k_a

# deploy example

razor_config=configs/pruning/mmcls/l1-norm/l1-norm_resnet34_8xb32_in1k_a_deploy.py
deploy_config=mmdeploy/configs/mmcls/classification_onnxruntime_dynamic.py
static_model_checkpoint_path=path/to/pruned/checkpoint

python mmdeploy/tools/deploy.py $deploy_config \
$razor_config \
$static_model_checkpoint_path \
mmdeploy/tests/data/tiger.jpeg \
--work-dir ./work_dirs/mmdeploy

python mmdeploy/tools/profiler.py $deploy_config \
$razor_config \
mmdeploy/demo/resources \
--model ./work_dirs/mmdeploy/end2end.onnx \
--shape 224x224 \
--device cpu \
--num-iter 1000 \
--warmup 100
19 changes: 19 additions & 0 deletions configs/pruning/mmdet/group_fisher/retinanet/metafile.yml
@@ -0,0 +1,19 @@
Models:
- Name: group_fisher_act_finetune_retinanet_r50_fpn_1x_coco
In Collection: GroupFisher
Config: configs/pruning/mmdet/group_fisher/retinanet/group_fisher_act_finetune_retinanet_r50_fpn_1x_coco.py
Weights: https://download.openmmlab.com/mmrazor/v1/pruning/group_fisher/retinanet/act/group_fisher_act_finetune_retinanet_r50_fpn_1x_coco.pth
Results:
- Task: Object Detection
Dataset: COCO
Metrics:
box AP: 36.5
- Name: group_fisher_flops_finetune_retinanet_r50_fpn_1x_coco
In Collection: GroupFisher
Config: configs/pruning/mmdet/group_fisher/retinanet/group_fisher_flops_finetune_retinanet_r50_fpn_1x_coco.py
Weights: https://download.openmmlab.com/mmrazor/v1/pruning/group_fisher/retinanet/flops/group_fisher_flops_finetune_retinanet_r50_fpn_1x_coco.pth
Results:
- Task: Object Detection
Dataset: COCO
Metrics:
box AP: 36.6
5 changes: 5 additions & 0 deletions model-index.yml
@@ -21,3 +21,8 @@ Import:
- configs/distill/mmdet/pkd/metafile.yml
- configs/distill/mmdet3d/pkd/metafile.yml
- configs/distill/mmcls/deit/metafile.yml
- configs/pruning/mmcls/group_fisher/mobilenet/metafile.yml
- configs/pruning/mmcls/group_fisher/resnet50/metafile.yml
- configs/pruning/mmdet/group_fisher/retinanet/metafile.yml
- configs/pruning/mmcls/l1-norm/metafile.yml
- configs/pruning/mmcls/dmcp/metafile.yml