RegNet

Designing Network Design Spaces

Abstract

In this work, we present a new network design paradigm. Our goal is to help advance the understanding of network design and discover design principles that generalize across settings. Instead of focusing on designing individual network instances, we design network design spaces that parametrize populations of networks. The overall process is analogous to classic manual design of networks, but elevated to the design space level. Using our methodology we explore the structure aspect of network design and arrive at a low-dimensional design space consisting of simple, regular networks that we call RegNet. The core insight of the RegNet parametrization is surprisingly simple: widths and depths of good networks can be explained by a quantized linear function. We analyze the RegNet design space and arrive at interesting findings that do not match the current practice of network design. The RegNet design space provides simple and fast networks that work well across a wide range of flop regimes. Under comparable training settings and flops, the RegNet models outperform the popular EfficientNet models while being up to 5x faster on GPUs.
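For context, the quantized linear rule mentioned above can be sketched as follows. Parameter names follow the paper (w0 = initial width, wa = slope, wm = width multiplier); the example values are taken from the pycls model zoo entry for RegNetX-3.2GF and should be treated as illustrative, not authoritative.

```python
import numpy as np

def regnet_widths(w0, wa, wm, depth, q=8):
    """Sketch of the RegNet quantized linear width rule (Radosavovic et al., 2020)."""
    # Linear rule: u_j = w0 + wa * j for block index j = 0, ..., depth - 1.
    u = w0 + wa * np.arange(depth)
    # Quantize: express each u_j as w0 * wm**s_j and round the exponent.
    s = np.round(np.log(u / w0) / np.log(wm))
    w = w0 * np.power(wm, s)
    # Round widths to the nearest multiple of q (8 in the paper).
    return (np.round(w / q) * q).astype(int)

# Illustrative parameters, roughly RegNetX-3.2GF (w0=88, wa=26.31, wm=2.25, depth=25).
print(regnet_widths(88, 26.31, 2.25, 25))
```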

Introduction

We implement RegNetX and RegNetY models in detection systems and provide their first results on Mask R-CNN, Faster R-CNN and RetinaNet.

The pre-trained models are converted from the model zoo of pycls.

Usage

To use a RegNet model, there are two steps:

  1. Convert the model to a ResNet-style checkpoint supported by MMDetection
  2. Modify the backbone and neck in the config accordingly

Convert model

We have already prepared models with FLOPs ranging from 400M to 12G in our model zoo.

For more general usage, we also provide the script regnet2mmdet.py in the tools directory to convert the keys of models pre-trained by pycls into ResNet-style checkpoints used in MMDetection.

python -u tools/model_converters/regnet2mmdet.py ${PRETRAIN_PATH} ${STORE_PATH}

This script converts the model at PRETRAIN_PATH and stores the converted model at STORE_PATH.

Modify config

Users can modify the backbone depth and the corresponding arch keys in the config according to the configs in the pycls model zoo. The in_channels parameter of the FPN can be found in Figures 15 & 16 of the paper (wi in the legend). This directory already provides several configs with their performance, using RegNetX models from the 800MF to 12GF level; a minimal config sketch follows the note below. For other pre-trained models or self-implemented RegNet models, users are responsible for checking these parameters themselves.

Note: Although Figures 15 & 16 also provide w0, wa, wm, group_w, and bot_mul for arch, these values are quantized and thus inaccurate; using them sometimes produces a backbone whose keys do not match those in the pre-trained model.
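As an illustration, here is a minimal config sketch for a RegNetX-3.2GF backbone with FPN. The base config path, checkpoint name, and in_channels values are assumptions drawn from the configs in this directory and the pycls model zoo; verify them before use.

```python
# Minimal sketch, assuming MMDetection's RegNet backbone and the
# RegNetX-3.2GF channel widths; verify all names and values against the
# actual configs in this directory.
_base_ = '../mask_rcnn/mask_rcnn_r50_fpn_1x_coco.py'  # assumed base config
model = dict(
    backbone=dict(
        _delete_=True,  # drop the inherited ResNet settings
        type='RegNet',
        arch='regnetx_3.2gf',
        out_indices=(0, 1, 2, 3),
        frozen_stages=1,
        norm_cfg=dict(type='BN', requires_grad=True),
        norm_eval=True,
        style='pytorch',
        init_cfg=dict(
            type='Pretrained', checkpoint='open-mmlab://regnetx_3.2gf')),
    neck=dict(
        type='FPN',
        in_channels=[96, 192, 432, 1008],  # w_i from Fig. 15 & 16 (assumed)
        out_channels=256,
        num_outs=5))
```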

Results and Models

Mask R-CNN

| Backbone | Style | Lr schd | Mem (GB) | Inf time (fps) | box AP | mask AP | Config | Download |
| :------- | :---- | :------ | :------- | :------------- | :----- | :------ | :----- | :------- |
| R-50-FPN | pytorch | 1x | 4.4 | 12.0 | 38.2 | 34.7 | config | model \| log |
| RegNetX-3.2GF-FPN | pytorch | 1x | 5.0 | | 40.3 | 36.6 | config | model \| log |
| RegNetX-4.0GF-FPN | pytorch | 1x | 5.5 | | 41.5 | 37.4 | config | model \| log |
| R-101-FPN | pytorch | 1x | 6.4 | 10.3 | 40.0 | 36.1 | config | model \| log |
| RegNetX-6.4GF-FPN | pytorch | 1x | 6.1 | | 41.0 | 37.1 | config | model \| log |
| X-101-32x4d-FPN | pytorch | 1x | 7.6 | 9.4 | 41.9 | 37.5 | config | model \| log |
| RegNetX-8.0GF-FPN | pytorch | 1x | 6.4 | | 41.7 | 37.5 | config | model \| log |
| RegNetX-12GF-FPN | pytorch | 1x | 7.4 | | 42.2 | 38.0 | config | model \| log |
| RegNetX-3.2GF-FPN-DCN-C3-C5 | pytorch | 1x | 5.0 | | 40.3 | 36.6 | config | model \| log |

Faster R-CNN

| Backbone | Style | Lr schd | Mem (GB) | Inf time (fps) | box AP | Config | Download |
| :------- | :---- | :------ | :------- | :------------- | :----- | :----- | :------- |
| R-50-FPN | pytorch | 1x | 4.0 | 18.2 | 37.4 | config | model \| log |
| RegNetX-3.2GF-FPN | pytorch | 1x | 4.5 | | 39.9 | config | model \| log |
| RegNetX-3.2GF-FPN | pytorch | 2x | 4.5 | | 41.1 | config | model \| log |

RetinaNet

| Backbone | Style | Lr schd | Mem (GB) | Inf time (fps) | box AP | Config | Download |
| :------- | :---- | :------ | :------- | :------------- | :----- | :----- | :------- |
| R-50-FPN | pytorch | 1x | 3.8 | 16.6 | 36.5 | config | model \| log |
| RegNetX-800MF-FPN | pytorch | 1x | 2.5 | | 35.6 | config | model \| log |
| RegNetX-1.6GF-FPN | pytorch | 1x | 3.3 | | 37.3 | config | model \| log |
| RegNetX-3.2GF-FPN | pytorch | 1x | 4.2 | | 39.1 | config | model \| log |

Pre-trained models

We also train some models with longer schedules and multi-scale training. Users can fine-tune them for downstream tasks; a fine-tuning sketch is given after the table below.

| Method | Backbone | Style | Lr schd | Mem (GB) | Inf time (fps) | box AP | mask AP | Config | Download |
| :----- | :------- | :---- | :------ | :------- | :------------- | :----- | :------ | :----- | :------- |
| Faster R-CNN | RegNetX-400MF-FPN | pytorch | 3x | 2.3 | | 37.1 | - | config | model \| log |
| Faster R-CNN | RegNetX-800MF-FPN | pytorch | 3x | 2.8 | | 38.8 | - | config | model \| log |
| Faster R-CNN | RegNetX-1.6GF-FPN | pytorch | 3x | 3.4 | | 40.5 | - | config | model \| log |
| Faster R-CNN | RegNetX-3.2GF-FPN | pytorch | 3x | 4.4 | | 42.3 | - | config | model \| log |
| Faster R-CNN | RegNetX-4GF-FPN | pytorch | 3x | 4.9 | | 42.8 | - | config | model \| log |
| Mask R-CNN | RegNetX-400MF-FPN | pytorch | 3x | 2.5 | | 37.6 | 34.4 | config | model \| log |
| Mask R-CNN | RegNetX-800MF-FPN | pytorch | 3x | 2.9 | | 39.5 | 36.1 | config | model \| log |
| Mask R-CNN | RegNetX-1.6GF-FPN | pytorch | 3x | 3.6 | | 40.9 | 37.5 | config | model \| log |
| Mask R-CNN | RegNetX-3.2GF-FPN | pytorch | 3x | 5.0 | | 43.1 | 38.7 | config | model \| log |
| Mask R-CNN | RegNetX-4GF-FPN | pytorch | 3x | 5.1 | | 43.4 | 39.2 | config | model \| log |
| Cascade Mask R-CNN | RegNetX-400MF-FPN | pytorch | 3x | 4.3 | | 41.6 | 36.4 | config | model \| log |
| Cascade Mask R-CNN | RegNetX-800MF-FPN | pytorch | 3x | 4.8 | | 42.8 | 37.6 | config | model \| log |
| Cascade Mask R-CNN | RegNetX-1.6GF-FPN | pytorch | 3x | 5.4 | | 44.5 | 39.0 | config | model \| log |
| Cascade Mask R-CNN | RegNetX-3.2GF-FPN | pytorch | 3x | 6.4 | | 45.8 | 40.0 | config | model \| log |
| Cascade Mask R-CNN | RegNetX-4GF-FPN | pytorch | 3x | 6.9 | | 45.8 | 40.0 | config | model \| log |
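As a usage note, here is a minimal fine-tuning sketch. The config name and checkpoint path below are placeholders; substitute the actual files linked in the Config and Download columns above.

```python
# Minimal sketch, assuming a new config that inherits from one of the
# 3x pre-trained Mask R-CNN configs above. File names are placeholders.
_base_ = './mask_rcnn_regnetx_3.2gf_fpn_mstrain_3x_coco.py'  # assumed name
# Initialize from the downloaded 3x checkpoint instead of ImageNet weights.
load_from = 'checkpoints/mask_rcnn_regnetx_3.2gf_fpn_mstrain_3x_coco.pth'
# A lower learning rate is typical for fine-tuning (assumption; tune per task).
optimizer = dict(lr=0.002)
```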

Notice

  1. The models are trained with a different weight decay, i.e., weight_decay=5e-5, following the ImageNet training setting. This brings an improvement of at least 0.7 AP absolute for RegNet backbones but does not improve the model using ResNet-50.
  2. RetinaNets using RegNets are trained with a learning rate of 0.02 and gradient clipping. We find that a learning rate of 0.02 improves the results by at least 0.7 AP absolute and that gradient clipping is necessary to stabilize training. However, this does not improve the performance of the ResNet-50-FPN RetinaNet. A config sketch of these settings follows this list.
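A minimal sketch of these two tweaks in MMDetection config style is shown below; the exact field names depend on the MMDetection version, and the gradient-clipping values are assumptions to verify against the RetinaNet RegNet configs in this directory.

```python
# Minimal sketch, assuming MMDetection 2.x-style optimizer settings.
# Weight decay 5e-5 and lr 0.02 come from the notes above; the gradient
# clipping values (max_norm, norm_type) are assumptions.
optimizer = dict(type='SGD', lr=0.02, momentum=0.9, weight_decay=5e-5)
optimizer_config = dict(_delete_=True, grad_clip=dict(max_norm=35, norm_type=2))
```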

Citation

@article{radosavovic2020designing,
    title={Designing Network Design Spaces},
    author={Ilija Radosavovic and Raj Prateek Kosaraju and Ross Girshick and Kaiming He and Piotr Dollár},
    year={2020},
    eprint={2003.13678},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}