
Auto-Panoptic: Cooperative Multi-Component Architecture Search for Panoptic Segmentation

This repository provides the implementation of the NeurIPS 2020 paper:

Auto-Panoptic: Cooperative Multi-Component Architecture Search for Panoptic Segmentation. Supplementary materials can be downloaded here.

This repository is based on maskrcnn-benchmark and DetNAS.

Installation

Check INSTALL.md for installation instructions.

Data Preparation

Download and extract the COCO 2017 train and val images with annotations from http://cocodataset.org. Following Panoptic-FPN, we predict 53 stuff classes plus a single ‘other’ class covering all 80 thing classes for semantic segmentation, so we squeeze all thing class labels to id 0 and store the result in the folder PanopticAnnotation (a sketch of this remapping follows the directory layout below). For architecture search, we randomly split the COCO train set into a nas_train set and a nas_val set (5k images). We provide the download link here.

We expect the directory structure to be the following:

maskrcnn-benchmark/
    - datasets/
        - coco/
            - train2017/
            - val2017/
            - nas/
                - instances_nas_train2017.json
                - instances_nas_val2017.json
            - annotations/  
                - ...
                - PanopticAnnotation/ 
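
The thing-to-‘other’ remapping mentioned above can be sketched as follows. This is a minimal illustration that assumes the standard COCO panoptic PNG/JSON annotation format; the paths and the contiguous stuff-id mapping are assumptions for illustration, not the exact conversion script used in this repository.

# Sketch: build semantic-segmentation labels in which every thing class is
# squeezed to id 0 ('other') and the 53 stuff classes keep contiguous ids 1..53.
# Paths and the id mapping below are illustrative assumptions.
import json
import os

import numpy as np
from PIL import Image

PANOPTIC_JSON = "annotations/panoptic_train2017.json"  # assumed input
PANOPTIC_PNG_DIR = "annotations/panoptic_train2017"    # assumed input
OUT_DIR = "annotations/PanopticAnnotation"             # folder expected by this repo
os.makedirs(OUT_DIR, exist_ok=True)

with open(PANOPTIC_JSON) as f:
    coco = json.load(f)

# Stuff categories get contiguous labels 1..53; every thing category maps to 0.
stuff_ids = sorted(c["id"] for c in coco["categories"] if not c["isthing"])
stuff_to_contiguous = {cid: i + 1 for i, cid in enumerate(stuff_ids)}

for ann in coco["annotations"]:
    rgb = np.array(Image.open(os.path.join(PANOPTIC_PNG_DIR, ann["file_name"])), dtype=np.uint32)
    seg_id = rgb[..., 0] + 256 * rgb[..., 1] + 256 * 256 * rgb[..., 2]  # panopticapi id encoding
    semantic = np.zeros(seg_id.shape, dtype=np.uint8)  # 0 = 'other' (all thing classes)
    for seg in ann["segments_info"]:
        semantic[seg_id == seg["id"]] = stuff_to_contiguous.get(seg["category_id"], 0)
    Image.fromarray(semantic).save(os.path.join(OUT_DIR, ann["file_name"]))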

Supernet Pretrain on ImageNet

We pretrain our model on ImageNet using the same search space and training schedule as DetNAS. Please follow the instructions here, or download our pretrained model autopanoptic_imagenet_pretrain.pkl directly.

Supernet Finetuning & Architecture Search on COCO

export CURRENT_DIR={your_root_dir}
cd $CURRENT_DIR
sh scripts/architecture_search.sh

We provide our search log here and the searched architecture in maskrcnn-benchmark/test_models/. Note that you should set MODEL.WEIGHT to the correct pretrained model path before running the architecture search (see the sketch below).
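
One way to point MODEL.WEIGHT at the pretrained supernet is through the yacs config used by maskrcnn-benchmark. The snippet below is a minimal sketch; the config file name and checkpoint path are placeholders rather than the actual files shipped with this repository.

# Minimal sketch: override MODEL.WEIGHT via the maskrcnn-benchmark (yacs) config.
# The config file and checkpoint paths are placeholders.
from maskrcnn_benchmark.config import cfg

cfg.merge_from_file("configs/your_search_config.yaml")  # placeholder config
cfg.merge_from_list(["MODEL.WEIGHT", "/path/to/autopanoptic_imagenet_pretrain.pkl"])
print(cfg.MODEL.WEIGHT)  # confirm which checkpoint the trainer will load

Equivalently, the same KEY VALUE pair can usually be appended to the training command after --config-file, since maskrcnn-benchmark's entry points forward trailing arguments to cfg.merge_from_list.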

Pretrain Searched Model on ImageNet

export CURRENT_DIR={your_root_dir}
cd $CURRENT_DIR
sh scripts/pretrain_searched_model.sh

We provide our ImageNet pretrained model here.

Retrain Searched Model on COCO

export CURRENT_DIR={your_root_dir}
cd $CURRENT_DIR
sh scripts/train_searched_model.sh

We provide our training log and panoptic model. Note that you should set MODEL.WEIGHT to the correct ImageNet pretrained model path before retraining, as in the architecture-search step.

Evaluation on COCO

export CURRENT_DIR={your_root_dir}
cd $CURRENT_DIR
sh scripts/eval.sh

We provide our evaluation log.

Results

Method                 PQ    PQ_thing  PQ_stuff
UPSNet                 42.5  48.5      33.4
BGRNet                 43.2  49.8      33.4
SOGNet                 43.7  50.6      33.2
Auto-Panoptic (Ours)   44.8  51.4      35.0
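
These numbers are panoptic quality (PQ) scores on COCO, together with the thing/stuff breakdown. As a reminder of what PQ measures, the helper below is an illustrative sketch of the standard definition (sum of IoUs over matched segments divided by |TP| + 0.5|FP| + 0.5|FN|), not code from this repository.

# Illustrative sketch of panoptic quality (PQ):
# PQ = sum(IoU over true-positive matches) / (|TP| + 0.5*|FP| + 0.5*|FN|),
# where a prediction matches a ground-truth segment when their IoU exceeds 0.5.
def panoptic_quality(tp_ious, num_fp, num_fn):
    tp = len(tp_ious)
    denom = tp + 0.5 * num_fp + 0.5 * num_fn
    return sum(tp_ious) / denom if denom > 0 else 0.0

# Example: three matches with IoUs 0.9, 0.8, 0.7, one false positive, one false negative.
print(panoptic_quality([0.9, 0.8, 0.7], num_fp=1, num_fn=1))  # -> 0.6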

Citations

@article{wu2020auto, 
    title={Auto-Panoptic: Cooperative Multi-Component Architecture Search for Panoptic Segmentation}, 
    author={Wu, Yangxin and Zhang, Gengwei and Xu, Hang and Liang, Xiaodan and Lin, Liang},
    journal={Advances in Neural Information Processing Systems},
    volume={33},
    year={2020}
}
