ICNet: Intra-saliency Correlation Network for Co-Saliency Detection

This repository is the official PyTorch implementation of our NeurIPS(2020) paper.

You can switch to the "CN" branch to view the Chinese version of README.md and obtain the code with Chinese comments.

Training Datasets

Our training set is a subset of the COCO dataset, containing 9213 images.

Test Datasets

Used in our paper:

  • MSRC (7 groups, 233 images) ''Object Categorization by Learned Universal Visual Dictionary, ICCV(2005)''

  • iCoseg (38 groups, 643 images) ''iCoseg: Interactive Co-segmentation with Intelligent Scribble Guidance, CVPR(2010)''

  • Cosal2015 (50 groups, 2015 images) ''Detection of Co-salient Objects by Looking Deep and Wide, IJCV(2016)''

You can download them from:

Recently released:

  • CoSOD3k (160 groups, 3316 images) ''Taking a Deeper Look at Co-salient Object Detection, CVPR(2020)''

  • CoCA (80 groups, 1295 images) ''Gradient-Induced Co-Saliency Detection, ECCV(2020)''

Pre-trained Model

We provide a pre-trained ICNet based on SISMs produced by the pre-trained EGNet (VGG16-based).

Prediction Results

We release the co-saliency maps (predictions) generated by our ICNet on 5 benchmark datasets:

MSRC, iCoseg, Cosal2015, CoCA, and CoSOD3k.

  • cosal-maps.zip (results of size 224*224, 20MB), GoogleDrive | BaiduYun (fetch code: du5e).

  • cosal-maps-os.zip (results resized to original sizes, 62MB), GoogleDrive | BaiduYun (fetch code: xwcv).

Training and Test

Prepare SISMs

Our ICNet can be trained and tested on SISMs produced by any off-the-shelf SOD method, but we suggest using the same SOD method to generate SISMs in both the training and test phases to keep them consistent.
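As a rough illustration of this step, the sketch below runs an arbitrary pre-trained SOD model over a folder of images and saves one grayscale SISM per image. The model class "SODModel", its import path, and the folder layout are hypothetical placeholders, not part of this repository; any off-the-shelf SOD network that outputs a single-channel saliency map would do.

  # Hypothetical sketch: produce SISMs with any off-the-shelf SOD model.
  # "SODModel", its import path, and the paths below are placeholders.
  import os
  import torch
  import torchvision.transforms as transforms
  from PIL import Image
  from my_sod_package import SODModel  # placeholder for the SOD network you actually use

  device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
  model = SODModel().to(device).eval()  # assumed to output a 1-channel map in [0, 1]

  to_tensor = transforms.Compose([
      transforms.Resize((224, 224)),
      transforms.ToTensor(),
  ])

  img_dir, sism_dir = './data/images', './data/SISMs'
  os.makedirs(sism_dir, exist_ok=True)

  with torch.no_grad():
      for name in os.listdir(img_dir):
          img = Image.open(os.path.join(img_dir, name)).convert('RGB')
          x = to_tensor(img).unsqueeze(0).to(device)
          sism = model(x).squeeze().clamp(0, 1).cpu()  # [H, W], values in [0, 1]
          out = Image.fromarray((sism.numpy() * 255).astype('uint8'))
          out.save(os.path.join(sism_dir, os.path.splitext(name)[0] + '.png'))

Whichever SOD method you pick, run it over both the training and the test images so that the SISMs stay consistent across the two phases.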

In our paper, we choose the pre-trained EGNet (VGG16-based) as the basic SOD method to produce SISMs. You can download these SISMs directly from:

Training

  1. Download pre-trained VGG16 from:

  2. Follow the instructions in "./ICNet/train.py" to modify the training settings (a hypothetical sketch of such settings is shown after these steps).

  3. Run:

python ./ICNet/train.py
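
The exact variable names in "./ICNet/train.py" may differ from the ones below; this block is only a hypothetical sketch of the kind of settings the script asks you to fill in (dataset root, SISM folder, VGG16 weights, output directory, and a few hyper-parameters).

  # Hypothetical sketch of the settings edited in "./ICNet/train.py";
  # the actual variable names and values in the script may differ.
  train_img_root  = './data/COCO9213/images/'   # training images (COCO subset, 9213 images)
  train_gt_root   = './data/COCO9213/gts/'      # ground-truth masks
  train_sism_root = './data/COCO9213/SISMs/'    # SISMs from the chosen SOD method (e.g. EGNet)
  vgg16_path      = './ICNet/vgg16.pth'         # pre-trained VGG16 weights downloaded in step 1
  ckpt_dir        = './ICNet/checkpoints/'      # "Weights_i.pth" is saved here after the i-th epoch

  batch_size    = 10      # assumed value, adjust to your GPU memory
  learning_rate = 1e-5    # assumed value
  num_epochs    = 50      # assumed value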

Test

    • Test pre-trained ICNet:

      Download pre-trained ICNet "ICNet_vgg16.pth" (the download link is given above).

    • Test ICNet trained by yourself:

      Choose the checkpoint file "Weights_i.pth" (saved automatically after the i-th epoch) that you want to load for testing.

  1. Follow the instructions in "./ICNet/test.py" to modify the test settings (see the checkpoint-loading sketch after these steps).

  2. Run:

python ./ICNet/test.py
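
If you want to inspect a checkpoint outside of "./ICNet/test.py", the sketch below loads either the released "ICNet_vgg16.pth" or one of your own "Weights_i.pth" files with standard PyTorch calls; the import path "ICNet.network" and the class name "ICNet" are assumptions, so adapt them to the actual module layout.

  # Hypothetical sketch: load a trained checkpoint with standard PyTorch calls.
  # The import path "ICNet.network" and class name "ICNet" are assumptions.
  import torch
  from ICNet.network import ICNet  # assumed import path

  device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
  model = ICNet().to(device)

  # Either the released "ICNet_vgg16.pth" or one of your own "Weights_i.pth" files.
  state = torch.load('./ICNet/ICNet_vgg16.pth', map_location=device)
  model.load_state_dict(state)
  model.eval()  # inference mode: disables dropout and fixes batch-norm statistics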

Evaluation

The folder "./ICNet/evaluator/" contains evaluation codes implemented in PyTorch (GPU-version), the metrics include max F-measure, S-measure, and MAE.

  1. Follow instructions in "./ICNet/evaluate.py" to modify evaluation settings.

  2. Run:

python ./ICNet/evaluate.py
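
For reference, the block below is a simplified, self-contained sketch of how MAE and max F-measure can be computed on GPU tensors; it is not the code in "./ICNet/evaluator/", and it omits S-measure, whose region- and object-aware terms are considerably longer. Evaluators usually average precision and recall over all images at each threshold before taking the maximum; this sketch pools pixels for brevity.

  # Simplified sketch of MAE and max F-measure on GPU tensors;
  # NOT the repository's "./ICNet/evaluator/" code.
  import torch

  def mae(pred, gt):
      # pred, gt: [N, H, W] tensors with values in [0, 1]
      return torch.abs(pred - gt).mean().item()

  def max_f_measure(pred, gt, beta2=0.3, num_thresholds=256):
      # Sweep binarization thresholds and keep the best F-score.
      gt = (gt > 0.5).float()
      best_f = torch.zeros((), device=pred.device)
      for t in torch.linspace(0, 1, num_thresholds, device=pred.device):
          bin_pred = (pred >= t).float()
          tp = (bin_pred * gt).sum()
          precision = tp / (bin_pred.sum() + 1e-8)
          recall = tp / (gt.sum() + 1e-8)
          f = (1 + beta2) * precision * recall / (beta2 * precision + recall + 1e-8)
          best_f = torch.maximum(best_f, f)
      return best_f.item()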

Compared Methods

We compare our ICNet with 7 state-of-the-art Co-SOD methods:

  • CBCS ''Cluster-Based Co-Saliency Detection, TIP(2013)''

  • CSHS ''Co-Saliency Detection Based on Hierarchical Segmentation, SPL(2014)''

  • CoDW ''Detection of Co-salient Objects by Looking Deep and Wide, IJCV(2016)''

  • UCSG ''Unsupervised CNN-based Co-Saliency Detection with Graphical Optimization, ECCV(2018)''

  • CSMG ''Co-saliency Detection via Mask-guided Fully Convolutional Networks with Multi-scale Label Smoothing, CVPR(2019)''

  • MGLCN ''A Unified Multiple Graph Learning and Convolutional Network Model for Co-saliency Estimation, ACM MM(2019)''

  • GICD ''Gradient-Induced Co-Saliency Detection, ECCV(2020)''

You can download predictions of these methods from:

Citation

To be updated.

Contact

If you have any questions, feel free to contact me (Wen-Da Jin) at jwd331@126.com; I will reply as soon as possible.
