
FSMINet (GRSL 2022)

Code for the GRSL 2022 paper: Kunye Shen, Xiaofei Zhou, Bin Wan, Ran Shi, and Jiyong Zhang, "Fully Squeezed Multi-Scale Inference Network for Fast and Accurate Saliency Detection in Optical Remote Sensing Images," IEEE Geoscience and Remote Sensing Letters, 2022.

Required libraries

Python 3.7
numpy 1.18.1
scikit-image 0.17.2
PyTorch 1.4.0
torchvision 0.5.0
glob (part of the Python standard library)
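
The listed versions are the ones we tested. A quick sanity check that the required libraries are installed (illustrative only):

    # Environment sanity check: print the installed versions of the required libraries.
    import numpy, skimage, torch, torchvision

    print("numpy:", numpy.__version__)              # tested with 1.18.1
    print("scikit-image:", skimage.__version__)     # tested with 0.17.2
    print("torch:", torch.__version__)              # tested with 1.4.0
    print("torchvision:", torchvision.__version__)  # tested with 0.5.0
    print("CUDA available:", torch.cuda.is_available())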

The SSIM loss is adapted from pytorch-ssim.
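As a rough illustration of how an SSIM term is commonly paired with a pixel-wise BCE loss in saliency detection, a minimal sketch is given below. It assumes the pytorch_ssim package; the exact loss configuration used by FSMINet is defined in the paper and the training code.

    # Minimal sketch: BCE + (1 - SSIM) hybrid loss, assuming the pytorch_ssim package.
    # Illustrative only; see the paper/code for the actual loss used by FSMINet.
    import torch
    import torch.nn as nn
    import pytorch_ssim  # https://github.com/Po-Hsun-Su/pytorch-ssim

    bce_loss = nn.BCELoss()
    ssim_loss = pytorch_ssim.SSIM(window_size=11)

    def hybrid_loss(pred, target):
        # pred and target are (N, 1, H, W) tensors with values in [0, 1]
        return bce_loss(pred, target) + (1.0 - ssim_loss(pred, target))

    # Example with random tensors:
    pred = torch.rand(2, 1, 224, 224)
    target = (torch.rand(2, 1, 224, 224) > 0.5).float()
    loss = hybrid_loss(pred, target)
    print("hybrid loss:", loss.item())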

Usage

  1. Clone this repo:
     git clone https://github.com/Kunye-Shen/FSMINet.git
  2. We provide the predicted saliency maps (GoogleDrive or Baidu, extraction code: 12so). You can download them directly through the links above, or contact us at the following email; an illustrative evaluation sketch follows this list.
     zxforchid@outlook.com
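
If you use the downloaded saliency maps for your own comparisons, a minimal evaluation sketch (mean absolute error only) is shown below; the file names and directory layout are placeholders, not the actual dataset structure.

    # Minimal MAE evaluation sketch for a downloaded saliency map.
    # Paths are placeholders; point them at a predicted map and the matching ground-truth mask.
    import numpy as np
    from skimage import io

    pred = io.imread("predictions/0001.png", as_gray=True).astype(np.float64)
    gt = io.imread("ground_truth/0001.png", as_gray=True).astype(np.float64)

    # Scale the prediction to [0, 1] and binarize the ground truth.
    if pred.max() > 0:
        pred = pred / pred.max()
    if gt.max() > 0:
        gt = (gt > 0.5 * gt.max()).astype(np.float64)

    mae = np.abs(pred - gt).mean()
    print("MAE:", mae)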

Architecture

FSM Module

(Figure: FSM Module architecture)

FSMINet

(Figure: FSMINet architecture)

Quantitative Comparison

(Figure: quantitative comparison)

Qualitative Comparison

(Figure: qualitative comparison)

Citation

@article{shen2022fully,
  title={Fully Squeezed Multi-Scale Inference Network for Fast and Accurate Saliency Detection in Optical Remote Sensing Images},
  author={Shen, Kunye and Zhou, Xiaofei and Wan, Bin and Shi, Ran and Zhang, Jiyong},
  journal={IEEE Geoscience and Remote Sensing Letters},
  year={2022},
  publisher={IEEE}
}
