- Code for the paper: "Bidomain Modeling Paradigm for Pansharpening", ACM MM 2023.
- State-of-the-art (SOTA) performance on the PanCollection of remote sensing pansharpening.
Pansharpening is a challenging low-level vision task that fuses a low-resolution multispectral (LRMS) image with a panchromatic (PAN) image to produce a high-resolution multispectral (HRMS) image.
BiMPan employs a bidomain paradigm: a Band-Aware Local Specificity Modeling (BLSM) branch extracts local features, and a Fourier Global Detail Reconstruction (FGDR) branch extracts global features.
The BLSM branch applies adaptive convolution to explore the local uniqueness of each spectral band.
The FGDR branch applies convolution in the Fourier domain, embracing global information while aiding the disentanglement of image degradation.
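The Fourier-domain convolution idea can be sketched as follows: transform the feature map with a 2-D FFT, apply a learnable pointwise convolution to the stacked real and imaginary parts, and transform back. This is a minimal illustration of the mechanism, not the paper's exact FGDR layer; the channel count and 1x1 kernel are assumptions.

```python
import torch
import torch.nn as nn

class FourierConv(nn.Module):
    """Minimal sketch of convolution in the Fourier domain: a 1x1 conv over
    the (real, imag) spectrum of the input, then an inverse FFT. Layer sizes
    are illustrative, not the exact FGDR design from the paper."""
    def __init__(self, channels):
        super().__init__()
        # operate on stacked (real, imag) channels in the frequency domain
        self.freq_conv = nn.Conv2d(2 * channels, 2 * channels, kernel_size=1)

    def forward(self, x):
        b, c, h, w = x.shape
        spec = torch.fft.rfft2(x, norm="ortho")          # complex spectrum
        feat = torch.cat([spec.real, spec.imag], dim=1)  # (b, 2c, h, w//2+1)
        feat = self.freq_conv(feat)                      # pointwise mixing
        real, imag = feat.chunk(2, dim=1)
        spec = torch.complex(real, imag)
        return torch.fft.irfft2(spec, s=(h, w), norm="ortho")

x = torch.randn(1, 8, 64, 64)
y = FourierConv(8)(x)
print(y.shape)  # torch.Size([1, 8, 64, 64])
```

Because every frequency bin attends to the whole image, even a 1x1 convolution in this domain has a global receptive field.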
- Quantitative evaluation results on the WV3 datasets of PanCollection.
- Visual results on the WV3 datasets of PanCollection.
- Datasets for pansharpening: PanCollection. The downloaded data can be placed anywhere, since we do not use relative paths. We recommend the HDF5 (.h5) format; if you use the .mat format instead, the data-loading code needs to be rewritten.
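As a quick sanity check after downloading, you can list the datasets inside an .h5 file with h5py. The sketch below writes a tiny stand-in file first so it runs anywhere; the key names and shapes are illustrative assumptions, so verify them against your actual PanCollection download.

```python
import h5py
import numpy as np

# Write a tiny stand-in file so the snippet is self-contained; with a real
# PanCollection download you would skip this step and open the .h5 directly.
with h5py.File("demo.h5", "w") as f:
    f.create_dataset("pan", data=np.zeros((4, 1, 64, 64), dtype=np.float32))
    f.create_dataset("lms", data=np.zeros((4, 8, 64, 64), dtype=np.float32))

# List every dataset's name and shape to discover the file layout.
with h5py.File("demo.h5", "r") as f:
    shapes = {name: f[name].shape for name in f}
print(shapes)
```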
- Python 3.10 (Recommend to use Anaconda)
- Pytorch 2.0
- NVIDIA GPU + CUDA
- Python packages: pip install numpy scipy h5py torchsummary
- The code is contributed by Junming Hou and Qi Cao.
- The code is heavily based on the repositories of [LAGConv](https://github.com/liangjiandeng/LAGConv) and [Fourmer](https://github.com/manman1995).
- Training and testing codes are in the current folder.
- The code for training is in main.py, while the code for testing is in test.py.
- For training, set the file_path in the main function and adapt it to your training, validation, and test sets. Our code trains on .h5 files; to use another format, modify the data-loading code in the main function.
- For testing, set the paths in both the main and test functions to load the file.
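A typical way to feed the .h5 training files to PyTorch is a small Dataset wrapper, sketched below. The key names ('pan', 'lms', 'gt') and the demo file path are assumptions; match them to your download and to the loading code in main.py.

```python
import h5py
import numpy as np
import torch
from torch.utils.data import Dataset

class H5PansharpenDataset(Dataset):
    """Sketch of a Dataset over a PanCollection-style .h5 file. Key names
    ('pan', 'lms', 'gt') are assumptions -- verify them against your file."""
    def __init__(self, file_path):
        self.file_path = file_path
        self.h5 = None  # opened lazily: h5py handles are not fork-safe
        with h5py.File(file_path, "r") as f:
            self.length = f["pan"].shape[0]

    def __len__(self):
        return self.length

    def __getitem__(self, idx):
        if self.h5 is None:
            self.h5 = h5py.File(self.file_path, "r")
        # load one sample per key and convert to float tensors
        return tuple(torch.from_numpy(self.h5[k][idx]).float()
                     for k in ("pan", "lms", "gt"))

# Self-contained demo with a tiny stand-in file; replace with your real path.
with h5py.File("demo_train.h5", "w") as f:
    f.create_dataset("pan", data=np.zeros((4, 1, 64, 64), np.float32))
    f.create_dataset("lms", data=np.zeros((4, 8, 64, 64), np.float32))
    f.create_dataset("gt", data=np.zeros((4, 8, 64, 64), np.float32))

ds = H5PansharpenDataset("demo_train.h5")
pan, lms, gt = ds[0]
print(len(ds), pan.shape, lms.shape, gt.shape)
```

Opening the file lazily inside `__getitem__` keeps the wrapper safe to use with multi-worker DataLoaders.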
Coming soon.
If you have any questions, please feel free to contact junming_hou@seu.edu.cn or caolucas082@gmail.com.