☀️DECOR☀️

DECOR: Dynamic Decoupling and Multi-Objective Optimization for Long-tailed Remote Sensing Image Classification

Jianlin Xie¹, Guanqun Wang²\*, Yin Zhuang¹, Can Li¹, Tong Zhang¹, He Chen¹, Liang Chen¹, Shanghang Zhang²

¹ Beijing Institute of Technology, ² Peking University

🔥Updates

  • 🗓️May 5th, 2025: The DECOR repository has been further optimized.
  • 🗓️May 7th, 2025: The DECOR code has been further improved.

🎯Overview & Contribution

Our main contributions are:

  • A dynamic decoupling (DD) framework is proposed that improves both representation learning and classifier learning, while ensuring the feature extractor remains compatible with the classifier.
  • A multiobjective optimization framework (MOOF) is proposed to improve representation learning by combining supervised contrastive learning with LFCs and self-supervised contrastive learning. LFCs build a more explicit connection between the feature extractor and the classifier, while self-supervised contrastive learning provides the model with contextual knowledge about the world.
  • A lightweight optimization fine-tuning (LOFT) strategy is employed to obtain maximum performance with minimal intervention.
  • A high-spatial-resolution remote sensing long-tailed dataset containing 50 object classes has been constructed and made publicly available to other researchers. The self-built BIT-AFGR50 is available at https://github.com/wgqqgw/BIT-KTYG-AFGR.
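The supervised contrastive branch of MOOF can be illustrated with a minimal NumPy sketch of the standard supervised contrastive loss (Khosla et al., 2020). This is only an illustration of the general technique: DECOR's actual objective additionally involves LFCs and a self-supervised term, which are omitted here.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.07):
    """Plain supervised contrastive loss over a batch.

    features: (N, D) embeddings (normalized internally)
    labels:   (N,) integer class labels
    """
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = features @ features.T / temperature        # pairwise similarities
    np.fill_diagonal(sim, -np.inf)                   # exclude self-pairs
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    pos_mask = (labels[:, None] == labels[None, :]).astype(float)
    np.fill_diagonal(pos_mask, 0.0)                  # positives exclude self
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0                           # anchors with >=1 positive
    # average negative log-probability over each anchor's positives
    safe_log_prob = np.where(np.isfinite(log_prob), log_prob, 0.0)
    loss = -(pos_mask * safe_log_prob).sum(axis=1)[valid] / pos_counts[valid]
    return loss.mean()
```

Samples sharing a label are pulled together in embedding space; all other batch members act as negatives.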

🧾Getting Started

1. Installation

DECOR is developed with `python==3.8.18`, `torch==1.8.0+cu111`, and `torchvision==0.9.1+cu111`. Check more details in `requirements.txt`.

i. Clone Project

git clone https://github.com/ChloeeGrace/DECOR.git

ii. Install

pip install -r requirements.txt

iii. Download pretrain backbone weight

Download the pre-trained ResNet-50 weights, rename the file to resnet50-pre.pth, and then modify the corresponding paths.

2. Data Preparation

The file `self_con.txt` lists ImageNet training images, one absolute path per line.

For example:

/data/Datasets/Imagenet/train_img/n04548362_10933.JPEG
/data/Datasets/Imagenet/train_img/n02364673_632.JPEG
/data/Datasets/Imagenet/train_img/n02033041_2659.JPEG
/data/Datasets/Imagenet/train_img/n03085013_30335.JPEG
/data/Datasets/Imagenet/train_img/n04532106_1429.JPEG
/data/Datasets/Imagenet/train_img/n02788148_40948.JPEG
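A list in this format can be generated with a short script. The directory paths below are illustrative; point `write_image_list` at your own ImageNet location.

```python
import os

def write_image_list(image_dir, list_path):
    """Write one image path per line, in the format self_con.txt expects."""
    with open(list_path, "w") as f:
        for name in sorted(os.listdir(image_dir)):
            if name.endswith(".JPEG"):
                f.write(os.path.join(image_dir, name) + "\n")

# Example (hypothetical location):
# write_image_list("/data/Datasets/Imagenet/train_img", "self_con.txt")
```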

🏋️‍♂️Training & 🤖Inference

python main_train.py
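Long-tailed training commonly reweights or resamples examples by inverse class frequency so tail classes are not drowned out. The sketch below shows that general idea; it is illustrative only (see `main_train.py` for DECOR's actual sampling and loss configuration).

```python
import numpy as np

def inverse_frequency_weights(labels):
    """Per-sample weights proportional to 1/class_frequency, normalized
    to sum to 1 -- a common long-tailed resampling scheme."""
    classes, counts = np.unique(labels, return_counts=True)
    freq = dict(zip(classes, counts))
    w = np.array([1.0 / freq[y] for y in labels], dtype=float)
    return w / w.sum()
```

Such weights can drive, e.g., a weighted random sampler, so each class contributes roughly equally per epoch.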

🔗Citation

@ARTICLE{10443928,
  author={Xie, Jianlin and Wang, Guanqun and Zhuang, Yin and Li, Can and Zhang, Tong and Chen, He and Chen, Liang and Zhang, Shanghang},
  journal={IEEE Transactions on Geoscience and Remote Sensing}, 
  title={DECOR: Dynamic Decoupling and Multiobjective Optimization for Long-Tailed Remote Sensing Image Classification}, 
  year={2024},
  volume={62},
  number={},
  pages={1-17},
  keywords={Feature extraction;Tail;Remote sensing;Training;Task analysis;Representation learning;Optimization;Decouple learning;long tail;remote sensing scene classification},
  doi={10.1109/TGRS.2024.3369178}}

🔔Notice

Given the community's interest in long-tailed distributions, we will soon release a more detailed and comprehensive version to support your research. This version will further integrate the long-tailed NWPU-RESISC45 and long-tailed AID datasets, with their corresponding parameters packaged into separate sh files, so each configuration runs with one click and no manual parameter changes.

📢Contact

If you have any questions or suggestions, or spot a bug, feel free to get in touch. We would also love to see your contributions: just open a pull request if you'd like to help out. Thanks so much for your support!
