# SAR-JEPA: A Joint-Embedding Predictive Architecture for SAR ATR

This repository contains the code and pre-trained weights for the paper:

**Predicting Gradient is Better: Exploring Self-Supervised Learning for SAR ATR with a Joint-Embedding Predictive Architecture**

Baidu Netdisk: https://pan.baidu.com/s/14sRPSCygTKMelSy4ZkqRzw?pwd=jeq8 (extraction code: jeq8)

## Dataset

| Dataset | Size | #Target | #Scene | Res. (m) | Band | Polarization | Description |
|---|---|---|---|---|---|---|---|
| MSAR | 28,499 | >4 | >6 | 1 | C | Quad | Ground and sea target detection dataset |
| SAR-Ship | 39,729 | >1 | >4 | 3~25 | C | Quad | Ship detection dataset in complex scenes |
| SARSim | 21,168 | 7 | 3 | 0.3 | X | Single | Vehicle simulation dataset |
| SAMPLE | 5,380 | 10 | 1 | 0.3 | X | Single | Vehicle simulation and measured dataset |
| MSTAR | 5,216 | 10 | 1 | 0.3 | X | Single | Fine-grained vehicle classification dataset |
| FUSAR-Ship | 9,830 | 10 | >5 | 1.1~1.7 | C | Double | Fine-grained ship classification dataset |
| SAR-ACD | 2,537 | 6 | 3 | 1 | C | Single | Fine-grained aircraft classification dataset |

## Pre-training

Our code is based on LoMaR, together with MAE and MaskFeat, and its environment follows LoMaR:

- This repo is based on `timm==0.3.2`, for which a fix is needed to work with PyTorch 1.8.1+ (a sketch of the usual fix follows the build commands below).
- The relative position encoding follows iRPE. To enable iRPE with CUDA support, build the extension as follows; of course, iRPE can also run without building it:

```
cd rpe_ops/
python setup.py install --user
```
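
The timm fix mentioned in the first bullet is, to our understanding, the well-known `torch._six` incompatibility: `timm==0.3.2` imports `container_abcs` from `torch._six`, which newer PyTorch versions removed. A minimal sketch of the usual patch to `timm/models/layers/helpers.py` (verify against your installed timm before applying):

```python
import torch

# timm==0.3.2 does `from torch._six import container_abcs`, which was
# removed in newer PyTorch; guard the import on the installed version.
TORCH_MAJOR = int(torch.__version__.split('.')[0])
TORCH_MINOR = int(torch.__version__.split('.')[1])

if TORCH_MAJOR == 1 and TORCH_MINOR < 8:
    from torch._six import container_abcs
else:
    import collections.abc as container_abcs
```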

For pre-training with the default settings, run:

```
CUDA_VISIBLE_DEVICES=0,1,2,3 python -m torch.distributed.launch --nproc_per_node=4 --master_port=25642 main_pretrain.py --data_path ${IMAGENET_DIR}
```

Our main changes are in `model_lomar.py`, where we build multi-scale SAR gradient feature extractors (kernel sizes 5, 9, 13, and 17) as the prediction targets:

```python
# Multi-scale SAR feature extractors; kensize sets the gradient kernel size.
self.sarfeature1 = GF(nbins=self.nbins, pool=self.cell_sz, kensize=5,
                      img_size=self.img_size, patch_size=self.patch_size)
self.sarfeature2 = GF(nbins=self.nbins, pool=self.cell_sz, kensize=9,
                      img_size=self.img_size, patch_size=self.patch_size)
self.sarfeature3 = GF(nbins=self.nbins, pool=self.cell_sz, kensize=13,
                      img_size=self.img_size, patch_size=self.patch_size)
self.sarfeature4 = GF(nbins=self.nbins, pool=self.cell_sz, kensize=17,
                      img_size=self.img_size, patch_size=self.patch_size)
```
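
For intuition, here is a deliberately simplified, hypothetical stand-in for what a `GF`-style target extractor computes: smooth the amplitude image at a scale set by `kensize`, take local gradients, and pool the gradient magnitude down to patch resolution so it can serve as a regression target. The repo's actual `GF` is more elaborate (e.g., it also takes `nbins` and `pool` arguments), so treat this purely as a sketch:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGF(nn.Module):
    """Illustrative stand-in for GF: blur at a kensize-dependent scale,
    take x/y differences, and pool gradient magnitude to patch level."""
    def __init__(self, kensize=5, patch_size=16):
        super().__init__()
        self.blur = nn.AvgPool2d(kensize, stride=1, padding=kensize // 2)
        self.pool = nn.AvgPool2d(patch_size, stride=patch_size)

    def forward(self, x):                       # x: (B, 1, H, W) SAR amplitude
        x = self.blur(x)                        # smoothing scale set by kensize
        gx = F.pad(x[..., :, 1:] - x[..., :, :-1], (0, 1))        # horizontal diff
        gy = F.pad(x[..., 1:, :] - x[..., :-1, :], (0, 0, 0, 1))  # vertical diff
        mag = torch.sqrt(gx ** 2 + gy ** 2 + 1e-8)
        return self.pool(mag).flatten(2)        # (B, 1, num_patches)
```

The four `GF` instances above differ only in `kensize`, so together they give the predictor targets at four spatial scales.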

## Fine-tuning with pre-trained checkpoints

Our few-shot learning is based on Dassl. You may need to install it and use our modified `tools.py` and `transforms.py` for SAR images; then run `MIM_finetune.sh` and `MIM_linear.sh`. A sketch of the kind of SAR-specific preprocessing such modifications address follows below.
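
For illustration only, here is a hypothetical example of the kind of SAR-specific preprocessing such modifications typically handle; it is not the repo's actual `tools.py`/`transforms.py`, and the function name is made up:

```python
import numpy as np
import torch
from PIL import Image

def load_sar_image(path):
    """Hypothetical SAR-friendly loader: read single-channel amplitude,
    log-compress to tame speckle, normalize to [0, 1], and replicate to
    three channels for an ImageNet-style backbone. Illustrative only."""
    img = np.asarray(Image.open(path).convert('L'), dtype=np.float32)
    img = np.log1p(img)                                  # amplitude compression
    img = (img - img.min()) / (img.max() - img.min() + 1e-8)
    return torch.from_numpy(img)[None].repeat(3, 1, 1)   # (3, H, W)
```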

## Contact us

If you have any questions, please contact us at lwj2150508321@sina.com. If our work is helpful to you, please consider citing:

```bibtex
@article{li2023predicting,
  title={Predicting Gradient is Better: Exploring Self-Supervised Learning for {SAR} {ATR} with a Joint-Embedding Predictive Architecture},
  author={Li, Weijie and Wei, Yang and Liu, Tianpeng and Hou, Yuenan and Liu, Yongxiang and Liu, Li},
  journal={arXiv preprint},
  url={https://arxiv.org/abs/2311.15153},
  year={2024}
}
```
