This repository implements the paper "Locality-Aware Hyperspectral Classification".
Train the model:

```shell
python main.py --dataset='Indian' --epoches=300 --patches=7 --band_patches=1 --mode='CAF' --weight_decay=5e-3 --flag='train' --output_dir='./logs/' --batch_size=32 --align='align' --spatial_attn
```

Run the visualization script:

```shell
python visualization.py
```
This code is built upon SpectralFormer and MAEST; thanks to the authors for their great work! If you find it useful for your research, please kindly cite the following papers:
- Zhou et al. (2023). Locality-Aware Hyperspectral Classification.
- Hong et al. (2021). SpectralFormer: Rethinking Hyperspectral Image Classification With Transformers.
- Ibañez et al. (2022). Masked Auto-Encoding Spectral–Spatial Transformer for Hyperspectral Image Classification.