
Adapt or Perish: Adaptive Sparse Transformer with Attentive Feature Refinement for Image Restoration (CVPR 2024)

Shihao Zhou, Duosheng Chen, Jinshan Pan, Jinglei Shi, and Jufeng Yang

News

  • Jul 16, 2025: A Hugging Face demo is now available, thanks to the contribution of congcong
  • Feb 27, 2024: AST has been accepted to CVPR 2024 🎉

Package dependencies

The project is built with PyTorch 1.9.0, Python 3.7, and CUDA 11.1. You can install the package dependencies with:

pip install -r requirements.txt
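
For reference, a minimal setup matching the stated versions might look like the sketch below; the use of conda and the environment name "ast" are assumptions, not requirements of this repository.

# A minimal environment sketch for PyTorch 1.9.0, Python 3.7, and CUDA 11.1.
# The conda environment name "ast" is arbitrary.
conda create -n ast python=3.7 -y
conda activate ast
# CUDA 11.1 builds of torch/torchvision from the official PyTorch wheel index.
pip install torch==1.9.0+cu111 torchvision==0.10.0+cu111 \
    -f https://download.pytorch.org/whl/torch_stable.html
# Remaining dependencies from this repository.
pip install -r requirements.txt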

Training

Derain

To train AST on SPAD, you can run:

sh script/train_derain.sh

Dehaze

To train AST on Dense-Haze, you can run:

sh script/train_dehaze.sh

Raindrop

To train AST on AGAN, you can run:

sh script/train_raindrop.sh
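
Each of these scripts is a thin wrapper around the Python training entry point. The sketch below only illustrates the general shape of such a wrapper; the entry-point name (train.py) and every flag shown are assumptions rather than the repository's actual interface, so inspect the real script before editing dataset paths.

# Hypothetical shape of script/train_derain.sh; the entry point and flag
# names are illustrative assumptions, not the repository's real arguments.
python train.py \
    --train_dir ./datasets/SPAD/train \
    --val_dir   ./datasets/SPAD/val \
    --batch_size 8 \
    --gpu 0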

Evaluation

To evaluate AST, you can run:

sh script/test.sh

To evaluate on a specific dataset, uncomment the corresponding line in script/test.sh.
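
To make "uncomment the corresponding line" concrete, the script presumably contains one evaluation command per dataset, as in the sketch below; the entry-point name (test.py) and its arguments are assumptions, not the script's verified contents.

# Hypothetical layout of script/test.sh: keep exactly one line uncommented.
# The entry point and flag names are assumptions; check the real script.
python test.py --dataset SPAD          # rain streak removal
# python test.py --dataset AGAN        # raindrop removal
# python test.py --dataset DenseHaze   # dehazing on Dense-Haze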

Results

Experiments are performed on several image restoration tasks, including rain streak removal, raindrop removal, and haze removal. The summary table below lists the pretrained model and visual results for each benchmark:

Benchmark     Pretrained model   Visual Results
SPAD          (code:h68m)        (code:wqdg)
AGAN          (code:astt)        (code:astt)
Dense-Haze    (code:astt)        (code:astt)

Citation

If you find this project useful, please consider citing:

@inproceedings{zhou2024AST,
  title={Adapt or Perish: Adaptive Sparse Transformer with Attentive Feature Refinement for Image Restoration},
  author={Zhou, Shihao and Chen, Duosheng and Pan, Jinshan and Shi, Jinglei and Yang, Jufeng},
  booktitle={CVPR},
  year={2024}
}

Acknowledgement

This code borrows heavily from Uformer.
