
Degradation Autoencoder [CVPR2023 Highlight]

DegAE: A New Pretraining Paradigm for Low-level Vision

This paper was accepted by CVPR 2023 as a highlight paper. [paper]

Authors: Yihao Liu, Jingwen He, Jinjin Gu, Xiangtao Kong, Yu Qiao, Chao Dong

News

  • [2023/7/31] ⚡ We have released the code! Please refer to the instructions below.

Method Introduction

For pretraining, the DegAE encoder takes a degraded input image and outputs an image representation. The degraded input is synthesized online through a series of degradation operations. The decoder receives a reference degradation embedding, obtained from a degradation representor $\phi$, and attempts to transfer the reference degradation onto the corrupted input image. During finetuning, the decoder is replaced by a single convolution layer, and the whole network is finetuned on downstream tasks such as image dehazing, deraining, and motion deblurring.
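
The data flow above can be summarized with a minimal PyTorch sketch. The module and argument names (DegradationRepresentor, DegAEPretrain, encoder, decoder, embed_dim) are illustrative assumptions for this README and do not mirror the classes in codes/; the real backbones are SwinIR and Restormer.

```python
# Minimal sketch of the DegAE pretraining flow described above.
# Module names and shapes are illustrative assumptions, not the repository's API.
import torch.nn as nn


class DegradationRepresentor(nn.Module):
    """phi: maps a reference degraded image to a global degradation embedding."""

    def __init__(self, embed_dim=256):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, embed_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # one embedding vector per image
        )

    def forward(self, reference):
        return self.features(reference).flatten(1)  # (B, embed_dim)


class DegAEPretrain(nn.Module):
    """Pretext task: reconstruct the input content with the reference degradation."""

    def __init__(self, encoder, decoder, embed_dim=256):
        super().__init__()
        self.encoder = encoder          # e.g. a SwinIR or Restormer backbone
        self.decoder = decoder          # conditioned on the degradation embedding
        self.representor = DegradationRepresentor(embed_dim)

    def forward(self, degraded_input, reference):
        feat = self.encoder(degraded_input)       # image representation
        deg_embed = self.representor(reference)   # reference degradation embedding
        return self.decoder(feat, deg_embed)      # input content + reference degradation
```

For finetuning, the conditioned decoder in this sketch would be swapped for a single convolution layer on top of the encoder features, as described above, and the whole network trained end-to-end on the downstream task.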

Example results of DegAE pretraining: given a noisy input image and a reference blurred image, DegAE attempts to transfer the blur degradation to the input image.

Preparation

Dependencies

  • Python >= 3.6
  • PyTorch == 1.13.1+cu116 (tested; other versions may also work)
  • Ubuntu 18.04.1 LTS (tested)
  • NVIDIA GPU + CUDA (tested with CUDA 11.6)

Pretrained Models

Download the pretrained models and put them in the experiments/ folder.

| Phase | Task | Backbone | Pretrained model |
| --- | --- | --- | --- |
| Pretrain | Degradation Transfer (Pretext Task) | SwinIR | [Baidu Disk] (token: iugr) / [Google Drive] |
| Pretrain | Degradation Transfer (Pretext Task) | Restormer | [Baidu Disk] (token: pcpy) / [Google Drive] |
| Downstream Finetune | Dehaze (ITS), Complex Derain (Rain13K), Motion Deblur (GoPro) | SwinIR | [Baidu Disk] (token: bk4a) / [Google Drive] |
| Downstream Finetune | Dehaze (ITS), Complex Derain (Rain13K), Motion Deblur (GoPro) | Restormer | [Baidu Disk] (token: 7bnf) / [Google Drive] |
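
As a quick sanity check after downloading, a checkpoint can be opened with torch.load to confirm it is intact. The file name below is a placeholder; substitute the actual name of the model you placed in experiments/.

```python
# Sanity-check a downloaded checkpoint (file name is a placeholder).
import torch

state = torch.load("experiments/DegAE_SwinIR_pretrain.pth", map_location="cpu")
# Checkpoints are typically either a raw state_dict or a dict wrapping one.
state_dict = state.get("state_dict", state) if isinstance(state, dict) else state
print(f"Loaded {len(state_dict)} entries")
print(list(state_dict.keys())[:5])  # peek at the first few parameter names
```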

Quick Inference

Pretrain Task: Degradation Autoencoder

Note: All the settings can be adjusted and specified in the corresponding yml file.

  1. Test the pretext task with the SwinIR backbone:
     cd codes
     python test_DegAE_Pretrain.py -opt options/test/test_DegAE_Pretrain_SwinIR.yml
  2. Test the pretext task with the Restormer backbone:
     cd codes
     python test_DegAE_Pretrain.py -opt options/test/test_DegAE_Pretrain_Restormer.yml
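
Since all settings live in the yml file (see the note above), one way to script small tweaks before running a test is to load and rewrite the file with PyYAML. The key name used in the commented example is an assumption, not a documented field of this repository's config schema; check the actual yml file for the correct keys.

```python
# Illustrative helper for tweaking a test .yml before running it.
# The example key ("name") is an assumption; consult the actual yml for the
# keys this repository uses.
import yaml  # pip install pyyaml


def override_option(yml_path, out_path, **overrides):
    with open(yml_path, "r") as f:
        opt = yaml.safe_load(f)
    opt.update(overrides)  # shallow override of top-level keys only
    with open(out_path, "w") as f:
        yaml.safe_dump(opt, f, sort_keys=False)
    return opt


# Example (hypothetical key name):
# override_option("options/test/test_DegAE_Pretrain_SwinIR.yml",
#                 "options/test/my_test.yml",
#                 name="my_pretrain_test")
```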

Downstream Tasks

Dehaze

  1. Test the pretrained dehaze model with the SwinIR backbone:
     cd codes
     python test_DegAE_Finetune.py -opt options/test/test_DegAE_Finetune_Dehaze_SwinIR.yml
  2. Test the pretrained dehaze model with the Restormer backbone:
     cd codes
     python test_DegAE_Finetune.py -opt options/test/test_DegAE_Finetune_Dehaze_Restormer.yml

Complex Derain

  1. Test the pretrained derain model with the SwinIR backbone:
     cd codes
     python test_DegAE_Finetune.py -opt options/test/test_DegAE_Finetune_Derain_SwinIR.yml
  2. Test the pretrained derain model with the Restormer backbone:
     cd codes
     python test_DegAE_Finetune.py -opt options/test/test_DegAE_Finetune_Derain_Restormer.yml

Motion Deblur

  1. Test the pretrained deblur model with the SwinIR backbone:
     cd codes
     python test_DegAE_Finetune.py -opt options/test/test_DegAE_Finetune_Deblur_SwinIR.yml
  2. Test the pretrained deblur model with the Restormer backbone:
     cd codes
     python test_DegAE_Finetune.py -opt options/test/test_DegAE_Finetune_Deblur_Restormer.yml
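
To run every downstream test in sequence, a short driver script can loop over the config files listed above with subprocess. This is a convenience sketch (run from the codes/ directory), not a script shipped with the repository.

```python
# Run all downstream test configs listed above in one go (from codes/).
# Convenience sketch only; config paths are copied from the commands above.
import subprocess

CONFIGS = [
    "options/test/test_DegAE_Finetune_Dehaze_SwinIR.yml",
    "options/test/test_DegAE_Finetune_Dehaze_Restormer.yml",
    "options/test/test_DegAE_Finetune_Derain_SwinIR.yml",
    "options/test/test_DegAE_Finetune_Derain_Restormer.yml",
    "options/test/test_DegAE_Finetune_Deblur_SwinIR.yml",
    "options/test/test_DegAE_Finetune_Deblur_Restormer.yml",
]

for cfg in CONFIGS:
    print(f"==> Testing with {cfg}")
    subprocess.run(["python", "test_DegAE_Finetune.py", "-opt", cfg], check=True)
```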

Citation

If you find our work useful, please cite it.

@InProceedings{Liu_2023_DegAE,
    author    = {Liu, Yihao and He, Jingwen and Gu, Jinjin and Kong, Xiangtao and Qiao, Yu and Dong, Chao},
    title     = {DegAE: A New Pretraining Paradigm for Low-Level Vision},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {23292-23303}
}
