Yuetong Liu, Yunqiu Xu, Yang Wei, Xiuli Bi, Bin Xiao
Restoring nighttime images affected by multiple adverse weather conditions is a practical yet under-explored research problem, as multiple weather degradations usually coexist in the real world alongside various lighting effects at night. This paper first explores the challenging multi-weather nighttime image restoration task, where various types of weather degradations are intertwined with flare effects. To support the research, we contribute the AllWeatherNight dataset, featuring large-scale nighttime images with diverse compositional degradations. By employing illumination-aware degradation generation, our dataset significantly enhances the realism of synthetic degradations in nighttime scenes, providing a more reliable benchmark for model training and evaluation. Additionally, we propose ClearNight, a unified nighttime image restoration framework, which effectively removes complex degradations in one go. Specifically, ClearNight extracts Retinex-based dual priors and explicitly guides the network to focus on uneven illumination regions and intrinsic texture contents respectively, thereby enhancing restoration effectiveness in nighttime scenarios. Moreover, to more effectively model the common and unique characteristics of multiple weather degradations, ClearNight performs weather-aware dynamic specificity and commonality collaboration that adaptively allocates optimal sub-networks associated with specific weather types. Comprehensive experiments on both synthetic and real-world images demonstrate the necessity of the AllWeatherNight dataset and the superior performance of ClearNight.
Download our AllWeatherNight dataset from Hugging Face. Please organize the dataset directory as follows:
data/
├── train/
│ ├── snow/ # Snow-related degraded input images (includes multiple weather)
│ ├── snow_gt/ # Snow-related ground truth (clean) images
│ ├── rain/ # Rain-related degraded input images (includes multiple weather)
│ ├── rain_gt/ # Rain-related ground truth (clean) images
│ ├── drop/ # Raindrop-related degraded input images (includes multiple weather)
│ └── drop_gt/ # Raindrop-related ground truth (clean) images
└── test/
├── snow/ # Snow-related test input images (snow_train_test)
├── snow_gt/ # Snow-related test ground truth images
├── rain/ # Rain-related test input images (rain_train_test)
├── rain_gt/ # Rain-related test ground truth images
├── drop/ # Raindrop-related test input images (drop_train_test)
└── drop_gt/ # Raindrop-related test ground truth images
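Given the layout above, each degraded input image is paired with a clean ground-truth image of the same filename in the sibling `*_gt/` folder. The following is a minimal sketch of that pairing convention; the helper name `gt_path` is hypothetical and not part of the released code:

```python
from pathlib import Path

# Hypothetical helper (illustration only): given a degraded input image
# under data/{split}/{weather}/, derive the path of its paired clean
# ground-truth image under the matching {weather}_gt/ folder.
def gt_path(input_path: str) -> Path:
    p = Path(input_path)
    weather_dir = p.parent                               # e.g. data/train/snow
    gt_dir = weather_dir.with_name(weather_dir.name + "_gt")
    return gt_dir / p.name                               # same filename, *_gt folder

# Example: a snowy training image and its clean counterpart.
print(gt_path("data/train/snow/0001.png"))  # data/train/snow_gt/0001.png
```

The same mapping applies to the test split (e.g. `data/test/rain/` pairs with `data/test/rain_gt/`).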
To support the training process (perceptual loss and depth guidance), download the following weights and place them in the ./loss/ folder:
- vgg16-397923af.pth
- encoder.pth (from ADDS-DepthNet, ICCV 2021)
- depth.pth (from ADDS-DepthNet, ICCV 2021)
To start training the ClearNight framework with Retinex decomposition:
python training_ClearNight.py --Retinex_decomp True

To evaluate the model performance on the test set:
python testing_ClearNight.py --Retinex_decomp True

If you find our work helpful for your research, please cite:
@inproceedings{aaai2026clearnight,
title={Clear Nights Ahead: Towards Multi-Weather Nighttime Image Restoration},
author={Liu, Yuetong and Xu, Yunqiu and Wei, Yang and Bi, Xiuli and Xiao, Bin},
booktitle={AAAI},
year={2026}
}

If you have any questions, please contact Yuetong Liu at d230201022@stu.cqupt.edu.cn.
