
[ECCV 2024] CoDA: Instructive Chain-of-Domain Adaptation with Severity-Aware Visual Prompt Tuning


Sun Yat-sen University · CPNT Lab · EPFL

🌟🌟🌟 Home

This is the official project of 🎻CoDA. We release the training code and the dataset we generated for our paper.

CoDA is a UDA (unsupervised domain adaptation) method that helps models understand all adverse scenes (☁️, ☔, ❄️, 🌙) by highlighting the discrepancies between and within these scenes. CoDA achieves state-of-the-art performance on widely used benchmarks.

🔥🔥🔥 News

[2024-7-10] We have released our generated data samples. You can download them from the links below.

[Baidu Netdisk]    [Google Drive]

[2024-7-2] We are delighted to announce that CoDA has been accepted to the ECCV 2024 main conference 🎉🎉🎉!!!

[2024-3-8] We created the official project of CoDA and released the inference code.

Overview


| Experiments | mIoU | Checkpoint |
| --- | --- | --- |
| Cityscapes $\rightarrow$ ACDC | 72.6 | - |
| Cityscapes $\rightarrow$ Foggy Zurich | 60.9 | - |
| Cityscapes $\rightarrow$ Foggy Driving | 61.0 | - |
| Cityscapes $\rightarrow$ Dark Zurich | 61.2 | - |
| Cityscapes $\rightarrow$ Nighttime Driving | 59.2 | - |
| Cityscapes $\rightarrow$ BDD100K-Night | 41.6 | - |

If you find this project useful in your research, please consider citing:

```bibtex
@article{gong2024coda,
  title={CoDA: Instructive Chain-of-Domain Adaptation with Severity-Aware Visual Prompt Tuning},
  author={Gong, Ziyang and Li, Fuhao and Deng, Yupeng and Bhattacharjee, Deblina and Zhu, Xiangwei and Ji, Zhenming},
  journal={arXiv preprint arXiv:2403.17369},
  year={2024}
}
```

Download Checkpoint

```bash
cd CoDA
python ./tools/download_ck.py
```

Alternatively, you can manually download the checkpoints from Google Drive.

Environment

```bash
conda create -n coda python=3.8.5 pip=22.3.1
conda activate coda
pip install -r requirements.txt -f https://download.pytorch.org/whl/torch_stable.html
pip install mmcv-full==1.3.7 -f https://download.openmmlab.com/mmcv/dist/cu110/torch1.7/index.html
```
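
As an optional sanity check (not part of the repository's own instructions), the short Python snippet below verifies that PyTorch and mmcv-full import correctly and that a CUDA device is visible; the expected mmcv version 1.3.7 comes from the install command above.

```python
# Optional environment sanity check: confirm torch and mmcv-full import
# and that a CUDA device is visible before running the demo or training.
import torch
import mmcv

print("torch:", torch.__version__)
print("mmcv:", mmcv.__version__)              # expected 1.3.7 per the install command
print("CUDA available:", torch.cuda.is_available())
```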

Before running the demo, configure PYTHONPATH first, or you may encounter an error such as "cannot find tools ...".

```bash
cd CoDA
export PYTHONPATH=.:$PYTHONPATH
```

Or modify your `.bashrc` file directly:

```bash
vi ~/.bashrc
export PYTHONPATH=/path/to/CoDA:$PYTHONPATH
source ~/.bashrc
```

Demo

```bash
python ./tools/image_demo.py --img ./images/night_demo.png --config ./configs/coda/csHR2acdcHR_coda.py --checkpoint ./pretrained/CoDA_cs2acdc.pth
```
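
If `tools/image_demo.py` follows the standard mmsegmentation 0.x demo API (an assumption; CoDA's fork may differ, so check the script itself), a programmatic equivalent of the command above might look like this minimal sketch. The output path is hypothetical.

```python
# Minimal sketch, assuming CoDA exposes the standard mmsegmentation 0.x API
# (init_segmentor / inference_segmentor); verify against tools/image_demo.py.
from mmseg.apis import inference_segmentor, init_segmentor

config = './configs/coda/csHR2acdcHR_coda.py'
checkpoint = './pretrained/CoDA_cs2acdc.pth'

model = init_segmentor(config, checkpoint, device='cuda:0')     # build model and load weights
result = inference_segmentor(model, './images/night_demo.png')  # per-pixel label map(s)
model.show_result('./images/night_demo.png', result,
                  out_file='./workdir/night_demo_pred.png',     # hypothetical output path
                  opacity=0.5)
```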

Inference Steps

```bash
python ./tools/image_demo.py --img_dir ./acdc_dir --config ./configs/coda/csHR2acdcHR_coda.py --checkpoint ./pretrained/CoDA_cs2acdc.pth --out_dir ./workdir/cs2acdc
```
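
Under the same mmsegmentation-API assumption as above, directory-level inference can also be scripted by looping over images; the layout of `./acdc_dir` and the output naming below are illustrative only.

```python
# Illustrative batch-inference sketch under the same mmsegmentation-API assumption;
# the ./acdc_dir layout and output file naming are hypothetical.
import glob
import os

from mmseg.apis import inference_segmentor, init_segmentor

config = './configs/coda/csHR2acdcHR_coda.py'
checkpoint = './pretrained/CoDA_cs2acdc.pth'
out_dir = './workdir/cs2acdc'
os.makedirs(out_dir, exist_ok=True)

model = init_segmentor(config, checkpoint, device='cuda:0')
for img_path in sorted(glob.glob('./acdc_dir/**/*.png', recursive=True)):
    result = inference_segmentor(model, img_path)
    model.show_result(img_path, result,
                      out_file=os.path.join(out_dir, os.path.basename(img_path)),
                      opacity=0.5)
```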

Training Steps

```bash
python ./tools/train.py --config ./configs/coda/csHR2acdcHR_coda.py --work-dir ./workdir/cs2acdc
```
