
The official implementation of our work CoSDA: Continual Source-Free Domain Adaptation.


Implementation for CoSDA: Continual Source-Free Domain Adaptation


README | Chinese documentation (中文文档)

This repository contains the code for our work CoSDA: Continual Source-Free Domain Adaptation. To ensure a fair comparison, we build a unified codebase for source-free domain adaptation and continual DA methods, as listed in Supported Methods.

Introduction

Continual source-free domain adaptation is a new and practical task in domain adaptation. It seeks to preserve a model's performance across all domains encountered during the adaptation process while also protecting the privacy of private data, as illustrated in the following figure:

setting of continual SFDA

CoSDA is a continual source-free domain adaptation approach that employs a dual-speed optimized teacher-student model pair equipped with consistency learning, as shown in the following figure. The implementation details of CoSDA are given in [train/cosda/cosda.py].

pipeline of CoSDA
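The "dual-speed" idea can be summarized as a fast student updated by gradient descent and a slow teacher that trails it as an exponential moving average (EMA). The snippet below is a minimal sketch of that EMA step only; the function name and momentum value are illustrative assumptions, not code from this repository.

```python
# Minimal sketch of a dual-speed (mean-teacher style) update: the student
# moves quickly via gradient steps, while the teacher trails it slowly as
# an exponential moving average of the student's weights.
# Names and the momentum value are illustrative, not from the CoSDA source.
def ema_update(teacher_weights, student_weights, momentum=0.99):
    """Slow teacher update: teacher <- m * teacher + (1 - m) * student."""
    return [momentum * t + (1.0 - momentum) * s
            for t, s in zip(teacher_weights, student_weights)]

teacher = [0.0, 0.0]
student = [1.0, 2.0]
teacher = ema_update(teacher, student, momentum=0.9)
# the teacher has moved only 10% of the way toward the student
```

A high momentum keeps the teacher stable across adaptation steps, which is what lets it serve as a slowly-drifting reference for consistency learning.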

Installation and Usage

Installation of Datasets

First, download the datasets from the following links:

Next, select a base_path, create a dataset folder within it, and place the downloaded files in this folder as shown below:

base_path
├── dataset
    ├── DomainNet
    │   ├── splits
    │   ├── clipart
    │   ├── infograph
    │   ├── painting
    │   ├── quickdraw
    │   ├── real
    │   ├── sketch
    ├── Office31
    │   ├── image_list
    │   ├── amazon
    │   ├── dslr
    │   └── webcam
    ├── OfficeHome
    │   ├── image_list
    │   ├── Art
    │   ├── Clipart
    │   ├── Product
    │   ├── Real_World
    └── Visda2017
        ├── image_list
        ├── train
        └── validation
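The skeleton above can be scaffolded in one pass before unpacking the archives; in this sketch the literal `base_path` is a placeholder for the location you selected:

```shell
# Create the expected directory skeleton; "base_path" is a placeholder,
# substitute your chosen location. The downloaded archives are then
# unpacked into the matching dataset folders.
mkdir -p base_path/dataset/DomainNet/splits
mkdir -p base_path/dataset/Office31/image_list
mkdir -p base_path/dataset/OfficeHome/image_list
mkdir -p base_path/dataset/Visda2017/image_list
```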

Source-free Domain Adaptation

  • Dependencies and environment setup.

    pip install -r requirements.txt
    
  • Pretrain.

    python pretrain.py -bp [base_path] --config [config_file] 
    

    The base_path is the location selected for the dataset installation. The config_file is stored in the [pretrain/config/backup] directory. We provide dedicated pretraining configurations for GSFDA and SHOT++; all other methods use the same pretraining configuration as SHOT.

    Once the pretraining process is complete, the model will be saved within the base_path directory. Below is an example of the resulting file structure for DomainNet:

    base_path
    ├── DomainNet
        ├── pretrain_parameters_shot
        │   ├── source_{}_backbone_{}.pth.tar
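
The placeholders in the saved filename correspond to the source domain and the backbone. A small sketch of how such a path could be assembled; the helper name and the example values (`real`, `resnet50`) are illustrative assumptions, not names fixed by the repository:

```python
# Sketch of resolving the checkpoint path template shown above.
# The domain ("real"), backbone ("resnet50"), and this helper function
# are illustrative assumptions, not repository code.
import os

def checkpoint_path(base_path, dataset, method, domain, backbone):
    filename = "source_{}_backbone_{}.pth.tar".format(domain, backbone)
    return os.path.join(base_path, dataset,
                        "pretrain_parameters_{}".format(method), filename)

path = checkpoint_path("/data", "DomainNet", "shot", "real", "resnet50")
# → "/data/DomainNet/pretrain_parameters_shot/source_real_backbone_resnet50.pth.tar"
```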
    
  • Single-target adaptation.

    python single_tar.py -bp [base_path] --config [config_file] --writer [tensorboard / wandb]

    We have created separate configuration files for each method, which can be found in the [adaptationcfg/backup] directory. The source_domain and target_domain can be manually specified under the DAConfig key in the configuration file.

    We provide support for two methods of recording the training process: tensorboard and wandb. To use tensorboard, you need to specify the log_path to store the event files locally using -lp [log_path]. To use wandb, you need to specify the entity using -e [entity].

  • Multi-target adaptation.

    python multi_tar.py --config [config_file] -bp [base_path] --writer [tensorboard / wandb] (-lp [log_path] or -e [entity])

    The settings for multi-target domain adaptation are the same as those for single-target adaptation. For DomainNet, the sequential adaptation order is Real → Infograph → Clipart → Painting → Sketch → Quickdraw. For OfficeHome, the sequential adaptation order is Art → Clipart → Product → Real-world.
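The continual protocol described above (adapt to each target in sequence, then check performance on every domain visited so far) can be sketched as a loop; here `adapt` and `evaluate` are hypothetical stand-ins for the adaptation and evaluation stages, not functions from this repository:

```python
# Hedged sketch of sequential multi-target adaptation: after each stage we
# evaluate on all previously visited domains to measure forgetting.
# `adapt` and `evaluate` are placeholder callables, not repository code.
DOMAINNET_ORDER = ["Real", "Infograph", "Clipart", "Painting",
                   "Sketch", "Quickdraw"]

def continual_protocol(order, adapt, evaluate):
    seen, results = [], {}
    for target in order:
        adapt(target)          # one single-target adaptation stage
        seen.append(target)
        results[target] = {d: evaluate(d) for d in seen}
    return results

results = continual_protocol(DOMAINNET_ORDER,
                             adapt=lambda d: None,
                             evaluate=lambda d: 0.0)
# after the final stage, all six domains have been evaluated
```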

Supported Methods

Apart from CoSDA, we also support the following methods.

Citation

If you find our paper helpful, consider citing us via:

@article{feng2023cosda,
  title={CoSDA: Continual Source-Free Domain Adaptation},
  author={Feng, Haozhe and Yang, Zhaorui and Chen, Hesun and Pang, Tianyu and Du, Chao and Zhu, Minfeng and Chen, Wei and Yan, Shuicheng},
  journal={arXiv preprint arXiv:2304.06627},
  year={2023}
}
