szubing/uniood

Introduction

This repository contains the code for the paper Universal Domain Adaptation from Foundation Models: A Baseline Study. It also serves as a framework for implementing Universal Domain Adaptation (UniDA) methods: new methods, backbones, and datasets can be added easily, as each is built in its own directory (methods, models, datasets). Follow the instructions in METHOD.md, MODEL.md, and DATASET.md to build each custom module as needed.

Available algorithms

The currently available methods are:

  • Source Only (SO)

  • Universal domain adaptation through self supervision (DANCE, Saito et al., 2020)

  • Ovanet: One-vs-all network for universal domain adaptation (OVANet, Saito et al., 2021)

  • Unified optimal transport framework for universal domain adaptation (UniOT, Chang et al., 2022)

  • Learning transferable visual models from natural language supervision (CLIP zero-shot, Radford et al., 2021)

  • Robust fine-tuning of zero-shot models (WiSE-FT, Wortsman et al., 2022)

  • Multimodality helps unimodality: Cross-modal few-shot learning with multimodal models (CLIP cross-model, Lin et al., 2023)

  • Universal Domain Adaptation from Foundation Models: A Baseline Study (CLIP distillation, Bin Deng and Kui Jia, 2023)

Available datasets

The currently available datasets are:

Reproducing paper results

To reproduce the results of the paper,

(1) first, prepare the datasets following the instructions in DATASET.md;

(2) then, set your default paths in configs/default.py, and run scripts/feature.sh first to extract features (this speeds up the process) before running the other scripts;

(3) finally, report the results by running the print_*.py scripts; figures and LaTeX tables are saved in the corresponding directories.
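The three steps above can be sketched as a shell session. `scripts/feature.sh` and `configs/default.py` are named in this README; the method script and the report script below are placeholders, since the exact filenames depend on which method and which `print_*.py` script you want to run.

```shell
# (1) Datasets prepared per DATASET.md; default paths set in configs/default.py.

# (2) Extract and cache backbone features once, up front, so later runs are fast,
#     then launch the method scripts.
bash scripts/feature.sh
bash scripts/train.sh        # placeholder: substitute the script for your chosen method

# (3) Report the results; figures and LaTeX tables are written to their directories.
python print_results.py      # placeholder: substitute the desired print_*.py script
```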

Acknowledgements

We thank CLIP cross-model for providing the CLIP text templates, and OSR for providing the OSCR evaluation code.

Citation

If this repository helps you, please consider citing the following BibTeX entry:

@misc{deng2023universal,
      title={Universal Domain Adaptation from Foundation Models: A Baseline Study}, 
      author={Bin Deng and Kui Jia},
      year={2023},
      eprint={2305.11092},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}
