- Python 3.6+ (recommended; we have not tested the code with previous versions)
- PyTorch 1.6+ (for mixed precision training)
- `imgaug` and `imagecorruptions` libraries (refer to their installation instructions)
- For simplicity, we provide an `environment.yml` file extracted from our conda environment. Install and activate it using:

```bash
conda env create -f environment.yml
conda activate daseg
```
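As a quick sanity check (this step is not part of the original instructions, and the import names are assumed from the packages' PyPI defaults), you can verify that the activated environment provides GPU-enabled PyTorch along with `imgaug` and `imagecorruptions`:

```bash
# Sanity check (assumed import names): run inside the activated `daseg` environment
python -c "import torch; print('torch', torch.__version__, 'CUDA available:', torch.cuda.is_available())"
python -c "import imgaug, imagecorruptions; print('imgaug', imgaug.__version__, '- imagecorruptions OK')"
```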
- Download the Cityscapes dataset
- Download the GTA5 dataset
- Download the SYNTHIA dataset (RAND-CITYSCAPES version)
- Download the SYNTHIA processed labels
- Download the Synscapes dataset
- Create dataset symlinks for GTA5, SYNTHIA, Synscapes, and Cityscapes inside the `datasets` folder (a quick verification step is sketched after this list):

```bash
ln -s /path/to/cityscape ./datasets/cityscape
ln -s /path/to/gta5 ./datasets/gta5-dataset
ln -s /path/to/synthia ./datasets/synthia_cityscape
ln -s /path/to/synscapes ./datasets/synscapes
```
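A minimal sketch (not from the original instructions) to confirm that all four symlinks exist and resolve to real directories:

```bash
# Verify that the dataset symlinks resolve; directory names follow the commands above
ls -l ./datasets
for d in cityscape gta5-dataset synthia_cityscape synscapes; do
    test -d "./datasets/$d" && echo "OK: $d" || echo "MISSING: $d"
done
```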
Pretrained weights can be downloaded and copied to the `checkpoints` folder inside either the `vendorside` or the `clientside` (coming soon) folder, as required.
- Weights from this paper's results - Google Drive
- Weights from ICCV21 baseline - Google Drive
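For example, a downloaded checkpoint can be placed into the `vendorside` checkpoints folder as follows (the `~/Downloads/...` source path is hypothetical; `dl_allg_gta5.pth` is the file name used elsewhere in this README):

```bash
# Copy a downloaded weight file into the checkpoints folder used by eval.sh
# (~/Downloads/dl_allg_gta5.pth is a hypothetical download location)
mkdir -p ./vendorside/checkpoints
cp ~/Downloads/dl_allg_gta5.pth ./vendorside/checkpoints/
```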
- Use `bash eval.sh` within the `vendorside` folder to evaluate any saved model weights.
- Set the arguments appropriately in the `eval.sh` file. The important arguments are (an example configuration is sketched after this list):
  - `CUDA_VISIBLE_DEVICES`: GPU ID to be used for evaluation.
  - `model`: specify `deeplab` or `fcn` for the model architecture.
  - `dataset`: specify `cityscapes` for evaluating on target data.
  - `load_model`: path to the model weights to be evaluated, e.g. `'./checkpoints/dl_allg_gta5.pth'`.
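As a hedged illustration only (the actual `eval.sh` may structure its variables differently), the arguments above might be set like this:

```bash
# Hypothetical excerpt of eval.sh settings; check the real script for the exact variable names
export CUDA_VISIBLE_DEVICES=0                # GPU ID to use for evaluation
model=deeplab                                # or: fcn
dataset=cityscapes                           # evaluate on target (Cityscapes) data
load_model='./checkpoints/dl_allg_gta5.pth'  # path to the weights being evaluated
```

After editing `eval.sh`, run `bash eval.sh` from inside the `vendorside` folder as described above.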
- Refer to the specific README files in the `vendorside` and `clientside` folders.
We are thankful to FDA, DADA, BDL and AdaptSegNet for releasing their code.
If you find our work helpful in your research, please cite the following paper:
```bibtex
@InProceedings{pmlr-v162-kundu22a,
  title     = {Balancing Discriminability and Transferability for Source-Free Domain Adaptation},
  author    = {Kundu, Jogendra Nath and Kulkarni, Akshay R and Bhambri, Suvaansh and Mehta, Deepesh and Kulkarni, Shreyas Anand and Jampani, Varun and Radhakrishnan, Venkatesh Babu},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {11710--11728},
  year      = {2022},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  publisher = {PMLR},
}
```