
Official TensorFlow implementation for "Supervised Domain Adaptation: A Graph Embedding Perspective and a Rectified Experimental Protocol" [TIP 2021] and "Supervised Domain Adaptation using Graph Embedding" [ICPR 2020]


Domain Adaptation using Graph Embedding (DAGE)


Official repository for the supervised domain adaptation method Domain Adaptation using Graph Embedding (DAGE).


In addition to our DAGE-LDA method, we provide implementations of fine-tuning with gradual unfreeze, as well as the supervised domain adaptation methods CCSA and d-SNE.
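To give an intuition for the graph-embedding view, the sketch below implements a toy DAGE-LDA-style loss in NumPy: cross-domain sample pairs with the same label are pulled together, while pairs with different labels are pushed apart. This is an illustrative simplification written for this README (the function name and the simple ratio form are our own), not the repository's TensorFlow implementation, which formulates the objective via the Laplacians of an intrinsic and a penalty graph.

```python
import numpy as np

def dage_lda_loss(phi_s, phi_t, y_s, y_t, eps=1e-8):
    """Toy DAGE-LDA-style graph-embedding loss (illustrative sketch only).

    phi_s: (n_s, d) source embeddings;  phi_t: (n_t, d) target embeddings.
    y_s, y_t: integer class labels for each batch.
    Returns the ratio of within-class to between-class cross-domain
    squared distances; minimising it compacts same-class pairs across
    domains while separating different-class pairs.
    """
    phi = np.vstack([phi_s, phi_t])
    y = np.concatenate([y_s, y_t])
    dom = np.concatenate([np.zeros(len(y_s)), np.ones(len(y_t))])

    # Only connect pairs that span the two domains
    cross = dom[:, None] != dom[None, :]
    same_class = (y[:, None] == y[None, :]) & cross   # intrinsic graph edges
    diff_class = (y[:, None] != y[None, :]) & cross   # penalty graph edges

    # Pairwise squared Euclidean distances between all embeddings
    d2 = ((phi[:, None, :] - phi[None, :, :]) ** 2).sum(-1)

    within = (d2 * same_class).sum()
    between = (d2 * diff_class).sum()
    return within / (between + eps)
```

In this toy form, target embeddings that sit near same-class source embeddings yield a loss close to zero, while a label mismatch drives it up.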

Setup

Install dependencies

$ pip install -r requirements.txt

If issues are encountered with scikit-optimize, the exact git tag can be installed via: pip install git+https://github.com/scikit-optimize/scikit-optimize.git@af5450a51599bbfa4846342188948c147ceba14c

Download datasets

$ ./scripts/get_office.sh
$ ./scripts/get_digits.sh
$ ./scripts/get_visda.sh

Running the code

run.py is the entry-point for running the implemented methods. To retrieve a list of valid arguments, use python run.py --help.

A number of ready-to-run scripts are supplied (found in the scripts folder), with which one can test different methods and configurations.

An example that tunes a model on source data and tests on target data:

$ ./scripts/office31_tune_source.sh

DAGE can be run on Office31 with tuned hyper-parameters and the revised training splits using:

$ ./scripts/office31_dage_lda_tuned_vgg16_v2.sh

Note: The Office31 experiments were run on two separate occasions. The first run used the standard approach found in much of the domain adaptation literature. The second run (with accompanying scripts suffixed "_v2") used the revised data splits to ensure generalisability of the results. For Digits and VisDA, the experiments follow the rectified protocol.

Hyper-parameter optimisation

hypersearch.py can be used to perform a hyper-parameter search using Bayesian optimisation. Scripts are also supplied for performing a hyper-parameter optimisation:

$ ./scripts/office31_hypersearch.sh

Results

Office31

The results come in two flavours. One set of results is for the traditional experimental setup, where the test split is used for validation. The other set is for a rectified experimental protocol, in which the test set is used only for final testing and a proper validation split is defined.
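The rectified protocol boils down to a standard three-way split. The sketch below shows the idea in NumPy; the fractions, seed, and function name are arbitrary illustrative choices, not the values used in the paper:

```python
import numpy as np

def rectified_splits(n, val_frac=0.2, test_frac=0.2, seed=0):
    """Illustrative three-way split (sketch). The test indices are held
    out for final evaluation only; hyper-parameters are selected on the
    validation split, never on the test split."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    n_test = int(n * test_frac)
    n_val = int(n * val_frac)
    test = idx[:n_test]
    val = idx[n_test:n_test + n_val]
    train = idx[n_test + n_val:]
    return train, val, test
```

Under the traditional setup, the `val` role above is instead played by the test split itself, which is what the rectified protocol corrects.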


Digits



VisDA-C


Authors

Citation

@article{hedegaard2021supervised,
    author={Hedegaard, Lukas and Sheikh-Omar, Omar Ali and Iosifidis, Alexandros},
    journal={IEEE Transactions on Image Processing}, 
    title={Supervised Domain Adaptation: A Graph Embedding Perspective and a Rectified Experimental Protocol}, 
    year={2021},
    volume={30},
    number={},
    pages={8619-8631},
    doi={10.1109/TIP.2021.3118978}
}
@inproceedings{hedegaard2020supervised,
    title={Supervised Domain Adaptation using Graph Embedding},
    author={Hedegaard, Lukas and Sheikh-Omar, Omar Ali and Iosifidis, Alexandros},
    booktitle={International Conference on Pattern Recognition (ICPR)},
    year={2020},
}
