"Unsupervised Domain Adaptation by Backpropagation" introduced a simple and effective method for accomplishing domain adaptation with SGD with a gradient reversal layer. This work was elaborated and extended in "Domain-Adversarial Training of Neural Networks". For more information as well as a link to an equivalent implementation in Caffe, see http://sites.skoltech.ru/compvision/projects/grl/.
The `Blobs-DANN.ipynb` notebook shows some basic experiments on a very simple dataset. The `MNIST-DANN.ipynb` notebook recreates the MNIST experiment from the papers on a synthetic dataset. Instructions for generating the synthetic dataset are below. To run any of the experiment code, you will need the gradient reversal layer (the `flip_gradient` op) described below.
The `flip_gradient` operation is implemented in Python by using `tf.Graph.gradient_override_map` to override the gradient of `tf.identity`. Refer to `flip_gradient.py` to see how this is implemented.
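For background, here is a minimal sketch of how such an op can be built, assuming TensorFlow 1.x graph mode (the actual `flip_gradient.py` may differ in details; the class name here is illustrative):

```python
import tensorflow as tf

class FlipGradientBuilder(object):
    """Acts as the identity in the forward pass; reverses (and scales) the gradient in the backward pass."""

    def __init__(self):
        self.num_calls = 0

    def __call__(self, x, l=1.0):
        # Register a uniquely named gradient function for this call.
        grad_name = "FlipGradient%d" % self.num_calls
        self.num_calls += 1

        @tf.RegisterGradient(grad_name)
        def _flip_gradients(op, grad):
            return [tf.negative(grad) * l]

        # Route tf.identity through the custom gradient.
        g = tf.get_default_graph()
        with g.gradient_override_map({"Identity": grad_name}):
            y = tf.identity(x)
        return y

flip_gradient = FlipGradientBuilder()
```

Using the op is then a single call: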
```python
from flip_gradient import flip_gradient

# Flip the gradient of y w.r.t. x and scale by l (defaults to 1.0)
y = flip_gradient(x, l)
```
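In a DANN model, `flip_gradient` sits between the shared feature extractor and the domain classifier, so the domain classifier's gradient pushes the feature extractor toward domain-invariant features. A rough sketch of that wiring, assuming TensorFlow 1.x (the layer sizes and names are illustrative, not the notebooks' exact architecture):

```python
import tensorflow as tf
from flip_gradient import flip_gradient

# Illustrative placeholders; the notebooks define their own inputs and models.
x = tf.placeholder(tf.float32, [None, 28, 28, 3])
l = tf.placeholder(tf.float32, [])  # adaptation parameter (lambda)

# Shared feature extractor (a small hypothetical CNN).
feat = tf.layers.conv2d(x, 32, 5, activation=tf.nn.relu)
feat = tf.layers.max_pooling2d(feat, 2, 2)
feat = tf.layers.flatten(feat)

# Label predictor sees the features directly.
label_logits = tf.layers.dense(feat, 10)

# Domain classifier sees the features through the gradient reversal layer.
feat_reversed = flip_gradient(feat, l)
domain_hidden = tf.layers.dense(feat_reversed, 100, activation=tf.nn.relu)
domain_logits = tf.layers.dense(domain_hidden, 2)
```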
The `MNIST-DANN.ipynb` notebook implements the MNIST experiments from the paper with the same model and optimization parameters, including the learning rate and adaptation parameter schedules. Rough results are below (more training would likely improve them; the number of epochs is not reported in the paper).
Method | Target acc (paper) | Target acc (our implementation) |
---|---|---|
Source Only | 0.5225 | 0.5681 |
DANN | 0.7666 | 0.7631 |
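The adaptation parameter and learning rate schedules mentioned above follow the formulas given in the paper. A small sketch of computing them per training step (the function name and `total_steps` argument are illustrative):

```python
import numpy as np

def dann_schedules(step, total_steps):
    """Schedules from the DANN paper, as a function of training progress p in [0, 1]."""
    p = float(step) / total_steps
    l = 2. / (1. + np.exp(-10. * p)) - 1.   # adaptation parameter lambda
    lr = 0.01 / (1. + 10. * p) ** 0.75      # learning rate
    return l, lr
```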
The MNIST-M dataset consists of MNIST digits blended with random color patches from the BSDS500 dataset. To generate MNIST-M, first download the BSDS500 dataset and then run the `create_mnistm.py` script:
```
curl -O http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/BSR/BSR_bsds500.tgz
python create_mnistm.py
```
This may take a couple of minutes and should produce a `mnistm_data.pkl` file containing the generated images.
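The blending itself is a per-pixel difference: each digit is combined with a random crop of a BSDS500 image by taking the absolute difference, so the digit shows up as an inversion of the background texture. A rough sketch of that operation (not the exact code in `create_mnistm.py`; the function name is illustrative):

```python
import numpy as np

def compose_patch(digit, patch):
    """Difference-blend a grayscale MNIST digit with a color background patch.

    digit: uint8 array of shape (28, 28)
    patch: uint8 array of shape (28, 28, 3) cropped from a BSDS500 image
    """
    digit_rgb = np.stack([digit] * 3, axis=-1).astype(np.int32)
    blended = np.abs(patch.astype(np.int32) - digit_rgb)
    return blended.astype(np.uint8)
```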
It would be great to add other experiments to this repository. Feel free to open a PR if you recreate other results from the papers or add entirely new experiments.