CACTUs-MAML: Clustering to Automatically Generate Tasks for Unsupervised Model-Agnostic Meta-Learning.
This code was used to produce the CACTUs-MAML results and baselines in the paper Unsupervised Learning via Meta-Learning.
This repository was built on top of the codebase for Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks.
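The core idea can be sketched in a few lines: cluster unsupervised embeddings, treat cluster assignments as pseudo-labels, and sample N-way, K-shot tasks from them for meta-learning. The snippet below is a minimal illustration only, not the repository's code; the embedding dimensions, cluster count, and task sizes are arbitrary assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Stand-in for unsupervised embeddings (e.g. ACAI/BiGAN/DeepCluster outputs).
embeddings = rng.normal(size=(1000, 32))

# Step 1: cluster the embeddings; cluster indices serve as pseudo-labels.
pseudo_labels = KMeans(n_clusters=50, n_init=10, random_state=0).fit_predict(embeddings)

def sample_task(pseudo_labels, n_way=5, k_shot=1, k_query=5, rng=rng):
    """Sample an N-way, K-shot task from clusters with enough members."""
    counts = np.bincount(pseudo_labels)
    eligible = np.flatnonzero(counts >= k_shot + k_query)
    classes = rng.choice(eligible, size=n_way, replace=False)
    support, query = [], []
    for task_label, cluster in enumerate(classes):
        members = rng.permutation(np.flatnonzero(pseudo_labels == cluster))
        support += [(int(i), task_label) for i in members[:k_shot]]
        query += [(int(i), task_label) for i in members[k_shot:k_shot + k_query]]
    return support, query  # lists of (example index, task-specific label) pairs

support, query = sample_task(pseudo_labels)
```

Each sampled task can then be fed to MAML exactly as a supervised few-shot task would be, with the pseudo-labels standing in for ground-truth classes.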
The code was tested with the following setup:
- Ubuntu 16.04
- Python 3.5.2
- Tensorflow-GPU 1.10
You can set up a Python virtualenv, activate it, and install the dependencies like so:
virtualenv venv --python=/usr/bin/python3
source venv/bin/activate
pip install -r requirements.txt
The Omniglot splits with ACAI and BiGAN encodings, MNIST splits with ACAI encodings, and miniImageNet splits with DeepCluster encodings used for the results in the paper are available here. Download and extract the archive's contents into this directory.
The CelebA dataset is not provided because of licensing issues, but code used for the CelebA experiments is still present.
You can find representative templates of scripts in /scripts. A good one to start with is /scripts/omniglot_kmeans.sh. Metrics can be visualized using TensorBoard. Evaluation results are saved to a .csv file in each run's log folder. All results were obtained using a single GPU.
The unsupervised representations were computed using four open-source codebases from prior works.
- Adversarial Feature Learning
- Deep Clustering for Unsupervised Learning of Visual Features
- InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets
- Understanding and Improving Interpolation in Autoencoders via an Adversarial Regularizer
To ask questions or report problems, please open an issue on the issue tracker.