CACTUs-MAML

CACTUs-MAML: Clustering to Automatically Generate Tasks for Unsupervised Model-Agnostic Meta-Learning.

This code was used to produce the CACTUs-MAML results and baselines in the paper Unsupervised Learning via Meta-Learning.

This repository was built off of Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks.
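As a rough, self-contained illustration of the core idea (cluster unsupervised encodings to obtain pseudo-labels, then sample few-shot tasks from the clusters), here is a NumPy sketch. This is a deliberately simplified assumption-laden example, not the repository's implementation; the function names, number of clusters, and task sizes are invented for illustration.

```python
# Illustrative sketch of CACTUs-style task construction -- NOT the repository's code.
import numpy as np

def kmeans(embeddings, k, iters=20, seed=0):
    """Plain k-means on unsupervised embeddings; returns a cluster id per example."""
    rng = np.random.default_rng(seed)
    centers = embeddings[rng.choice(len(embeddings), size=k, replace=False)]
    for _ in range(iters):
        # Assign each embedding to its nearest center.
        dists = np.linalg.norm(embeddings[:, None, :] - centers[None, :, :], axis=-1)
        labels = dists.argmin(axis=1)
        # Recompute each center; keep the old one if its cluster emptied out.
        for j in range(k):
            members = embeddings[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return labels

def sample_task(labels, n_way=5, k_shot=1, k_query=1, seed=0):
    """Build one N-way few-shot task using cluster ids as pseudo-labels."""
    rng = np.random.default_rng(seed)
    # Only clusters with enough members can supply both support and query examples.
    usable = [c for c in np.unique(labels) if (labels == c).sum() >= k_shot + k_query]
    chosen = rng.choice(usable, size=n_way, replace=False)
    support, query = [], []
    for way, c in enumerate(chosen):
        idx = rng.permutation(np.where(labels == c)[0])
        support += [(int(i), way) for i in idx[:k_shot]]
        query += [(int(i), way) for i in idx[k_shot:k_shot + k_query]]
    return support, query  # lists of (example index, task-specific label) pairs

# Toy usage: 200 random 16-dimensional "encodings" stand in for ACAI/BiGAN features.
emb = np.random.default_rng(1).normal(size=(200, 16))
labels = kmeans(emb, k=20)
support, query = sample_task(labels, n_way=5, k_shot=1, k_query=1)
```

In CACTUs-MAML, tasks generated this way replace the human-labeled tasks that MAML would otherwise meta-train on.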

Dependencies

The code was tested with the following setup:

  • Ubuntu 16.04
  • Python 3.5.2
  • TensorFlow-GPU 1.10

You can set up a Python virtualenv, activate it, and install the dependencies like so:

```shell
virtualenv venv --python=/usr/bin/python3
source venv/bin/activate
pip install -r requirements.txt
```

Data

The Omniglot splits with ACAI and BiGAN encodings, MNIST splits with ACAI encodings, and miniImageNet splits with DeepCluster encodings used for the results in the paper are available here. Download and extract the archive's contents into this directory.

The CelebA dataset is not provided because of licensing issues, but code used for the CelebA experiments is still present.

Usage

You can find representative templates of scripts in /scripts. A good one to start with is /scripts/omniglot_kmeans.sh. Metrics can be visualized using Tensorboard. Evaluation results are saved to a .csv file in a run's log folder. All results were obtained using a single GPU.
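A hypothetical walkthrough of the steps above might look like the following; the log-directory path is an assumption (check the script to see where it actually writes its run folder):

```shell
# Train and evaluate CACTUs-MAML on Omniglot using a provided template script.
bash scripts/omniglot_kmeans.sh

# Visualize training metrics with TensorBoard.
# <run_log_folder> is a placeholder for the run's log directory.
tensorboard --logdir=<run_log_folder>
```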

Credits

The unsupervised representations were computed using four open-source codebases from prior work.

Contact

To ask questions or report problems, please open a GitHub issue on this repository's issue tracker.