Code for Unsupervised Learning via Meta-Learning.


CACTUs-MAML: Clustering to Automatically Generate Tasks for Unsupervised Model-Agnostic Meta-Learning.

This code was used to produce the CACTUs-MAML results and baselines in the paper Unsupervised Learning via Meta-Learning.

This repository was built on the codebase of Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks.
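The core idea of CACTUs is to build few-shot tasks from unlabeled data by clustering unsupervised embeddings and treating cluster membership as pseudo-labels. A minimal sketch of that task-construction step, assuming cluster assignments have already been produced (e.g. by k-means over the embeddings); the function name and arguments are illustrative, not this repository's API:

```python
import random
from collections import defaultdict

def sample_task(cluster_assignments, n_way=5, k_shot=1, k_query=1, rng=None):
    """Sample one N-way, K-shot task from cluster pseudo-labels.

    cluster_assignments: sequence where entry i is the cluster id of
    example i (e.g. from k-means on unsupervised embeddings).
    Returns (support, query): lists of (example_index, task_label) pairs,
    where task_label is the cluster's position within this task (0..N-1).
    """
    rng = rng or random.Random()
    by_cluster = defaultdict(list)
    for idx, cluster in enumerate(cluster_assignments):
        by_cluster[cluster].append(idx)
    # Only clusters with enough members can supply both support and query.
    eligible = [c for c, members in by_cluster.items()
                if len(members) >= k_shot + k_query]
    chosen = rng.sample(eligible, n_way)
    support, query = [], []
    for task_label, cluster in enumerate(chosen):
        members = rng.sample(by_cluster[cluster], k_shot + k_query)
        support += [(i, task_label) for i in members[:k_shot]]
        query += [(i, task_label) for i in members[k_shot:]]
    return support, query
```

Tasks sampled this way can then be fed to MAML exactly as supervised tasks would be; the paper additionally varies how the embeddings and clusterings are obtained.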


The code was tested with the following setup:

  • Ubuntu 16.04
  • Python 3.5.2
  • TensorFlow-GPU 1.10

You can set up a Python virtualenv, activate it, and install the dependencies like so:

virtualenv venv --python=/usr/bin/python3
source venv/bin/activate
pip install -r requirements.txt


The Omniglot splits with ACAI and BiGAN encodings, MNIST splits with ACAI encodings, and miniImageNet splits with DeepCluster encodings used for the results in the paper are available here. Download and extract the archive's contents into this directory.

The CelebA dataset is not provided because of licensing restrictions, but the code used for the CelebA experiments is still included.


You can find representative script templates in /scripts. A good one to start with is /scripts/. Metrics can be visualized using TensorBoard. Evaluation results are saved to a .csv file in a run's log folder. All results were obtained using a single GPU.
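Once a run finishes, its evaluation .csv can be summarized with a few lines of Python. The metric column name "accuracy" below is an assumption, not something this repository documents; inspect the header of the .csv in your run's log folder and adjust:

```python
import csv
from statistics import mean

def summarize_eval(csv_path, metric="accuracy"):
    """Average one metric column of an evaluation .csv.

    NOTE: the default column name "accuracy" is a guess; check the
    actual header of the .csv written to your run's log folder.
    """
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    return mean(float(row[metric]) for row in rows)
```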


The unsupervised representations were computed using four open-source codebases from prior works.


To ask questions or report problems, please open an issue on the issue tracker.
