
Information Maximizing Self-Augmented Training (IMSAT)

This repository contains reproducing code for IMSAT [1]. IMSAT is a method for learning discrete representations of data with deep neural networks. It can be applied to clustering and hash learning to achieve state-of-the-art results. This work was performed while Weihua Hu was interning at Preferred Networks.
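
For orientation, IMSAT combines an information-maximization objective with a self-augmented training (SAT) penalty that keeps predictions invariant under data augmentation (Virtual Adversarial Training in this code). The snippet below is a hedged, framework-agnostic sketch of the mutual-information term only; the function names are illustrative and it is not an excerpt from imsat_cluster.py. See [1] for the exact weighting of the terms.

```python
# Hedged sketch: mutual information between inputs and their discrete
# representations, I(X; Y) = H(Y) - H(Y|X), estimated from a mini-batch of
# soft cluster assignments p(y|x). Illustrative only, not repository code.
import numpy as np

def entropy(p, eps=1e-8):
    # Shannon entropy (in nats) along the last axis of a probability array.
    return -np.sum(p * np.log(p + eps), axis=-1)

def mutual_information(p_y_given_x):
    # p_y_given_x: shape (batch_size, n_clusters); each row sums to 1.
    p_y = p_y_given_x.mean(axis=0)             # marginal p(y) over the batch
    h_y = entropy(p_y)                         # H(Y): favors balanced clusters
    h_y_given_x = entropy(p_y_given_x).mean()  # H(Y|X): favors confident assignments
    return h_y - h_y_given_x

if __name__ == "__main__":
    # Uniform assignments carry no information: I(X; Y) is (numerically) zero.
    print(mutual_information(np.full((4, 10), 0.1)))
```

IMSAT maximizes this quantity while penalizing predictions that change under small perturbations of the input; VAT [2] supplies that penalty here.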

Requirements

You must have the following already installed on your system.

  • Python 2.7
  • Chainer 1.21.0, sklearn, munkres

Quick start

To reproduce the experiments on MNIST in [1], run the following commands (a sketch of the clustering evaluation follows the list).

  • Clustering with MNIST: python imsat_cluster.py
  • Hash learning with MNIST: python imsat_hash.py
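
The clustering experiment in [1] is reported as unsupervised clustering accuracy, i.e. accuracy under the best one-to-one mapping between predicted clusters and ground-truth labels; the munkres package listed in the requirements provides the Hungarian algorithm typically used for that matching. Below is a hedged sketch of this evaluation; the function name and conventions are illustrative, not taken from imsat_cluster.py.

```python
# Hedged sketch: unsupervised clustering accuracy via Hungarian matching.
import numpy as np
from munkres import Munkres

def clustering_accuracy(labels, assignments, n_classes=10):
    # Count how often each (cluster, label) pair co-occurs.
    counts = np.zeros((n_classes, n_classes), dtype=np.int64)
    for y, c in zip(labels, assignments):
        counts[c, y] += 1
    # The Hungarian algorithm minimizes cost, so turn counts into costs.
    cost = (counts.max() - counts).tolist()
    mapping = Munkres().compute(cost)
    matched = sum(counts[c, y] for c, y in mapping)
    return float(matched) / len(labels)

if __name__ == "__main__":
    # Random assignments over 10 classes should score around 0.1.
    rng = np.random.RandomState(0)
    labels = rng.randint(0, 10, size=1000)
    assignments = rng.randint(0, 10, size=1000)
    print(clustering_accuracy(labels, assignments))
```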

calculate_distance.py can be used to compute the perturbation range for Virtual Adversarial Training [2]. For the MNIST dataset, we have already calculated the range.
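
A natural way to set a per-example perturbation range, and roughly what [1] describes, is to use the Euclidean distance to a nearby neighbor in input space. The sketch below illustrates that computation with scikit-learn; the neighbor index k and any scaling applied by calculate_distance.py are assumptions, not read from this repository.

```python
# Hedged sketch: per-example perturbation range as the distance to the k-th
# nearest neighbor. The value of k and any scaling factor are assumptions.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_distance(x, k=10):
    # x: array of shape (n_samples, n_features), e.g. flattened MNIST images.
    # Query k + 1 neighbors because each point is its own nearest neighbor.
    nn = NearestNeighbors(n_neighbors=k + 1).fit(x)
    distances, _ = nn.kneighbors(x)
    return distances[:, k]  # distance to the k-th nearest neighbor

if __name__ == "__main__":
    x = np.random.RandomState(0).rand(1000, 784).astype(np.float32)
    print(knn_distance(x)[:5])
```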

Reference

[1] Weihua Hu, Takeru Miyato, Seiya Tokui, Eiichi Matsumoto, and Masashi Sugiyama. Learning Discrete Representations via Information Maximizing Self-Augmented Training. In ICML, 2017. Available at http://arxiv.org/abs/1702.08720

[2] Takeru Miyato, Shin-ichi Maeda, Masanori Koyama, Ken Nakae, and Shin Ishii. Distributional smoothing with virtual adversarial training. In ICLR, 2016.
