Implementation of our ICCV 2017 paper: MIHash - Hashing with Mutual Information

Hashing with Mutual Information

This repository contains a MATLAB implementation of the following papers:

[1] "MIHash: Online Hashing with Mutual Information",
Fatih Cakir*, Kun He*, Sarah A. Bargal, and Stan Sclaroff. (* Equal contribution)
International Conference on Computer Vision (ICCV), 2017 (arXiv)

[2] "Hashing with Mutual Information",
Fatih Cakir*, Kun He*, Sarah A. Bargal, and Stan Sclaroff. (* Equal contribution)
TPAMI 2019 (to appear) (arXiv)

The repo includes:

  • Both online and batch versions of the MIHash method from the above papers. ⚠️ Note: the batch (deep) learning implementation of MIHash has been updated and moved to deep-mihash.
  • An experimental framework for online hashing methods
  • Implementations of several online hashing techniques
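To give a sense of the objective behind MIHash, here is an illustrative Python sketch (not the repository's MATLAB code; the function and variable names are my own). For a query, the neighbor indicator C and the Hamming distance D to retrieved items define a mutual information I(C; D) = H(D) - H(D | C), which MIHash maximizes so that neighbors and non-neighbors become well separated in Hamming distance:

```python
import numpy as np

def mutual_info_hashing(dist_pos, dist_neg, nbits):
    """Estimate I(C; D), the mutual information between the neighbor
    indicator C and the Hamming distance D from a query to database items.

    dist_pos: Hamming distances from the query to its true neighbors
    dist_neg: Hamming distances from the query to non-neighbors
    nbits:    hash code length, so distances lie in {0, ..., nbits}
    """
    bins = np.arange(nbits + 2)         # one histogram bin per distance value
    n_pos, n_neg = len(dist_pos), len(dist_neg)
    p_c1 = n_pos / (n_pos + n_neg)      # P(C = 1)
    p_c0 = 1.0 - p_c1

    # Conditional distance distributions P(D | C = 1) and P(D | C = 0)
    p_d_pos = np.histogram(dist_pos, bins=bins)[0] / max(n_pos, 1)
    p_d_neg = np.histogram(dist_neg, bins=bins)[0] / max(n_neg, 1)

    # Marginal P(D), then I(C; D) = H(D) - H(D | C)
    p_d = p_c1 * p_d_pos + p_c0 * p_d_neg

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    return entropy(p_d) - (p_c1 * entropy(p_d_pos) + p_c0 * entropy(p_d_neg))
```

In the actual method this quantity is made differentiable (via soft histogram binning) and optimized directly; see the papers for the precise formulation.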


Preliminaries

  • Create or symlink a directory cachedir under the main directory to hold experimental results
  • Prepare the datasets by running the script in the data directory
  • Install or symlink VLFeat at ./vlfeat (for computing performance metrics)
  • Install or symlink MatConvNet at ./matconvnet (for batch hashing experiments)


Usage

  • In the main folder, run startup.m
  • For online hashing experiments: cd online-hashing, and run demo_online.m with appropriate input arguments (see the online-hashing directory for details)
  • For batch hashing experiments: cd batch-hashing, and run demo_cifar.m with appropriate input arguments (see the batch-hashing directory for details)

Batch Results

⚠️ This portion of the repo is outdated and reports inferior results. Please refer to the new deep-mihash repo for the latest results on deep/batch learning of MIHash!

Here we provide the latest results of MIHash and other competing work on CIFAR-10. For reproducibility, we also provide the parameters for MIHash we used to obtain these results (see batch-hashing/opts_batch.m).


The standard setup for CIFAR-10 has two distinct settings (as specified in the DTSH and MIHash papers). The results shown here use the VGG-F deep architecture, with learning done in an end-to-end fashion. For non-deep methods, this corresponds to using features from the penultimate layer of VGG-F. (Note that, in contrast, the MIHash paper reports single-layer experiments with VGG-16 for Setting 1.)

Please refer to the above papers for details regarding Settings 1 and 2.
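The tables below report retrieval quality as mean Average Precision (mAP). As a rough reference for the metric, here is an illustrative Python sketch (the repository's own evaluation code lives in evaluate.m and relies on VLFeat; these function names are my own):

```python
import numpy as np

def average_precision(distances, relevant):
    """AP for one query: rank database items by Hamming distance
    (ascending) and average the precision at each rank where a true
    neighbor appears.

    distances: (N,) Hamming distances from the query to database items
    relevant:  (N,) boolean mask of true neighbors (e.g. same class label)
    """
    order = np.argsort(distances, kind="stable")
    rel = relevant[order]
    if not rel.any():
        return 0.0
    hits = np.cumsum(rel)                  # number of neighbors found so far
    ranks = np.arange(1, len(rel) + 1)
    return np.mean(hits[rel] / ranks[rel]) # precision at each hit, averaged

def mean_average_precision(dist_matrix, rel_matrix):
    """mAP over queries; each row corresponds to one query."""
    return np.mean([average_precision(d, r)
                    for d, r in zip(dist_matrix, rel_matrix)])
```

For example, a query whose two true neighbors land at ranks 1 and 3 gets AP = (1/1 + 2/3)/2 ≈ 0.83.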

Setting 1: Mean Average Precision

Method 12-Bits 24-Bits 32-Bits 48-Bits
BRE 0.361 0.448 0.502 0.533
MACHash 0.628 0.707 0.726 0.734
FastHash 0.678 0.729 0.742 0.757
StructHash 0.664 0.693 0.691 0.700
DPSH 0.720 0.757 0.757 0.767
DTSH 0.725 0.773 0.781 0.810
MIHash 0.687 0.788 0.7899 0.826

MIHash Parameters:

Setting 2: Mean Average Precision

Method 16-Bits 24-Bits 32-Bits 48-Bits
DPSH 0.908 0.909 0.917 0.932
DTSH 0.916 0.924 0.927 0.934
MIHash 0.922 0.931 0.940 0.942

MIHash Parameters:

NOTE: These diaries are from older versions of the repo, where different parameter names may be used. The parameters can easily be matched to those in opts_batch.m by inspection. Notably, sigscale is equal to sigmf(1). Please email the authors if you have any questions.


License

BSD License, see LICENSE.

Citation

If you use this code in your research, please cite:

@inproceedings{cakir2017mihash,
  title={MIHash: Online Hashing with Mutual Information},
  author={Fatih Cakir and Kun He and Sarah A. Bargal and Stan Sclaroff},
  booktitle={IEEE International Conference on Computer Vision (ICCV)},
  year={2017}
}


References

  • BRE: Brian Kulis and Trevor Darrell. Learning to hash with binary reconstructive embeddings. In Advances in Neural Information Processing Systems (NIPS), 2009.
  • MACHash: Ramin Raziperchikolaei and Miguel Á. Carreira-Perpiñán. Optimizing affinity-based binary hashing using auxiliary coordinates. In Advances in Neural Information Processing Systems (NIPS), 2016.
  • FastHash: Guosheng Lin, Chunhua Shen, Qinfeng Shi, Anton van den Hengel, and David Suter. Fast supervised hashing with decision trees for high-dimensional data. In Proc. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2014.
  • StructHash: Guosheng Lin, Fayao Liu, Chunhua Shen, Jianxin Wu, and Heng Tao Shen. Structured learning of binary codes with column generation for optimizing ranking measures. International Journal of Computer Vision (IJCV), 2016.
  • DPSH: Wu-Jun Li, Sheng Wang, and Wang-Cheng Kang. Feature learning based deep supervised hashing with pairwise labels. In Proc. International Joint Conference on Artificial Intelligence (IJCAI), 2016.
  • DTSH: Xiaofang Wang, Yi Shi, and Kris M. Kitani. Deep supervised hashing with triplet labels. In Proc. Asian Conference on Computer Vision (ACCV), 2016.