Convergent Learning: Do different neural networks learn the same representations?

This repository contains source code necessary to reproduce the results presented in the following paper:

@inproceedings{li_2016_ICLR,
  title={Convergent Learning: Do different neural networks learn the same representations?},
  author={Li, Yixuan and Yosinski, Jason and Clune, Jeff and Lipson, Hod and Hopcroft, John},
  booktitle={International Conference on Learning Representations (ICLR '16)},
  year={2016}
}

Assemble prerequisites

We used Caffe to train the models and to compute the necessary statistics, including pairwise unit correlations, unit activation means, and pairwise unit mutual information. To minimize the effort needed to try out the experiments in this demo, we provide a link to download everything required: pre-trained models, unit statistics, unit visualizations, and pre-trained sparse prediction models.
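
For reference, the pairwise correlation statistic is the Pearson correlation between every unit in one network and every unit in another, computed over activations recorded on the same inputs. Below is a minimal numpy sketch of that computation; the array names, shapes, and random stand-in data are hypothetical, and this is not the code used to generate the released statistics.

import numpy as np

def pairwise_unit_correlation(act1, act2):
    # act1, act2: (n_samples, n_units) activations of two networks on the same inputs
    a = act1 - act1.mean(axis=0)
    b = act2 - act2.mean(axis=0)
    a /= a.std(axis=0) + 1e-8
    b /= b.std(axis=0) + 1e-8
    return a.T.dot(b) / act1.shape[0]  # (n_units1, n_units2) correlation matrix

# Stand-in random data; real usage would record conv-layer activations with Caffe
rng = np.random.RandomState(0)
corr = pairwise_unit_correlation(rng.randn(10000, 96), rng.randn(10000, 96))
print(corr.shape)  # (96, 96)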

To run through the demo, you only need the standard IPython, numpy, networkx, sklearn, and matplotlib packages. Depending on your setup, you may be able to install them with pip install ipython numpy matplotlib networkx sklearn.

Run experiments with pre-computed statistics

Once the data folder is downloaded, the results can be reproduced using the included IPython notebook experiments/convergent_learning_notebook.ipynb. Start the IPython Notebook server:

$ cd experiments
$ ipython notebook

Select the convergent_learning_notebook.ipynb notebook and execute the included code.
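
As an illustration of the kind of analysis performed on these statistics, the sketch below finds a one-to-one matching between units of two networks from a precomputed correlation matrix. The file name and the use of scipy's Hungarian solver are assumptions for illustration only, not the notebook's actual code (scipy is pulled in as a dependency of sklearn).

import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical file holding pairwise unit correlations for one layer of two networks
corr = np.load('corr_conv1_net1_net2.npy')

# Hungarian algorithm: pick the one-to-one assignment maximizing total correlation
rows, cols = linear_sum_assignment(-corr)
print('mean correlation of matched unit pairs:', corr[rows, cols].mean())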

Shortcut: to skip all the work and just see the results, take a look at this notebook with cached plots.

Questions?

Please drop me a line if you have any questions!