meduz edited this page Jan 8, 2012 · 2 revisions

Table of Contents

SHL scripts


  • This is a collection of Matlab scripts to test learning strategies for efficiently coding natural image patches. It is restricted here to the framework of the SparseNet algorithm from Bruno Olshausen. The reference paper is:

      @Article{Perrinet10shl,
        Author = {Perrinet, Laurent U.},
        Title = {Role of homeostasis in learning sparse representations},
        Journal = {Neural Computation},
        Doi = {10.1162/neco.2010.05-08-795},
        Keywords = {Neural population coding, Unsupervised learning, Statistics of natural images, Simple cell receptive fields, Sparse Hebbian Learning, Adaptive Matching Pursuit, Cooperative Homeostasis, Competition-Optimized Matching Pursuit},
        Month = {July},
        Number = {7},
        Url = {http://invibe.net/LaurentPerrinet/Publications/Perrinet10shl},
        Volume = {22},
        Year = {2010},
        Annote = {Posted Online March 17, 2010.},
      }
Get Ready!

Be sure to have :

  • a computer (tested on Mac, Linux, Irix, Windows 2000) with Matlab (tested on R13, R14, 2007 and R2009a) or Octave (version 3.0 or later). You will not need any special toolbox.
  • grab the sources from the zip file. Then:
    • if needed (that is, if the code breaks complaining that it cannot find a function), recompile the mex routines from B. Olshausen's package for your platform (some compiled mex routines are included),
    • to generate PDFs, you have to get the corresponding export script,
    • the source files may be found in the src folder,
    • to generate the final report, you'll need a TeX distribution with pdflatex and the beamer package,
    • you will need a set of decorrelated images in your data folder (it's provided in the zip file, but you may make your own using the corresponding script),
    • these scripts should be platform independent; however, there is a heavy bias toward UN*X users when generating figures (I haven't tried to generate the figures on Windows systems). In particular, the code is designed to generate figures in the background as PDFs (on a headless cluster), so no MATLAB window should pop up.
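
If the prebuilt binaries fail on your platform, the mex routines can be recompiled from within Matlab. A minimal sketch, assuming the failing routine is `cgf.c` in the src folder (the file name is an assumption — use the one named in the actual error message):

```matlab
% Sketch: recompile a mex routine from the Matlab prompt.
% 'cgf.c' is an assumption here -- replace it with the file
% named in the "undefined function" error you actually get.
cd src
mex cgf.c     % produces a platform-specific mex binary next to the source
cd ..
```

Under Octave, the shell command `mkoctfile --mex cgf.c` plays the same role.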

Instructions for running the experiments / understanding the scripts

  • First, if you just want to experiment with the learning scheme using Competition-Optimized Matching Pursuit, go to the code folder (which contains the scripts) and run the corresponding experiment script.
  • Simply run one of the files of your interest ---for example, to test the role of the parameters in the learning scheme with CGF--- (or the whole collection) and edit it to change the parameters of the experiments. This will create a set of PDF figures in a dated folder, depending on your preferences.
  • the Contents.m script points to the different experiments. A report is then produced using pdflatex (see results.pdf).
  • Notation is kept from the SparseNet package. Remember, for the variables: n=network; e=experiment; s=stats.

  • on a multicore machine, you may try launching several staggered instances in parallel, adapting the path and the number of jobs to your machine, e.g.:

      # 8 jobs under MATLAB:
      for i in {1..8}; do cd /Volumes/perrinet/sci/dyva/lup/Learning/SHL_scripts/code && sleep 0.$(( RANDOM%1000 )); matlab -nodisplay < Contents20100322T151819.m & done
      # 6 jobs under MATLAB, on another machine:
      for i in {1..6}; do cd /master0/perrinet/sci/dyva/lup/Learning/SHL_scripts/code && sleep 0.$(( RANDOM%1000 )); matlab -nodisplay < Contents20100322T151819.m & done
      # 4 jobs under Octave, locally and on cluster nodes via cexec:
      for i in {1..4}; do cd /data/work/perrinet/sci/dyva/Learning/SHL_scripts/code && sleep 0.$(( RANDOM%1000 )); octave Contents20100322T151819.m & cexec 'cd /data/work/perrinet/sci/dyva/Learning/SHL_scripts/code && sleep 0.$(( RANDOM%100 )); octave Contents20100322T151819.m' & done
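
For a single interactive run, the steps above boil down to a short session at the Matlab or Octave prompt; a sketch, assuming you start from the root of the unpacked archive (whether Contents.m runs the experiments directly or only indexes them, the scripts it points to are what you invoke):

```matlab
% Sketch of a single interactive run, from the root of the unpacked archive.
cd code     % the folder holding the experiment scripts
Contents    % the index script pointing to the different experiments;
            % figures are written in the background as PDFs to a dated folder
```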


The package contains the following folders:

  • code : the scripts (see Contents.m for a script pointing to the different experiments)
  • results : the individual experiments' output
  • data : a folder containing the image files (you can get them independently by downloading attachment:data.zip)
  • src : some other packages that may be of use