MATLAB scripts to reproduce the results of our OPT 2009 paper

Support Page for "Super-Linear Convergence of Dual Augmented Lagrangian Algorithm for Sparse Learning"

This page is set up to make it easy to reproduce the results from the above-mentioned paper, which I presented at the OPT workshop at NIPS 2009. Below you can download my implementations of L1-regularized logistic regression solvers based on FISTA (Beck & Teboulle, 09), OWLQN (Andrew & Gao, 07), and SpaRSA (Wright et al., 09), as well as the proposed DAL algorithm.


  1. Download the package.
  2. Also download DAL ver 1.01 (or older) from here.
  3. Suppose the above scripts are extracted in somewhere/opt09 and the DAL scripts in somewhere/dal/ (if not, change the first line in s_opt09.m); then open MATLAB and run s_opt09.
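The steps above can be sketched as follows; the directory names are the placeholders from step 3, and the assumption that s_opt09.m is the entry-point script comes from the instructions above:

```matlab
% Minimal sketch of running the experiment (paths are placeholders).
cd somewhere/opt09   % directory containing these scripts
s_opt09              % runs the comparison experiment; expects DAL on the
                     % path configured in the first line of s_opt09.m
```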

Note that by default the number of repetitions is one (no averaging). You can increase it by modifying the following line (line 6) in s_opt09.m

nrep=1;

setting it, for example, to nrep=10;


You should get a figure like the one below, comparing DAL against FISTA, OWLQN, and SpaRSA.

In the above figure, the convergence of the four algorithms (DAL, FISTA, OWLQN, and SpaRSA) is compared in terms of the Euclidean norm ||w-w*|| (where w* is the true minimizer) and the function-value residual f(w)-f(w*), plotted against the number of steps (left) and the computation time (right). DAL is the fastest both in the number of (outer) steps and in computation time.
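The two convergence measures in the figure can be computed as sketched below. This is a hypothetical illustration, not the paper's code: W (a cell array of iterates), wstar (a reference minimizer, e.g. obtained by running a solver to high precision), and floss (the objective function handle) are assumed names:

```matlab
% Hypothetical sketch: compute the convergence measures plotted above.
% W{t}  : iterate at step t      (assumed cell array)
% wstar : reference minimizer w* (assumed; e.g. a high-precision solution)
% floss : objective function f   (assumed function handle)
dist = cellfun(@(w) norm(w - wstar), W);          % ||w - w*||
fres = cellfun(@(w) floss(w) - floss(wstar), W);  % f(w) - f(w*)

% Residuals shrink over orders of magnitude, so plot on a log scale.
semilogy(1:numel(W), fres);
xlabel('number of steps'); ylabel('f(w)-f(w*)');
```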