DAL

Dual Augmented Lagrangian (DAL) algorithm for sparse/low-rank reconstruction and learning. For more details, check out our JMLR paper, book chapter, talk, and slides.

Examples

L1-regularized squared-loss regression (LASSO)

 m = 1024; n = 4096; k = round(0.04*n); A=randn(m,n);
 w0=randsparse(n,k); bb=A*w0+0.01*randn(m,1);
 lambda=0.1*max(abs(A'*bb));
 [ww,stat]=dalsql1(zeros(n,1), A, bb, lambda);
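As a quick sanity check (continuing from the snippet above; `norm` and `nnz` are standard MATLAB), one can compare the estimate to the ground truth:

 % relative recovery error and sparsity of the estimate
 fprintf('relative error = %g\n', norm(ww-w0)/norm(w0));
 fprintf('nonzeros: %d (true: %d)\n', nnz(ww), nnz(w0));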

L1-regularized logistic regression

 m = 1024; n = 4096; k = round(0.04*n); A=randn(m,n);
 w0=randsparse(n,k); yy=sign(A*w0+0.01*randn(m,1));
 lambda=0.1*max(abs(A'*yy));
 [ww,bias,stat]=dallrl1(zeros(n,1), 0, A, yy, lambda);
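Continuing from the snippet above, a quick check of the training accuracy of the learned classifier:

 % fraction of training labels correctly classified
 fprintf('training accuracy = %g\n', mean(sign(A*ww+bias)==yy));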

Grouped-L1-regularized logistic regression

 m = 1024; n = [64 64]; k = round(0.1*n(1)); A=randn(m,prod(n));
 w0=randsparse(n,k); yy=sign(A*w0(:)+0.01*randn(m,1));
 lambda=0.1*max(sqrt(sum(reshape(A'*yy/2,n).^2)));
 [ww,bias,stat]=dallrgl(zeros(n), 0, A, yy, lambda);
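Judging from the lambda computation above, the groups appear to be the columns of the 64-by-64 coefficient matrix; under that assumption, one can count how many groups remain active:

 % number of nonzero (active) groups, assuming column groups
 fprintf('active groups: %d (true: %d)\n', nnz(sum(ww.^2)), nnz(sum(w0.^2)));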

Trace-norm-regularized logistic regression

 m = 2048; n = [64 64]; r = round(0.1*n(1)); A=randn(m,prod(n));
 w0=randsparse(n,'rank',r); yy=sign(A*w0(:)+0.01*randn(m,1));
 lambda=0.2*norm(reshape(A'*yy/2,n));
 [ww,bias,stat]=dallrds(zeros(n), 0, A, yy, lambda);
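Since the trace-norm penalty promotes low-rank solutions, a natural check (continuing from above) is the rank of the estimate:

 % rank of the estimate vs. the true rank r
 fprintf('rank(ww) = %d (true rank: %d)\n', rank(ww), r);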

Noisy matrix completion

 n = [640 640]; r = 6; m = 2*r*sum(n);
 w0=randsparse(n,'rank',r);
 ind=randperm(prod(n)); ind=ind(1:m);
 A=sparse(1:m, ind, ones(1,m), m, prod(n));
 yy=A*w0(:)+0.01*randn(m,1);
 lambda=0.3*norm(reshape(A'*yy,n));
 [ww,stat]=dalsqds(zeros(n),A,yy,lambda,'solver','qn');
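To see how well the completion works (continuing from the snippet above), one can compute the relative error over all entries, observed and unobserved:

 % relative reconstruction error over the full matrix
 fprintf('relative error = %g\n', norm(ww-w0,'fro')/norm(w0,'fro'));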

LASSO with individual weights

 m = 1024; n = 4096; k = round(0.04*n); A=randn(m,n);
 w0=randsparse(n,k); bb=A*w0+0.01*randn(m,1);
 pp=0.1*abs(A'*bb);
 [ww,stat]=dalsqal1(zeros(n,1), A, bb, pp);
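Here `pp` is a vector of per-coefficient penalties rather than a single scalar; presumably, a uniform weight vector reproduces the plain lasso above:

 % uniform weights should match dalsql1 with the same lambda (assumption)
 lambda=0.1*max(abs(A'*bb));
 [ww1,stat1]=dalsqal1(zeros(n,1), A, bb, lambda*ones(n,1));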

Individually weighted squared loss

 m = 1024; n = 4096; k = round(0.04*n); A=randn(m,n);
 w0=randsparse(n,k); bb=A*w0+0.01*randn(m,1);
 lambda=0.1*max(abs(A'*bb));
 weight=(1:m)';
 [ww,stat]=dalsqwl1(zeros(n,1), A, bb, lambda, weight);
 figure, plot(bb-A*ww);
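Since `weight` grows linearly with the observation index, the plot should show the residuals `bb-A*ww` shrinking toward the high-index end: observations with larger weights are fit more tightly.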

Elastic-net-regularized squared-loss regression

 m = 1024; n = 4096; k = round(0.04*n); A=randn(m,n);
 w0=randsparse(n,k); bb=A*w0+0.01*randn(m,1);
 lambda=0.1*max(abs(A'*bb));
 [ww,stat]=dalsqen(zeros(n,1), A, bb, lambda, 0.5);
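The final argument (0.5 here) appears to be the trade-off between the L1 and squared-L2 penalties; under that assumption, sweeping it shows how the sparsity of the solution changes:

 % sweep the (assumed) L1/L2 trade-off parameter and record sparsity
 for theta=[0.25 0.5 0.75]
   [ww,stat]=dalsqen(zeros(n,1), A, bb, lambda, theta);
   fprintf('theta=%.2f: nnz(ww)=%d\n', theta, nnz(ww));
 end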

Sparsely-connected multivariate AR model

Try

 s_test_hsgl

to get the following figure:

[Figure mvar.png: Estimation of sparsely-connected MVAR coefficients]

Here the model is a sparsely connected third-order multivariate AR (MVAR) model with 20 variables. The top row shows the true coefficients; the bottom row shows the estimated coefficients.

See s_test_hsgl.m and our paper for more details.

References

Ryota Tomioka, Taiji Suzuki, and Masashi Sugiyama. Super-linear convergence of dual augmented Lagrangian algorithm for sparsity regularized estimation. Journal of Machine Learning Research, 12:1537-1586, 2011.

Papers that use DAL

Acknowledgments

I would like to thank Stefan Haufe, Christian Kothe, Marius Kloft, Artemy Kolchinsky, and Makoto Yamada for testing the software and pointing out some problems. The non-negative lasso extension was contributed by Shigeyuki Oba. The weighted lasso was contributed by Satoshi Hara. I was supported by the Global COE program (Computationism as a Foundation for the Sciences) until March 2009.
