DAL

Dual Augmented Lagrangian (DAL) algorithm for sparse/low-rank reconstruction and learning. For more details, check out our JMLR paper, book chapter, talk, and slides.
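All of the solvers below minimize a convex objective of the form (notation mine, not the paper's)

$$\min_{w}\; f(Aw) + \lambda\,\phi(w),$$

where $f$ is a smooth convex loss (squared or logistic below) and $\phi$ is a sparsity-inducing penalty (L1, grouped L1, or trace norm). DAL maximizes an augmented-Lagrangian dual of this problem with an inner Newton-type method and recovers the primal variable through the proximal operator of $\phi$; see the JMLR paper for the exact updates and the convergence analysis.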

Examples

L1-regularized squared-loss regression (LASSO)

 % Random Gaussian design, k-sparse ground truth, noisy observations
 m = 1024; n = 4096; k = round(0.04*n); A=randn(m,n);
 w0=randsparse(n,k); bb=A*w0+0.01*randn(m,1);
 lambda=0.1*max(abs(A'*bb));
 [ww,stat]=dalsql1(zeros(n,1), A, bb, lambda);
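In this example dalsql1 solves the standard LASSO problem

$$\min_{w}\; \tfrac{1}{2}\,\|Aw - b\|_2^2 + \lambda\,\|w\|_1.$$

The heuristic lambda=0.1*max(abs(A'*bb)) sets the regularization to one tenth of the smallest $\lambda$ at which the all-zero solution becomes optimal.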

L1-regularized logistic regression

 m = 1024; n = 4096; k = round(0.04*n); A=randn(m,n);
 % Binary labels from a noisy sparse linear model
 w0=randsparse(n,k); yy=sign(A*w0+0.01*randn(m,1));
 lambda=0.1*max(abs(A'*yy));
 % The second input (0) initializes the unregularized bias term
 [ww,bias,stat]=dallrl1(zeros(n,1), 0, A, yy, lambda);
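The objective here is the L1-regularized logistic loss with an unregularized bias $b$ (the second input and the bias output of dallrl1):

$$\min_{w,\,b}\; \sum_{i=1}^{m} \log\!\bigl(1 + e^{-y_i(a_i^\top w + b)}\bigr) + \lambda\,\|w\|_1,$$

where $a_i$ is the $i$-th row of A.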

Grouped-L1-regularized logistic regression

 % Coefficients form a 64x64 matrix with k nonzero groups
 m = 1024; n = [64 64]; k = round(0.1*n(1)); A=randn(m,prod(n));
 w0=randsparse(n,k); yy=sign(A*w0(:)+0.01*randn(m,1));
 % Largest group norm of the logistic loss gradient at w=0
 lambda=0.1*max(sqrt(sum(reshape(A'*yy/2,n).^2)));
 [ww,bias,stat]=dallrgl(zeros(n), 0, A, yy, lambda);
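The coefficients are arranged as a 64x64 matrix $W$ and penalized group-wise; the lambda heuristic takes the largest column norm of the loss gradient at zero, so the groups here appear to be the 64 columns of $W$ (my reading of the example, not stated in the README):

$$\min_{W,\,b}\; \sum_{i=1}^{m} \log\!\bigl(1 + e^{-y_i(\langle a_i, W\rangle + b)}\bigr) + \lambda \sum_{g=1}^{64} \|W_{:,g}\|_2,$$

where $a_i$ is the $i$-th row of A reshaped to 64x64.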

Trace-norm-regularized logistic regression

 m = 2048; n = [64 64]; r = round(0.1*n(1)); A=randn(m,prod(n));
 % Rank-r 64x64 ground-truth coefficient matrix
 w0=randsparse(n,'rank',r); yy=sign(A*w0(:)+0.01*randn(m,1));
 % Largest singular value of the logistic loss gradient at W=0
 lambda=0.2*norm(reshape(A'*yy/2,n));
 [ww,bias,stat]=dallrds(zeros(n), 0, A, yy, lambda);
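dallrds swaps the grouped penalty for the trace norm (the sum of singular values; "ds" presumably stands for dual spectral, the trace norm being the dual of the spectral norm):

$$\min_{W,\,b}\; \sum_{i=1}^{m} \log\!\bigl(1 + e^{-y_i(\langle a_i, W\rangle + b)}\bigr) + \lambda\,\|W\|_{\mathrm{tr}},$$

which promotes low rank the way the L1 norm promotes sparsity; correspondingly, the lambda heuristic uses the spectral norm (MATLAB's norm for matrices) of the gradient at zero.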

Noisy matrix completion

 % Observe m randomly chosen entries of a rank-r 640x640 matrix
 n = [640 640]; r = 6; m = 2*r*sum(n);
 w0=randsparse(n,'rank',r);
 ind=randperm(prod(n)); ind=ind(1:m);
 % Sparse sampling operator: row i of A picks entry ind(i) of w0(:)
 A=sparse(1:m, ind, ones(1,m), m, prod(n));
 yy=A*w0(:)+0.01*randn(m,1);
 lambda=0.3*norm(reshape(A'*yy,n));
 [ww,stat]=dalsqds(zeros(n),A,yy,lambda,'solver','qn');
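Writing $\mathcal{A}$ for the entry-sampling operator built above, this solves trace-norm-regularized least squares over the observed entries:

$$\min_{W}\; \tfrac{1}{2}\,\|\mathcal{A}(W) - y\|_2^2 + \lambda\,\|W\|_{\mathrm{tr}}.$$

The 'solver','qn' option presumably selects a quasi-Newton inner solver (my reading of the option name), which is a natural choice at this problem size.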

LASSO with individual weights

 m = 1024; n = 4096; k = round(0.04*n); A=randn(m,n);
 w0=randsparse(n,k); bb=A*w0+0.01*randn(m,1);
 % A penalty weight for each coefficient instead of one scalar lambda
 pp=0.1*abs(A'*bb);
 [ww,stat]=dalsqal1(zeros(n,1), A, bb, pp);
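Passing the vector pp in place of a scalar lambda gives each coefficient its own penalty weight:

$$\min_{w}\; \tfrac{1}{2}\,\|Aw - b\|_2^2 + \sum_{j=1}^{n} p_j\,|w_j|.$$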

Individually weighted squared loss

 m = 1024; n = 4096; k = round(0.04*n); A=randn(m,n);
 w0=randsparse(n,k); bb=A*w0+0.01*randn(m,1);
 lambda=0.1*max(abs(A'*bb));
 % Observation i receives weight i in the squared loss
 weight=(1:m)';
 [ww,stat]=dalsqwl1(zeros(n,1), A, bb, lambda, weight);
 % Plot the residuals; heavily weighted observations are fit more tightly
 figure, plot(bb-A*ww);
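With $c_i$ denoting the entries of weight, the loss is reweighted per observation (inferred from the argument order):

$$\min_{w}\; \tfrac{1}{2}\sum_{i=1}^{m} c_i\,\bigl(b_i - a_i^\top w\bigr)^2 + \lambda\,\|w\|_1.$$

In the residual plot, observations with larger weights (larger index $i$) should show visibly smaller residuals.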

Elastic-net-regularized squared-loss regression

 m = 1024; n = 4096; k = round(0.04*n); A=randn(m,n);
 w0=randsparse(n,k); bb=A*w0+0.01*randn(m,1);
 lambda=0.1*max(abs(A'*bb));
 % Final argument mixes the L1 and squared-L2 penalty terms
 [ww,stat]=dalsqen(zeros(n,1), A, bb, lambda, 0.5);
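Assuming the final argument (here 0.5) is the mixing ratio $\theta$ between the two penalties, the objective is the conventional elastic net (the solver's exact parameterization may differ):

$$\min_{w}\; \tfrac{1}{2}\,\|Aw - b\|_2^2 + \lambda\Bigl(\theta\,\|w\|_1 + \tfrac{1-\theta}{2}\,\|w\|_2^2\Bigr).$$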

Sparsely-connected multivariate AR model

Try

 exp_hsgl

to get:

[Figure: Estimation of sparsely-connected MVAR coefficients]

Here the model is a sparsely connected third-order multivariate AR (MVAR) model with 20 variables. In the figure, the top row shows the true coefficients and the bottom row shows the coefficients estimated by DAL.
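In symbols (notation mine), the model is

$$y_t = \sum_{\tau=1}^{3} A^{(\tau)}\, y_{t-\tau} + \epsilon_t, \qquad y_t \in \mathbb{R}^{20},$$

and sparse connectivity means that for most variable pairs $(i,j)$ the coefficients $A^{(1)}_{ij}, A^{(2)}_{ij}, A^{(3)}_{ij}$ vanish together, a grouped-sparsity pattern across the three lags.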

See exp_hsgl.m and our paper for more details.

References

Tomioka, R., Suzuki, T., and Sugiyama, M. Super-linear convergence of dual augmented Lagrangian algorithm for sparsity regularized estimation. Journal of Machine Learning Research, 12:1537-1586, 2011.


Acknowledgments

I would like to thank Stefan Haufe, Christian Kothe, Marius Kloft, Artemy Kolchinsky, and Makoto Yamada for testing the software and pointing out some problems. The non-negative lasso extension was contributed by Shigeyuki Oba. The weighted lasso was contributed by Satoshi Hara. I was supported by the Global COE program (Computationism as a Foundation for the Sciences) until March 2009.
