A demo showing how proximal gradient descent and accelerated proximal gradient descent can solve LASSO formulation

go2chayan/LASSO_Using_PGD


README
======

I did this as part of a homework problem in the Advanced Machine Learning class taught by Prof. Ji Liu in Fall 2014.
(http://www.cs.rochester.edu/u/jliu/index.html)

The Python code shows how the Proximal Gradient Descent and Accelerated Proximal Gradient Descent algorithms
can be used to solve the LASSO optimization problem:

LASSO: \min_x f(x) := \frac{1}{2}\|Ax - b\|_2^2 + \lambda\|x\|_1

The LASSO formulation can reconstruct the original data from its noisy version because the \ell_1 penalty encourages sparse solutions.
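
As a rough sketch of how the proximal update looks in practice (this is not code from this repository; the function names `soft_threshold` and `proximal_gradient_descent`, the fixed step size 1/L, and the iteration count are illustrative assumptions), proximal gradient descent alternates a gradient step on the smooth part 0.5*||Ax - b||^2 with the soft-thresholding proximal operator of lambda*||x||_1:

```python
import numpy as np

def soft_threshold(v, t):
    # Element-wise soft-thresholding: the proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_descent(A, b, lam, n_iter=500):
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of the
    # gradient of the smooth part 0.5 * ||Ax - b||^2.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                   # gradient of 0.5 * ||Ax - b||^2
        x = soft_threshold(x - grad / L, lam / L)  # proximal (shrinkage) step
    return x
```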

The current code takes a sparse vector (x*), applies a random linear transformation (i.e., multiplies it by a random matrix A), and adds noise to the result. It then takes the noisy vector and the transformation matrix and reconstructs the original sparse vector.
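
A minimal version of that experiment might look like the following, reusing the `proximal_gradient_descent` sketch above; the dimensions, noise level, and regularization weight are assumptions for illustration, not the repository's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 1000, 300, 20                  # signal length, measurements, nonzeros (assumed)
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.normal(size=k)     # sparse ground-truth vector x*

A = rng.normal(size=(m, n))              # random linear transformation
b = A @ x_true + 0.01 * rng.normal(size=m)          # noisy measurements

x_hat = proximal_gradient_descent(A, b, lam=0.1)    # reconstruct x* from (A, b)
```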

The demo also shows that the accelerated version converges much faster than the plain proximal gradient descent.
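
For reference, an accelerated variant in the FISTA style adds a momentum (extrapolation) step on top of the same proximal update. Again, this is a generic sketch under the same assumptions as above (it reuses the hypothetical `soft_threshold` helper), not the repository's implementation:

```python
import numpy as np

def accelerated_proximal_gradient(A, b, lam, n_iter=500):
    # FISTA-style acceleration: same proximal update as before, plus an
    # extrapolation (momentum) step between iterates.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)    # proximal step at the extrapolated point
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # momentum / extrapolation step
        x, t = x_new, t_new
    return x
```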
