go2chayan/LASSO_Using_PGD
README
======

I did this as part of a homework problem in the Advanced Machine Learning class taught by Prof. Ji Liu in Fall 2014 (http://www.cs.rochester.edu/u/jliu/index.html).

The Python code demonstrates the use of the Proximal Gradient Descent and Accelerated Proximal Gradient Descent algorithms for solving the LASSO optimization problem:

    LASSO: \min_x f(x) := \frac{1}{2}\|Ax - b\|^2 + \lambda\|x\|_1

The LASSO formulation can reconstruct the original data from a noisy version by exploiting a sparsity constraint. The code takes a sparse vector (x*), applies a random linear transformation (i.e., multiplies it by a random matrix A), and adds noise to the result. It then takes the noisy vector and the transformation matrix and reconstructs the original sparse vector. The accelerated version demonstrates much faster convergence than the plain version.
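The repository's own scripts are not reproduced here, but the two algorithms it describes can be sketched as follows. This is a minimal illustration, not the author's code: it assumes a fixed step size 1/L with L = ||A||_2^2 (the Lipschitz constant of the smooth part's gradient), and the accelerated variant uses the standard FISTA momentum schedule. The key ingredient for LASSO is that the proximal operator of the L1 term is elementwise soft-thresholding.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||x||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_lasso(A, b, lam, n_iters=500):
    # Proximal gradient descent (ISTA) for min_x 0.5*||Ax-b||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)
    return x

def accel_prox_grad_lasso(A, b, lam, n_iters=500):
    # Accelerated proximal gradient descent (FISTA): same prox step,
    # but taken at an extrapolated point y with Nesterov momentum.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iters):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

As a usage sketch mirroring the experiment described above: draw a random matrix A, a sparse x*, form b = Ax* + noise, and run either solver; with enough iterations the recovered vector is close to x*, and the accelerated variant reaches that accuracy in far fewer iterations.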
About
A demo showing how proximal gradient descent and accelerated proximal gradient descent can solve the LASSO problem