POGS

Proximal Operator Graph Solver (POGS) is a solver for convex optimization problems in graph form, using the Alternating Direction Method of Multipliers (ADMM).

  • Documentation can be found on the GitHub.io page.
  • We recently posted a paper online with details about the implementation and many numerical examples.

A graph form problem can be expressed as

minimize        f(y) + g(x)
subject to      y = Ax

where f and g are convex functions taking values in R ∪ {∞}. The solver requires that the proximal operators of f and g are known and that f and g are separable, meaning that they can be written as

f(y) = sum_{i=1}^m f_i(y_i)
g(x) = sum_{i=1}^n g_i(x_i)
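
For example, the lasso problem

minimize        (1/2)||Ax - b||_2^2 + λ||x||_1

can be written in graph form by introducing y = Ax and taking

f_i(y_i) = (1/2)(y_i - b_i)^2
g_j(x_j) = λ|x_j|

both of which are separable and have simple closed-form proximal operators.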

The following functions are currently supported:

Supported Equations

where I(.) is the indicator function, taking on the value 0 if the condition is satisfied and infinity otherwise. More functions can be added by modifying the proximal operator header file: <pogs>/src/include/prox_lib.h.
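
As an illustration of what such proximal operators look like, here is a small MATLAB sketch of two of them; the function names are made up for this example and are not the identifiers used in prox_lib.h:

    % Proximal operator of g(x) = lambda*|x| with penalty rho:
    % soft thresholding at level lambda/rho.
    prox_abs = @(v, rho, lambda) sign(v) .* max(abs(v) - lambda / rho, 0);

    % Proximal operator of the indicator I(x >= 0):
    % projection onto the nonnegative orthant (independent of rho).
    prox_ind_ge0 = @(v, rho) max(v, 0);

    prox_abs(3, 1, 0.5)      % returns 2.5
    prox_ind_ge0(-2, 1)      % returns 0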

Languages / Frameworks

Three different implementations of the solver are either planned or already supported:

  1. C++/BLAS/OpenMP: A CPU version can be found in the directory <pogs>/src/cpu/. POGS must be linked against a BLAS library (such as the Apple Accelerate Framework or ATLAS).
  2. C++/cuBLAS/CUDA: A GPU version is located in the directory <pogs>/src/gpu/. To use the GPU version, the CUDA SDK must be installed, and the computer must have a CUDA-capable GPU.
  3. MATLAB: A MATLAB implementation, along with examples, can be found in the <pogs>/matlab directory. The code is heavily documented and is intended primarily for pedagogical purposes; a standalone sketch of the underlying iteration follows this list.
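
As a complement to the pedagogical MATLAB code, the following is a minimal, self-contained MATLAB sketch of the graph-form ADMM iteration (graph projection splitting, as described in the references below), applied to a small lasso instance. It is illustrative only: the problem data, the fixed penalty parameter rho, and all variable names are chosen for this example and do not reflect the interfaces of the code in this repository.

    % Lasso in graph form: minimize (1/2)*norm(y - b)^2 + lambda*norm(x, 1)
    %                      subject to y = A*x.
    rng(0);
    m = 100; n = 50;
    A = randn(m, n);
    b = randn(m, 1);
    lambda = 0.1;
    rho = 1;                                   % ADMM penalty parameter (kept fixed here)

    prox_f = @(v) (rho * v + b) / (rho + 1);                  % prox of (1/2)*(y - b)^2, elementwise
    prox_g = @(v) sign(v) .* max(abs(v) - lambda / rho, 0);   % prox of lambda*|x|, elementwise

    L = chol(eye(n) + A' * A, 'lower');        % factor once; reused by every projection

    x = zeros(n, 1);  y = zeros(m, 1);         % primal iterates
    xt = zeros(n, 1); yt = zeros(m, 1);        % scaled dual iterates

    for k = 1:300
        % Evaluate the proximal operators of g and f.
        xh = prox_g(x - xt);
        yh = prox_f(y - yt);
        % Project (xh + xt, yh + yt) onto the graph {(x, y) : y = A*x}.
        x = L' \ (L \ ((xh + xt) + A' * (yh + yt)));
        y = A * x;
        % Update the scaled dual variables.
        xt = xt + xh - x;
        yt = yt + yh - y;
    end

Because the projection step reuses a single factorization of I + A'A, each iteration only requires proximal evaluations, matrix-vector products, and triangular solves.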

Problem Classes

Among others, the solver can be used for the following classes of (linearly constrained) problems; an example mapping to graph form is given after the list:

  • Lasso, Ridge Regression, Logistic Regression, Huber Fitting, and Elastic Net Regularization,
  • Total Variation Denoising, Optimal Control,
  • Linear Programs and Quadratic Programs.
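
For instance, a linear program

minimize        c^T x
subject to      Ax <= b

can be put in graph form by taking f_i(y_i) = I(y_i <= b_i) and g_j(x_j) = c_j x_j; both have simple proximal operators (a clip and a shift, respectively).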

References

  1. Parameter Selection and Pre-Conditioning for a Graph Form Solver -- C. Fougner and S. Boyd
  2. Block Splitting for Distributed Optimization -- N. Parikh and S. Boyd
  3. Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers -- S. Boyd, N. Parikh, E. Chu, B. Peleato, and J. Eckstein
  4. Proximal Algorithms -- N. Parikh and S. Boyd

License

POGS is released under the BSD 3-Clause license.

Authors

Chris Fougner, with input from Stephen Boyd. The basic algorithm is from "Block Splitting for Distributed Optimization" by N. Parikh and S. Boyd.