OBA

Author: Nitish Shirish Keskar

OBA is a second-order method for convex L1-regularized optimization with active-set prediction. It belongs to the family of Orthant-Based methods (such as OWL) and uses a selective corrective mechanism that improves both efficiency and robustness.
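
Concretely, OBA targets composite problems of the following form (a standard statement of the L1-regularized objective, written out here for reference; the smoothness assumption on the loss reflects the method's use of gradients and Hessian-vector products):

$$\min_{x \in \mathbb{R}^n} \; F(x) = f(x) + \lambda \lVert x \rVert_1, \qquad f \text{ convex and smooth}, \quad \lambda > 0.$$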

Features

The OBA package

  • allows for solving general convex L1-regularized problems, including logistic regression and LASSO.
  • is written in pure MATLAB with minimal dependencies, emphasizing simplicity and cross-platform compatibility.
  • includes both Newton and quasi-Newton variants of the proposed algorithm.

Usage Guide

The algorithm can be run using the syntax

x = OBA(funObj,lambda,[options]);

Here,

  • funObj is an object with member functions for computing the objective function, its gradient, and Hessian-vector products at the iterates. Logistic Regression and LASSO classes are provided with the package. The file funTemplate.m can be used as a base for designing a custom function.
  • lambda is a positive scalar that induces sparsity in the solution.
  • options is an optional argument for changing the default parameters used in OBA. For ease of use, the user can generate the default options struct using options=GenOptions() and change the parameters therein before passing it to OBA. A short usage sketch is given below.
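
The following is a minimal sketch of a typical call on synthetic data. The constructor name logReg(A, b) is purely illustrative (consult funTemplate.m and the Logistic Regression and LASSO classes shipped with the package for the actual names); GenOptions and OBA are used as documented here.

```matlab
% Minimal usage sketch (illustrative only).
A = randn(1000, 200);           % feature matrix (synthetic data)
b = sign(randn(1000, 1));       % +/-1 class labels
lambda = 1e-2;                  % L1 regularization weight

% Hypothetical constructor for the logistic-regression class shipped with
% the package; the real name may differ -- see funTemplate.m.
funObj = logReg(A, b);

options = GenOptions();         % default parameters (listed below)
x = OBA(funObj, lambda, options);
```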

The parameters and their default values are

  • `options.optol`: termination tolerance (default: 1e-6)
  • `options.qn`: 0 for Newton's method, 1 for quasi-Newton (default: 0)
  • `options.mem_size`: quasi-Newton memory size (default: 20)
  • `options.maxiter`: maximum number of iterations (default: 1000)
  • `options.printlev`: print level, 0 (no printing) or 1 (default: 1)
  • `options.CGtol`: CG termination tolerance, used by Newton's method (default: 1e-1)
  • `options.maxCGiter`: maximum number of CG iterations, used by Newton's method (default: 1000)

For detailed documentation of OBA and its associated functions, use `help OBA` from the MATLAB prompt.
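
As an illustration of the options mechanism described above, the sketch below switches OBA to its quasi-Newton variant with a larger memory and suppresses per-iteration output. The field names and GenOptions are as documented in this README; funObj and lambda are assumed to be set up as in the earlier sketch.

```matlab
% Customizing the default options (illustrative sketch).
options = GenOptions();
options.qn       = 1;      % quasi-Newton instead of Newton's method
options.mem_size = 50;     % larger quasi-Newton memory
options.printlev = 0;      % no per-iteration printing
options.optol    = 1e-5;   % looser termination tolerance

x = OBA(funObj, lambda, options);   % funObj, lambda as in the earlier sketch
```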

Citation

If you use OBA in your research, please cite the following paper:

@article{OBA_Keskar2016,
author = {N. Keskar and J. Nocedal and F. Öztoprak and A. Wächter},
title = {A second-order method for convex $\ell_1$-regularized optimization with active-set prediction},
journal = {Optimization Methods and Software},
volume = {0},
number = {0},
pages = {1-17},
year = {2016},
doi = {10.1080/10556788.2016.1138222},
URL = {http://dx.doi.org/10.1080/10556788.2016.1138222},
eprint = {http://dx.doi.org/10.1080/10556788.2016.1138222}
}
