
Peregrine

A General Purpose Optimizer for Fast Machine Learning

by Xiaocheng Tang [https://mktal.github.io/]

This code implements in C/C++ a fast second-order sparse training algorithm that has been shown to be an order of magnitude faster than first-order methods such as (stochastic) gradient descent. The algorithm provides a more effective learning scheme through a sequence of quadratic approximations that incorporate Hessian information. The code can easily be extended, e.g., to distributed settings or to training neural nets, with Python libraries such as NumPy, Apache Spark, or TensorFlow. Please see the examples for more details.
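
As a rough illustration of the approach, the sketch below implements one simple variant of this scheme in plain NumPy: each outer iteration builds a quadratic model of the logistic loss from its gradient and Hessian, then solves the L1-regularized subproblem inexactly with a few proximal-gradient steps. This is only a minimal sketch of the general proximal (quasi-)Newton idea, not Peregrine's actual C/C++ implementation; the names here (prox_newton, soft_threshold) and the random data are made up for the example.

# Minimal proximal Newton sketch for L1-regularized logistic regression.
# Illustrates the "quadratic approximation + Hessian" scheme; not Peregrine's API.
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_newton(X, y, lam, outer_iters=20, inner_iters=50):
    """X: (n, d) features; y: (n,) labels in {-1, +1}; lam: L1 weight."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(outer_iters):
        m = y * (X @ w)
        p = 1.0 / (1.0 + np.exp(m))           # sigmoid(-y * Xw)
        grad = -(X.T @ (y * p)) / n           # gradient of the logistic loss
        D = p * (1.0 - p)                     # per-sample Hessian weights
        H = (X.T * D) @ X / n                 # exact Hessian (dense, for clarity)
        L = np.linalg.eigvalsh(H)[-1] + 1e-8  # step size for the inner solver
        # Inner loop: proximal-gradient (ISTA) steps on the quadratic model
        #   q(v) = grad^T (v - w) + 0.5 (v - w)^T H (v - w) + lam * ||v||_1
        v = w.copy()
        for _ in range(inner_iters):
            g_model = grad + H @ (v - w)
            v = soft_threshold(v - g_model / L, lam / L)
        w = v
    return w

# Tiny usage example on synthetic sparse data
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
w_true = np.zeros(50); w_true[:5] = 1.0
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))
w_hat = prox_newton(X, y, lam=0.01)
print("nonzeros:", np.count_nonzero(np.abs(w_hat) > 1e-6))

In Peregrine itself the quadratic subproblems are solved with a quasi-Newton (limited-memory) Hessian approximation rather than the dense exact Hessian formed above; the dense version is used here only to keep the sketch short and self-contained.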

This project was presented at the 2016 ICML workshop on Optimization Methods for the Next Generation of Machine Learning.

Getting Started

How to run the code locally:

git clone https://github.com/mktal/peregrine.git
pip install ./peregrine/
# train sparse logistic regression
cd peregrine/peregrine/examples && python single_node.py

Citation

  • Katya Scheinberg and Xiaocheng Tang. Practical Inexact Proximal Quasi-Newton Method with Global Complexity Analysis. Mathematical Programming, Series A, 160(1):495–529, 2016.
@article{Scheinberg:2016wj,
  author = {Scheinberg, Katya and Tang, Xiaocheng},
  title = {{Practical inexact proximal quasi-Newton method with global complexity analysis}},
  journal = {Mathematical Programming},
  year = {2016},
  volume = {160},
  number = {1},
  pages = {495--529}
}
