nmf: FPA for NMF with the KL divergence

This package implements a first-order primal-dual algorithm (FPA) for non-negative matrix factorization (NMF) with the Kullback-Leibler (KL) divergence. Because the KL loss lacks a Lipschitz-continuous gradient, plain gradient methods are poorly suited, so the method builds on the Chambolle-Pock primal-dual algorithm. An efficient heuristic is provided for selecting the step sizes, and all required updates are available in closed form.

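To make the scheme concrete, below is a minimal NumPy sketch (not the package's code) of alternating Chambolle-Pock primal-dual updates for KL NMF. The names `nmf_kl_primal_dual` and `_pd_subproblem` are illustrative, and the step sizes follow the standard Chambolle-Pock condition sigma * tau * ||W||^2 <= 1 rather than the heuristic proposed in the paper.

```python
import numpy as np


def _pd_subproblem(V, W, H, n_inner=20):
    """Approximately solve min_{H >= 0} KL(V || W @ H) with W fixed,
    using Chambolle-Pock iterations (hypothetical helper, for illustration)."""
    L = np.linalg.norm(W, 2)            # spectral norm of the linear map H -> W @ H
    sigma = tau = 1.0 / L               # standard step sizes: sigma * tau * L**2 = 1
    Y = np.zeros_like(V, dtype=float)   # dual variable
    H = H.astype(float).copy()
    H_bar = H.copy()
    for _ in range(n_inner):
        # Dual step: closed-form prox of sigma * F*, with F(Z) = sum(Z - V * log(Z)).
        U = Y + sigma * (W @ H_bar)
        Y_new = 0.5 * (1.0 + U - np.sqrt((1.0 - U) ** 2 + 4.0 * sigma * V))
        # Primal step: gradient move along -W^T Y_new, then project onto H >= 0.
        H_new = np.maximum(H - tau * (W.T @ Y_new), 0.0)
        # Extrapolation with theta = 1.
        H_bar = 2.0 * H_new - H
        H, Y = H_new, Y_new
    return H


def nmf_kl_primal_dual(V, rank, n_outer=100, seed=0):
    """Alternate the primal-dual subproblem between H and W (illustrative only)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + 1e-3
    H = rng.random((rank, n)) + 1e-3
    for _ in range(n_outer):
        H = _pd_subproblem(V, W, H)              # update H with W fixed
        W = _pd_subproblem(V.T, H.T, W.T).T      # same subproblem with roles swapped
    return W, H
```

For example, `W, H = nmf_kl_primal_dual(V, rank=10)` would factor a non-negative matrix `V` so that `W @ H` approximates it in the KL sense; both the dual and primal updates above are available in closed form, as noted in the description.
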
References

Felipe Yanez and Francis Bach. Primal-Dual Algorithms for Non-negative Matrix Factorization with the Kullback-Leibler Divergence. arXiv:1412.1788, 2014.

Felipe Yanez and Francis Bach. Primal-Dual Algorithms for Non-negative Matrix Factorization with the Kullback-Leibler Divergence. In IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), New Orleans, LA, USA, 2017.
