# matthieuheitz/WassersteinDictionaryLearning

Code for the article "Wasserstein Dictionary Learning", Schmitz et al. 2018

# Wasserstein Dictionary Learning [Schmitz et al. 2018]

This repository contains the code for the following publication. Please credit this reference if you use it.

    @article{schmitz_wasserstein_2018,
      title = {Wasserstein {Dictionary} {Learning}: {Optimal} {Transport}-based unsupervised non-linear dictionary learning},
      shorttitle = {Wasserstein {Dictionary} {Learning}},
      url = {https://hal.archives-ouvertes.fr/hal-01717943},
      journal = {SIAM Journal on Imaging Sciences},
      author = {Schmitz, Morgan A. and Heitz, Matthieu and Bonneel, Nicolas and Ngolè Mboula, Fred Maurice and Coeurjolly, David and Cuturi, Marco and Peyré, Gabriel and Starck, Jean-Luc},
      year = {2018},
      keywords = {Dictionary Learning, Optimal Transport, Wasserstein barycenter},
    }


The full text is available on HAL and arXiv.

### Configure, build and run

There is a CMakeLists.txt for the project, so you can configure and build in a directory outside the source tree:

    $ mkdir build
    $ cd build
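From the build directory, a typical out-of-source CMake configure-and-build sequence looks like the following sketch. The path to the directory containing the CMakeLists.txt is an assumption here (adjust `..` to wherever it lives in your checkout), as is the Release build type:

```shell
# Hypothetical configure-and-build sequence; adjust the source path
# to point at the directory that contains the CMakeLists.txt.
$ cmake -DCMAKE_BUILD_TYPE=Release ..
$ make
```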

##### Warm restart

The idea is that instead of a single L-BFGS run of 500 iterations, you restart a fresh L-BFGS run every 10 iterations and initialize the Sinkhorn scaling vectors with those obtained at the end of the previous run. As explained in the paper, this technique accumulates Sinkhorn iterations across successive L-BFGS runs, so fewer Sinkhorn iterations are needed per run for equivalent or better results, which leads to significant speed-ups.

Be aware that accumulating too many Sinkhorn iterations can lead to numerical instabilities. If that happens, you can enable the log-domain stabilization: it is slower, but the overhead is offset by the speed-up of the warm restart. For more details, please refer to our paper. The value of 10 optimization iterations per L-BFGS run is arbitrary and can be changed in the code (in regress_both() of inverseWasserstein.h), but it has given good results in our experiments.
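The restart pattern itself can be sketched in a few lines. This is a toy Python illustration, not the repository's C++ implementation: a plain gradient step stands in for L-BFGS, a single counter stands in for the Sinkhorn scaling vectors, and all names (`short_run`, `warm_restarted`) are hypothetical. The point is only the control flow: many short inner runs, each initialized from the state left by the previous one, rather than one long run started from scratch.

```python
def short_run(x, state, n_iters=10, lr=0.1):
    """One short inner run minimizing the toy objective f(x) = (x - 3)^2.

    `state` mimics the Sinkhorn scaling vectors: it is returned so the
    next run can resume from it instead of restarting from scratch.
    """
    for _ in range(n_iters):
        grad = 2.0 * (x - 3.0)      # toy gradient, standing in for L-BFGS
        x -= lr * grad
        state += 1                  # accumulated Sinkhorn iterations, schematically
    return x, state

def warm_restarted(n_runs=50):
    """50 warm-restarted runs of 10 iterations each, i.e. 500 in total."""
    x, state = 0.0, 0
    for _ in range(n_runs):
        x, state = short_run(x, state)   # re-initialize from the previous run
    return x, state

x, total = warm_restarted()
```

With 50 runs of 10 iterations, the inner state accumulates exactly as if a single 500-iteration run had been made, which is the effect the warm restart exploits.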