This repository provides efficient implementations of Maximum Mean Discrepancies (a.k.a. kernel norms), Hausdorff divergences and Sinkhorn divergences between sampled measures. Thanks to the KeOps library, our routines scale up to batches of 1,000,000 samples without memory overflows.
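For intuition, the (squared) MMD between two sampled measures can be sketched in a few lines of plain NumPy. This brute-force version only illustrates the quantity being computed, not this repository's KeOps-based routines; the Gaussian kernel, bandwidth and sample sizes below are arbitrary choices for the example:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Pairwise Gaussian kernel values between samples x (N,D) and y (M,D).
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)  # squared distances
    return np.exp(-d2 / (2 * sigma**2))

def mmd(x, y, sigma=1.0):
    # Squared MMD between the uniform empirical measures on x and y:
    # mean k(x,x') + mean k(y,y') - 2 mean k(x,y).
    k_xx = gaussian_kernel(x, x, sigma).mean()
    k_yy = gaussian_kernel(y, y, sigma).mean()
    k_xy = gaussian_kernel(x, y, sigma).mean()
    return k_xx + k_yy - 2 * k_xy

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 2))
y = rng.normal(size=(100, 2)) + 2.0  # shifted point cloud

print(mmd(x, x))  # 0: a measure compared to itself
print(mmd(x, y))  # > 0: the shifted clouds differ
```

This dense O(NM) loop is exactly what becomes infeasible at 1,000,000 samples, which is where the KeOps-based routines come in.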
N.B.: As of today, KeOps is still in beta. The 0.1 version will be released on pip by the end of October, including new documentation, Windows support and a bug fix for high-dimensional vectors.
Information on the subject is available in our papers:
Global divergences between measures: from Hausdorff distance to Optimal Transport, Jean Feydy, Alain Trouvé, ShapeMI 2018.
Interpolating between Optimal Transport and MMD using Sinkhorn Divergences, Jean Feydy, Thibault Séjourné, François-Xavier Vialard, Shun-ichi Amari, Alain Trouvé, Gabriel Peyré.
Sinkhorn entropies and divergences, Jean Feydy, Thibault Séjourné, François-Xavier Vialard, Shun-ichi Amari, Alain Trouvé, Gabriel Peyré; (our long reference paper, available soon).
First and foremost, this repo provides a reference implementation of Sinkhorn-related divergences. In /common/, you will find both a simple and an efficient implementation of the Sinkhorn algorithm. A dedicated folder will let you reproduce the figures of our ShapeMI paper (MICCAI 2018 workshop), while sinkhorn_entropies contains those of our reference theoretical article.
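To give a pointer to what the Sinkhorn implementations compute, here is a minimal log-domain sketch in plain NumPy: dense cost matrix, uniform weights, and an arbitrary temperature eps and iteration count. It is an illustration of the standard algorithm and of the debiased Sinkhorn divergence, not this repository's scaled-up code:

```python
import numpy as np

def logsumexp(a, axis):
    # Numerically stable log-sum-exp along the given axis.
    m = a.max(axis=axis, keepdims=True)
    return (m + np.log(np.exp(a - m).sum(axis=axis, keepdims=True))).squeeze(axis)

def sinkhorn_cost(x, y, eps=0.5, n_iter=200):
    # Log-domain Sinkhorn iterations for the entropic OT cost between
    # the uniform empirical measures on x (N,D) and y (M,D).
    N, M = len(x), len(y)
    C = 0.5 * ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)  # quadratic cost
    f, g = np.zeros(N), np.zeros(M)
    log_mu, log_nu = -np.log(N), -np.log(M)
    for _ in range(n_iter):
        # Alternate "softmin" updates on the dual potentials f and g.
        f = -eps * logsumexp((g[None, :] - C) / eps + log_nu, axis=1)
        g = -eps * logsumexp((f[:, None] - C) / eps + log_mu, axis=0)
    # Dual objective with uniform weights (exact at convergence).
    return f.mean() + g.mean()

def sinkhorn_divergence(x, y, eps=0.5):
    # Debiased divergence: S_eps(a,b) = OT_eps(a,b) - (OT_eps(a,a) + OT_eps(b,b)) / 2,
    # which vanishes when the two measures coincide.
    return sinkhorn_cost(x, y, eps) - 0.5 * (
        sinkhorn_cost(x, x, eps) + sinkhorn_cost(y, y, eps))

rng = np.random.default_rng(0)
x = rng.normal(size=(50, 2))
y = x + 2.0  # the same cloud, translated by (2, 2)

print(sinkhorn_divergence(x, x))  # 0: a measure vs. itself
print(sinkhorn_divergence(x, y))  # > 0: the clouds are distinct
```

The eps parameter interpolates between Optimal Transport (eps → 0) and an MMD-like regime (eps → ∞), which is the behaviour studied in the papers above.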