This is a Python wrapper for the (Decomposable) Bregman Approximate Nearest Neighbour (Bregman ANN, or BANN) package, adapted by Hubert Wagner and Tuyen Pham. The original ANN package was written by David Mount and Sunil Arya. The original package uses Kd-trees to search for nearest neighbours in Euclidean space equipped with an arbitrary Minkowski metric.
For a Python-wrapped version of the original ANN, see:
Bregman divergences are generalized measurements of distance in a space. Unlike metrics, they are often asymmetric and do not globally satisfy the triangle inequality. These divergences have recently found use in machine learning, with the most prominent example being the Kullback--Leibler divergence.
The BANN package currently uses Kd-trees for two primary computations:
- (approximate) $k$-nearest neighbour searches with decomposable Bregman divergences (sketched below)
- Bregman--Hausdorff divergence computation for decomposable Bregman divergences
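To make the first operation concrete, here is a minimal brute-force sketch of what a $k$-nearest neighbour query under a decomposable Bregman divergence computes. The function names and the use of NumPy are illustrative only, not the package's API; the Kd-tree only accelerates the answer sketched here. With an asymmetric divergence, whether the data point or the query is the first argument is a convention (related to the primal/dual distinction below).

```python
# Illustrative brute force only -- NOT the BANN API.
import numpy as np

def generalized_kl(x, y):
    # Generalized Kullback--Leibler divergence for strictly positive vectors.
    return np.sum(x * np.log(x / y) - x + y)

def knn_bruteforce(points, query, k, divergence):
    # Indices of the k points minimizing divergence(point, query).
    # divergence(query, point) would be the other (dual) convention.
    divs = np.array([divergence(p, query) for p in points])
    return np.argsort(divs)[:k]

rng = np.random.default_rng(0)
points = rng.random((100, 8)) + 1e-3   # keep coordinates strictly positive
query = rng.random(8) + 1e-3
print(knn_bruteforce(points, query, k=3, divergence=generalized_kl))
```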
Currently, this package supports the following divergences, each sketched in code after the list:
- Kullback--Leibler divergence (primal and dual)
- Itakura--Saito divergence (primal and dual)
- Squared Euclidean divergence
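All three are decomposable, i.e. generated coordinate-wise by a one-dimensional convex function. The following plain-NumPy sketch (illustrative, not the package's internals) shows how they all arise from the same Bregman formula:

```python
import numpy as np

def bregman(f, df, x, y):
    # Decomposable Bregman divergence generated by a 1-D convex function f
    # with derivative df: D(x, y) = sum_i f(x_i) - f(y_i) - f'(y_i)(x_i - y_i).
    return np.sum(f(x) - f(y) - df(y) * (x - y))

def kl(x, y):             # f(t) = t log t  ->  sum x log(x/y) - x + y
    return bregman(lambda t: t * np.log(t), lambda t: np.log(t) + 1.0, x, y)

def itakura_saito(x, y):  # f(t) = -log t   ->  sum x/y - log(x/y) - 1
    return bregman(lambda t: -np.log(t), lambda t: -1.0 / t, x, y)

def sq_euclidean(x, y):   # f(t) = t^2      ->  sum (x - y)^2
    return bregman(lambda t: t * t, lambda t: 2.0 * t, x, y)

x, y = np.array([0.2, 0.5, 0.3]), np.array([0.1, 0.6, 0.3])
print(kl(x, y), itakura_saito(x, y), sq_euclidean(x, y))
# The dual variants reverse the argument order, e.g. kl(y, x).
```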
Additional decomposable divergences can easily be added to the source code, and support for passing a custom divergence from Python is planned.
Let $F$ be a strictly convex, differentiable function. The Bregman divergence generated by $F$ is

$$
D_F(x, y) = F(x) - F(y) - \langle \nabla F(y),\, x - y \rangle.
$$

For a decomposable divergence, $F$ splits coordinate-wise as $F(x) = \sum_i f(x_i)$, so $D_F$ is a sum of one-dimensional divergences; for example, $F(x) = \sum_i x_i \log x_i$ generates the Kullback--Leibler divergence. This package also supports the dual of each divergence, generated by the convex conjugate $F^*$, which corresponds to reversing the order of the arguments.
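As a worked example (a standard calculation, not package-specific), substituting $F(x) = \sum_i x_i \log x_i$ into the definition recovers the generalized Kullback--Leibler divergence:

$$
D_F(x, y) = \sum_i x_i \log x_i - \sum_i y_i \log y_i - \sum_i (\log y_i + 1)(x_i - y_i) = \sum_i \left( x_i \log \frac{x_i}{y_i} - x_i + y_i \right),
$$

which reduces to the familiar $\sum_i x_i \log (x_i / y_i)$ when $x$ and $y$ are probability vectors.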
Further details on using Kd-trees with Bregman divergences are discussed here.
The Bregman--Hausdorff divergence generalizes the Bregman divergence between two vectors to a divergence between two sets of vectors. It was introduced by Pham, Dal Poz Kouřimská, and Wagner, who also provide algorithms for its computation. Specifically, we compute the one-sided (directed) divergence: for each vector in one set, take the divergence to its best match in the other set, and return the maximum of these values, $\max_{a \in A} \min_{b \in B} D(a, b)$.
The Bregman--Hausdorff divergence and the shell algorithm for its computation are introduced here.
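For intuition, the directed variant can be computed by brute force as below. This is an illustrative $O(|A|\,|B|)$ sketch, not the package's shell algorithm, which is designed to avoid the quadratic cost:

```python
import numpy as np

def generalized_kl(x, y):
    return np.sum(x * np.log(x / y) - x + y)

def directed_bregman_hausdorff(A, B, divergence):
    # For each a in A, find its best match in B; report the worst such value.
    return max(min(divergence(a, b) for b in B) for a in A)

rng = np.random.default_rng(1)
A = rng.random((50, 4)) + 1e-3
B = rng.random((60, 4)) + 1e-3
print(directed_bregman_hausdorff(A, B, generalized_kl))
```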
For further details and example uses, see the documentation.
BANN requires Python >=3.11.
Bug reports, pull requests (after forking), and other questions may be sent to the maintainer: tuyen.pham@ufl.edu
See Copyright and License for copyright and license information.