# Piecewise-Linear Bregman Divergence Learning (PBDL)
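For background, the divergence family the method learns can be sketched as follows. A Bregman divergence has the form D(x, y) = φ(x) − φ(y) − ∇φ(y)ᵀ(x − y) for a convex generator φ; PBDL, as I understand the accompanying paper, parameterizes φ as a piecewise-linear (max-affine) function. The Python snippet below is only an illustration of that parametric form, not the repository's MATLAB code, and the parameter names `A` and `b` are my own:

```python
import numpy as np

def pbd(x, y, A, b):
    """Piecewise-linear Bregman divergence (illustrative sketch).

    phi(z) = max_k (A[k] @ z + b[k]) is a max-affine convex function;
    the slope A[k] of the affine piece active at y serves as a
    (sub)gradient of phi at y.
    """
    phi = lambda z: np.max(A @ z + b)
    k_y = np.argmax(A @ y + b)              # index of the active piece at y
    return phi(x) - phi(y) - A[k_y] @ (x - y)
```

Because φ(x) ≥ A[k_y]ᵀx + b[k_y] = φ(y) + A[k_y]ᵀ(x − y), the sketch is non-negative and vanishes at x = y, as a Bregman divergence must.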

  1. Prerequisites

a) MATLAB (we used R2019a, though it should be compatible with other versions)

b) Gurobi 9.0

  2. Quick-start pairwise similarity

To test the code, we have provided a demo MATLAB script "example.m", which runs PBDL for pairwise similarity comparisons on the Iris data set. Using the learned Bregman divergence, the code prints performance metrics for clustering, k-NN, or similarity ranking, depending on the task you choose. If everything is working correctly, you should first see the hyperparameter tuning results, followed by:

```
-.-.-.-.-.-. Test Performance .-.-.-.-.-.-.-
Rand Index = 98.7 +- 1.3
Purity = 99.0 +- 1.0
K-NN Accuracy = 99.0 +- 1.0
Area under the curve = 99.0 +- 0.8
Average Precision = 98.3 +- 1.5
```

Similarity comparisons are generated as follows: 2000 random pairs drawn from the same class and 2000 random pairs drawn from different classes. The core file that performs the optimization is "PBDL_core". You can change the method, data set, or other experiment settings as guided in the code.
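The pair-generation scheme described above can be sketched as follows. This is a Python illustration, not the repository's MATLAB code, and the function name `sample_pairs` is my own:

```python
import random

def sample_pairs(labels, n_pairs=2000, seed=0):
    """Sample index pairs: n_pairs from the same class ("similar")
    and n_pairs from different classes ("dissimilar")."""
    rng = random.Random(seed)
    n = len(labels)
    similar, dissimilar = [], []
    while len(similar) < n_pairs or len(dissimilar) < n_pairs:
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        if labels[i] == labels[j] and len(similar) < n_pairs:
            similar.append((i, j))
        elif labels[i] != labels[j] and len(dissimilar) < n_pairs:
            dissimilar.append((i, j))
    return similar, dissimilar

# Toy usage with three balanced classes, as in Iris:
labels = [0] * 50 + [1] * 50 + [2] * 50
sim, dis = sample_pairs(labels, n_pairs=10)
```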

  3. Quick-start regression

To test the code, we have provided a demo MATLAB script "example_regression.m", which runs PBDL for regression on synthetic data. If everything is working properly, you should see a plot of the regression error, which reaches around 0.05 with the full 100-point data set. The code file that performs the optimization is "PBR.m". You can change the method to "Mahalanobis regression" as guided in the code; you can also change the data set or other experiment settings.

  4. Other files

data: data sets that we used in our paper "Learning Bregman Divergences"

results: regression and pairwise comparison experiment results from our paper

figure: figures used in our paper

  5. Help

For any problems, please contact us at ali.siahkamari@gmail.com. We'll be happy to resolve your issue and improve our code. We also suggest reading our paper "Learning to Approximate Bregman Divergences".

  6. Notice

You can choose different methods in "example.m" by simply uncommenting them. These methods include ITML, Kernelized NCA, GMML, and Euclidean.
