This repository contains a PyTorch implementation of *Correspondence Learning via Linearly-invariant Embedding* (NeurIPS 2020).
This is not the code used to produce the paper results, which can be found here. This implementation is meant to make the method easier to use (and to replicate its results).
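At test time, the method matches two shapes by estimating a linear transformation between their learned embeddings from the learned descriptors, then matching points by nearest neighbour. A minimal sketch of that step, assuming precomputed network outputs (the function and variable names are illustrative, not the repository's actual API):

```python
import torch

def match(basis_x, basis_y, desc_x, desc_y):
    """Illustrative matching step (not the repository's API).

    basis_*: (n_points, k) learned embeddings; desc_*: (n_points, d) descriptors.
    Returns, for each point of X, the index of its match on Y.
    """
    # Descriptor coefficients in each learned basis: A = pinv(Phi) @ F
    a_x = torch.linalg.pinv(basis_x) @ desc_x          # (k, d)
    a_y = torch.linalg.pinv(basis_y) @ desc_y          # (k, d)
    # Linear map C between the embeddings, from C @ a_x ≈ a_y (least squares)
    c = a_y @ torch.linalg.pinv(a_x)                   # (k, k)
    # Align the target embedding, then match by nearest neighbour in R^k
    aligned_y = basis_y @ c                            # (n_y, k)
    return torch.cdist(basis_x, aligned_y).argmin(dim=1)
```

With corresponding descriptors and embeddings that differ by an invertible linear map, this recovers the ground-truth correspondence.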
To install requirements:

```
pip install -r requirements.txt
```
Installing PyTorch may require an ad hoc procedure, depending on your computer settings.
You can download the data and the pre-trained models using the scripts:

```
python .\data\download_data.py
python .\models\pretrained\download_pretrained.py
```
To train the basis and descriptor models, run:

```
python .\code\train_basis.py
python .\code\train_desc.py
```
To evaluate the model on FAUST with noise, run:

```
python .\code\test_faust.py
```

and then, in MATLAB, run the script:

```
.\evaluation\evaluation.m
```
These are the results of the two implementations:
Implementation | Ours | Ours+Opt |
---|---|---|
TF 1.5 | 6.0e-2 | 2.9e-2 |
PyTorch | 5.7e-2 | 3.1e-2 |
The small discrepancies have several causes:
- the basis and descriptor networks are trained for 400 epochs in the PyTorch implementation, versus several thousand in TF 1.5;
- although the two implementations are similar, there are some differences in the training process and hyperparameters due to the libraries;
- training requires pseudo-inverse computations, which can produce slightly different results depending on the library.
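On the last point, a standalone illustration (not code from this repository) of how pseudo-inverse results can differ across libraries and precisions, here comparing NumPy in float64 against PyTorch in float32 on an ill-conditioned matrix:

```python
import numpy as np
import torch

# Standalone illustration (not repository code): the pseudo-inverse of an
# ill-conditioned matrix differs slightly across libraries and precisions,
# and such discrepancies can compound over training.
a = np.vander(np.linspace(0.0, 1.0, 6))  # ill-conditioned 6x6 Vandermonde matrix
pinv_np = np.linalg.pinv(a)                                        # NumPy, float64
pinv_pt = torch.linalg.pinv(torch.tensor(a, dtype=torch.float32))  # PyTorch, float32
diff = np.abs(pinv_np - pinv_pt.double().numpy()).max()
print(f"max elementwise difference: {diff:.3e}")  # small but nonzero
```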
If you use this code, please cite our paper:
```
@article{marin2020correspondence,
  title={Correspondence learning via linearly-invariant embedding},
  author={Marin, Riccardo and Rakotosaona, Marie-Julie and Melzi, Simone and Ovsjanikov, Maks},
  journal={Advances in Neural Information Processing Systems},
  volume={33},
  year={2020}
}
```
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. For any commercial uses or derivatives, please contact us.