This repository has been archived by the owner on Feb 8, 2023. It is now read-only.



XResolution Correspondence Networks


Please check out our paper and supplementary material here.


If you use our code or data, please consider citing us as follows:

    @inproceedings{xrcnet,
        title={{$\mathbb{X}$}Resolution Correspondence Networks},
        author={Tinchev, Georgi and Li, Shuda and Han, Kai and Mitchell, David and Kouskouridas, Rigas},
        booktitle={Proceedings of British Machine Vision Conference (BMVC)},
    }


Installation

  1. Install conda

  2. Run:

conda env create --name <environment_name> --file asset/xrcnet.txt

To activate the environment, run

conda activate xrcnet

Preparing data

We train our model on the MegaDepth dataset. To prepare the data, download the MegaDepth SfM models from the MegaDepth website, then download training_pairs.txt from here and validation_pairs.txt from here.


  1. After downloading the training data, edit the configuration file under config/ to specify the dataset location and the paths to the training_pairs.txt and validation_pairs.txt files downloaded above
  2. Run:
cd config;
bash -g <gpu_id> -c configs/xrcnet.json
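Step 1 above can be sketched as follows. Note that every JSON key name in this snippet (dataset_path, training_pairs, validation_pairs) is an assumption for illustration, not the repository's actual schema; check configs/xrcnet.json for the real keys.

```python
# Hypothetical sketch of step 1: point the training config at the
# downloaded data. Key names are assumptions, not the repo's schema.
import json
from pathlib import Path

cfg_file = Path("configs/xrcnet.json")  # path used by the training command

cfg = {
    "dataset_path": "/data/MegaDepth",             # MegaDepth SfM models root
    "training_pairs": "/data/training_pairs.txt",  # downloaded above
    "validation_pairs": "/data/validation_pairs.txt",
}
cfg_file.parent.mkdir(parents=True, exist_ok=True)
cfg_file.write_text(json.dumps(cfg, indent=2))

# Round-trip to confirm the file is valid JSON before launching training.
loaded = json.loads(cfg_file.read_text())
print(loaded["dataset_path"])
```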

Pre-trained model

We also provide our pre-trained model. You can download xrcnet.pth.tar from here and place it under the trained_models directory.

Evaluation on HPatches

The dataset can be downloaded from HPatches repo. You need to download HPatches full sequences.
After downloading the dataset:

  1. Browse to HPatches/
  2. Run python --checkpoint path/to/model --root path/to/parent/directory/of/hpatches_sequences. This will generate a text file storing the results in the current directory.
  3. Open the plotting script, change the relevant paths accordingly, and run it to draw the results.
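The generated text file comes from the repository's own evaluation script, but as a rough sketch of how HPatches matching accuracy is conventionally computed: a predicted match (x1, y1) -> (x2, y2) counts as correct when the ground-truth homography H maps (x1, y1) to within a pixel threshold of (x2, y2). The function names below are illustrative, not from the repository.

```python
# Illustrative sketch of the standard HPatches matching metric; the
# repository's own script is what actually produces the results file.
import math

def project(H, x, y):
    # Apply a 3x3 homography (row-major nested lists) to the point (x, y).
    d = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / d,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / d)

def match_accuracy(H, matches, thr=3.0):
    # Fraction of matches whose reprojection error is below thr pixels.
    ok = 0
    for (x1, y1), (x2, y2) in matches:
        px, py = project(H, x1, y1)
        if math.hypot(px - x2, py - y2) <= thr:
            ok += 1
    return ok / len(matches)

# Identity homography: one exact match and one match that is 5 px off.
H = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
matches = [((10, 10), (10, 10)), ((20, 20), (25, 20))]
print(match_accuracy(H, matches, thr=3.0))  # 0.5
```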

We provide results for XRCNet alongside other baseline methods in the cache-top directory.

Evaluation on InLoc

In order to run the InLoc evaluation, you first need to clone the InLoc demo repo and download and compile all the required dependencies. Then:

  1. Browse to inloc/.
  2. Run the evaluation script with python, adjusting the checkpoint and experiment name. This will generate a series of match files in the inloc/matches/ directory, which then need to be fed to the InLoc evaluation Matlab code.
  3. Modify the provided inloc/eval_inloc_compute_poses.m file to indicate the path of the InLoc demo repo and the name of the experiment (the particular directory name inside inloc/matches/), then run it using Matlab.
  4. Use the inloc/eval_inloc_generate_plot.m file to plot the results from the shortlist file generated in the previous stage: /your_path_to/InLoc_demo_old/experiment_name/shortlist_densePV.mat. Precomputed shortlist files are provided in inloc/shortlist.

Evaluation on Aachen Day-Night

In order to run the Aachen Day-Night evaluation, you first need to clone the visual localization benchmark repo and download and compile all the required dependencies (note that you'll need to compile COLMAP if you have not done so yet). Then:

  1. Browse to aachen_day_and_night/.
  2. Run the evaluation script with python, adjusting the checkpoint and experiment name.
  3. Copy the file to visuallocalizationbenchmark/local_feature_evaluation and run it in the following way:
	--dataset_path /path_to_aachen/aachen 
	--colmap_path /local/colmap/build/src/exe
	--method_name experiment_name
  4. Upload the file /path_to_aachen/aachen/Aachen_eval_[experiment_name].txt to the online evaluation service to get the results on this benchmark.
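Putting steps 2-3 together, a full invocation might look like the following sketch. The script name reconstruction_pipeline.py is an assumption (check the benchmark repo for its actual entry point); the flags are the ones listed above, and the paths are placeholders.

```shell
# Hypothetical invocation: the script name is an assumption, the paths
# are placeholders; only the three flags come from the steps above.
cd visuallocalizationbenchmark/local_feature_evaluation
python reconstruction_pipeline.py \
	--dataset_path /path_to_aachen/aachen \
	--colmap_path /local/colmap/build/src/exe \
	--method_name experiment_name
```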


Our code is based on the code provided by DualRCNet, NCNet, Sparse-NCNet, and ANC-Net.

