GeoFormer for Homography Estimation

This is the official source code of our ICCV 2023 paper: Geometrized Transformer for Self-Supervised Homography Estimation.

(Figure: method illustration)

Environment

We used Anaconda to set up a deep learning workspace that supports PyTorch. Run the following commands to install all the required packages.

conda create -n GeoFormer python==3.8 -y
conda activate GeoFormer
git clone https://github.com/ruc-aimc-lab/GeoFormer.git
cd GeoFormer
pip install -r requirements.txt
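
A quick way to confirm the installation succeeded (this only assumes that requirements.txt installs PyTorch, as stated above):

# verify that PyTorch imports and whether a GPU is visible
import torch

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())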

Downloads

Data

For Training

GeoFormer can be trained either on the artificially synthesized Oxford-Paris dataset or on the MegaDepth dataset, which includes depth labels. The data needs to be organized in the format specified below.

  • Oxford-Paris: download Oxford 5k and Paris 6k, extract them into a single folder, and move it into the data/ directory;
  • MegaDepth: follow the process outlined in the LoFTR project to organize it from scratch.

The training data should be organized as follows:
data/
    datasets/
        Oxford-Paris/
            oxbuild_images/ 
                all_souls_000000.jpg
                ....
            paris/
                defense/
                eiffel/
                ....  
        Megadepth/
            index/
            train/
            test/
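
Before launching a run, the expected layout can be sanity-checked with a few lines of Python; the paths simply mirror the listing above (a hypothetical helper, not part of the repository):

# check that the training-data directories listed above are in place
import os

expected = [
    "data/datasets/Oxford-Paris/oxbuild_images",
    "data/datasets/Oxford-Paris/paris",
    "data/datasets/Megadepth/index",
    "data/datasets/Megadepth/train",
    "data/datasets/Megadepth/test",
]
for path in expected:
    print("ok     " if os.path.isdir(path) else "MISSING", path)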

For Testing

For testing, the data should be organized as follows:

data/
    datasets/
        FIRE/
            Ground Truth/
            Images/
            Masks/
        
        hpatches-sequences-release/
            i_ajuntament/
            ....
            
        ISC-HE/
            gd/
            query/
            refer/
            index.txt
  • Note that the annotation file control_points_P37_1_2.txt in the FIRE dataset is incorrect, so it is excluded from evaluation.
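
When collecting the FIRE ground-truth files, the faulty annotation can simply be filtered out, e.g. (a sketch based on the layout above; the glob pattern is an assumption about the annotation file names):

# gather FIRE ground-truth files, skipping the known-bad annotation noted above
import glob
import os

gt_dir = "data/datasets/FIRE/Ground Truth"
gt_files = sorted(glob.glob(os.path.join(gt_dir, "*.txt")))
gt_files = [f for f in gt_files if os.path.basename(f) != "control_points_P37_1_2.txt"]
print(len(gt_files), "ground-truth files kept for evaluation")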

Models

You may skip the training stage and use our provided models for homography estimation.

Put the trained model geoformer.ckpt into the saved_ckpt/ folder.

The model config can be found in model/geo_config.py.
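
To confirm the checkpoint is readable before running inference, it can be inspected with plain PyTorch (a sketch; whether the weights are nested under a "state_dict" key depends on how the checkpoint was saved):

# inspect saved_ckpt/geoformer.ckpt with plain PyTorch
import torch

ckpt = torch.load("saved_ckpt/geoformer.ckpt", map_location="cpu")
# Lightning-style checkpoints usually nest the weights under "state_dict";
# otherwise treat the file as a raw state dict.
state_dict = ckpt.get("state_dict", ckpt) if isinstance(ckpt, dict) else ckpt
print(len(state_dict), "tensors, e.g.:")
for name in list(state_dict)[:5]:
    print(" ", name, tuple(state_dict[name].shape))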

Code

Training

Before training GeoFormer, the dataset parameters can be modified in train_config/homo_trainval_640.py (Oxford-Paris) or train_config/megadepth_trainval_640.py (MegaDepth). The training hyperparameters are set in train_config/loftr_ds_dense.py.

python -m lightning.train_depth_geoformer
or
python -m lightning.train_homo_geoformer
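
For intuition on the self-supervised Oxford-Paris setup mentioned above, the sketch below synthesizes a training pair from a single image by sampling a random homography. It assumes OpenCV and NumPy are installed and is only an illustration of the idea, not the repository's actual augmentation pipeline:

# build a self-supervised training pair: warp an image by a random homography
import cv2
import numpy as np

img = cv2.imread("data/datasets/Oxford-Paris/oxbuild_images/all_souls_000000.jpg")
assert img is not None, "image not found; adjust the path to any Oxford-Paris image"
h, w = img.shape[:2]

# perturb the four corners by up to 15% of the image size
src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
jitter = np.random.uniform(-0.15, 0.15, size=(4, 2)) * [w, h]
dst = (src + jitter).astype(np.float32)

H = cv2.getPerspectiveTransform(src, dst)      # 3x3 ground-truth homography
warped = cv2.warpPerspective(img, H, (w, h))   # the second view of the pair
# (img, warped, H) is a training example whose supervision comes for free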

Inference

Homography Estimation Performance

Our evaluation code is built on this benchmark.

The eval_Hpatches.py script shows how homography estimation is evaluated on the HPatches dataset.

python eval_Hpatches.py

If everything goes well, you should see output like the following:

==== Homography Estimation ====
Hest solver=cv est_failed=0 ransac_thres=3 inlier_rate=0.89
Hest Correct: a=[0.7    0.8845 0.9379 0.9603]
i=[0.8772 0.986  0.9965 0.9965]
v=[0.5288 0.7864 0.8814 0.9254]
Hest AUC: a=[0.5154 0.7206 0.7997 0.8768]
i=[0.7634 0.8913 0.9319 0.9642]
v=[0.2774 0.5572 0.6735 0.7931]
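
In the output, a, i, and v report the metric over all HPatches sequences, the illumination-change sequences, and the viewpoint-change sequences, respectively. Correct and AUC are corner-error metrics: the four image corners are projected with the estimated and ground-truth homographies, their mean distance is the error, and the arrays report the fraction of pairs (or the area under the recall curve) below a set of pixel thresholds. A minimal sketch of that computation (the exact thresholds and averaging in eval_Hpatches.py may differ):

# corner-error metric behind the "Correct" and "AUC" numbers (illustrative sketch)
import numpy as np

def corner_error(H_est, H_gt, w, h):
    """Mean distance between the four image corners mapped by H_est and H_gt."""
    corners = np.array([[0, 0, 1], [w, 0, 1], [w, h, 1], [0, h, 1]], dtype=np.float64).T
    p_est = H_est @ corners
    p_gt = H_gt @ corners
    p_est = p_est[:2] / p_est[2:]
    p_gt = p_gt[:2] / p_gt[2:]
    return float(np.linalg.norm(p_est - p_gt, axis=0).mean())

def accuracy_at(errors, thresholds=(1, 3, 5, 10)):
    """Fraction of image pairs whose corner error falls below each threshold."""
    errors = np.asarray(errors, dtype=np.float64)
    return [float((errors <= t).mean()) for t in thresholds]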

The other test sets can be evaluated similarly.

python eval_FIRE.py

python eval_ISC.py

One Pair Inference

If you just want to match a single image pair and obtain the matching results, run inference.py.
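
inference.py produces the matches; as a downstream illustration, a homography can then be fitted from the matched keypoints and used to warp one image onto the other with OpenCV. This is a sketch, not part of the repository; mkpts0 and mkpts1 stand for whatever (N, 2) match arrays the inference code returns:

# fit a homography from matched keypoints and warp image A onto image B
import cv2
import numpy as np

def warp_with_matches(img_a, img_b, mkpts0, mkpts1, ransac_thresh=3.0):
    """mkpts0, mkpts1: (N, 2) arrays of matched (x, y) coordinates in A and B."""
    H, inlier_mask = cv2.findHomography(
        np.asarray(mkpts0, np.float32),
        np.asarray(mkpts1, np.float32),
        cv2.RANSAC,
        ransac_thresh,
    )
    h, w = img_b.shape[:2]
    warped_a = cv2.warpPerspective(img_a, H, (w, h))
    return H, inlier_mask, warped_a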

Citations

If you find this repository useful, please consider citing:

@inproceedings{liu2023geoformer,
  title={Geometrized Transformer for Self-Supervised Homography Estimation},
  author={Jiazhen Liu and Xirong Li},
  booktitle={ICCV},
  year={2023}
}

Contact

If you encounter any issues when running the code, please feel free to reach us either by creating a new issue on GitHub or by email.
