
Code for "Octree Guided Unoriented Surface Reconstruction" (CVPR2023)


Octree Guided Unoriented Surface Reconstruction (CVPR 2023)

Created by Chamin Hewa Koneputugodage, Yizhak Ben-Shabat (Itzik) and Stephen Gould from ANU and Technion.

Project page / Paper / Video


Introduction

This is the code for OG-INR (Octree-Guided Implicit Neural Representations).

Please follow the installation instructions below.

Instructions

1. Requirements

Our codebase uses PyTorch.

The code was tested with Python 3.7.9, torch 1.9.0, and CUDA 11.3 on Ubuntu 18.04 (later versions should also work). For a full list of requirements, see the requirements.txt file. Note that we also use plotly-orca for visualisation, which needs to be installed from conda.

Example installation commands (install PyTorch separately, matching your CUDA version):

conda create -n oginr python=3.7.9
conda activate oginr
conda install pip # for using pip commands in the conda env
pip install -r requirements.txt

pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 torchaudio==0.11.0 --extra-index-url https://download.pytorch.org/whl/cu113

2. Build Octree Code

cd octree_base
python setup.py build_ext --inplace
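The octree code is a Cython extension, so `build_ext --inplace` compiles it next to the Python sources. For reference, a minimal `setup.py` for such an extension typically looks like the sketch below; the module and file names here are illustrative, and the actual file in `octree_base/` may add compiler flags or other options:

```python
# Hypothetical minimal setup.py for a Cython extension module.
# The real octree_base/setup.py may differ in names and build options.
from setuptools import setup
from Cython.Build import cythonize
import numpy as np

setup(
    name="octree_base",
    # Compiles the .pyx source(s) into a C extension importable from Python
    ext_modules=cythonize("octree.pyx", language_level="3"),
    include_dirs=[np.get_include()],  # needed if the .pyx uses NumPy arrays
)
```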

3. Surface Reconstruction (and Scene Reconstruction)

3.1 Data for Surface Reconstruction

3.1.1 Surface Reconstruction Benchmark data

The Surface Reconstruction Benchmark (SRB) data is provided in the Deep Geometric Prior repository. This can be downloaded via terminal into the data directory by running data/scripts/download_srb.sh (1.12GB download). We use the entire dataset (of 5 complex shapes).

If you use this data in your research, make sure to cite the Deep Geometric Prior paper.

3.1.2 ShapeNet data

We use a subset of the ShapeNet data as chosen by Neural Splines. This data is first preprocessed to be watertight following the pipeline in the Occupancy Networks repository, which provides both the pipeline and the entire preprocessed dataset (73.4GB).

The Neural Splines split uses the first 20 shapes from the test set of 13 shape classes from ShapeNet. We provide the subset of the preprocessed ShapeNet data corresponding to this split, along with the resulting point clouds. These can be downloaded via terminal into the data directory by running data/scripts/download_shapenet.sh (783.76MB download).

If you use this data in your research, make sure to cite the ShapeNet and Occupancy Networks papers, and if you report on this split, compare against and cite the Neural Splines paper.

3.2 Running Surface Reconstruction

To train, test, and evaluate on SRB, run

./scripts/srb.sh

Similarly, we provide a script for ShapeNet:

./scripts/shapenet.sh

These scripts expose bash variables for changing the input data, the major hyperparameters, and where saves, logs, and meshes are written.
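Scripts structured this way commonly read their variables with default-value expansion, so they can be overridden from the environment at invocation time. The variable names below are illustrative, not necessarily the ones used in these scripts:

```shell
# Hypothetical pattern: fall back to a default if the variable is unset
DATA_DIR=${DATA_DIR:-data/srb}
LOG_DIR=${LOG_DIR:-log/srb}
echo "reading data from ${DATA_DIR}, logging to ${LOG_DIR}"
```

With this pattern, `DATA_DIR=/my/scans ./scripts/srb.sh` would redirect the input without editing the script; otherwise, edit the assignments at the top of the script directly.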

Thanks

This is heavily based on our DiGS codebase. We also use trimesh and open3d for manipulating 3D data, cython for efficiently implementing the octree code, pyvista for an interactive view of the octree, pyrallis for easy configs and arguments, and wandb for optional logging. After the paper was published, we discovered that point-cloud-utils has excellent bindings for very fast approximate nearest neighbours (nanoflann), which makes part of the code much faster.
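To illustrate the nearest-neighbour queries involved, a brute-force version in plain NumPy is sketched below; KD-tree bindings such as nanoflann (via point-cloud-utils) return the same result but scale far better on large point clouds. This is an illustration of the operation, not the repository's implementation:

```python
import numpy as np

def nearest_neighbors(queries, points):
    """For each query point, return the index of and distance to the
    closest point in `points`. O(N*M) brute force; a KD-tree (e.g.
    nanoflann via point-cloud-utils) answers the same queries in
    roughly O(N log M)."""
    # (N, M) matrix of pairwise squared distances
    d2 = ((queries[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    idx = d2.argmin(axis=1)
    return idx, np.sqrt(d2[np.arange(len(queries)), idx])

pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
q = np.array([[0.9, 0.0, 0.0]])
idx, dist = nearest_neighbors(q, pts)
# idx -> [1], dist -> [0.1]: the query is closest to the point at x=1
```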

Thanks to @bearprin for helping with installation issues.

This work has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No. 893465. S. Gould is a recipient of an ARC Future Fellowship (proj. no. LP200100421) funded by the Australian Government.

License and Citation

If you find our work useful in your research, please cite our paper:

@inproceedings{koneputugodage2023octree,
        title={Octree Guided Unoriented Surface Reconstruction},
        author={Koneputugodage, Chamin Hewa and Ben-Shabat, Yizhak and Gould, Stephen},
        booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
        pages={16717--16726},
        year={2023}
      }

See the LICENSE file.
