
Ref-NeuS: Ambiguity-Reduced Neural Implicit Surface Learning for Multi-View Reconstruction with Reflection (ICCV 2023, Oral, Best Paper Award Nomination, Top 0.2%)

This is the official repository for the implementation of Ref-NeuS: Ambiguity-Reduced Neural Implicit Surface Learning for Multi-View Reconstruction with Reflection, by Wenhang Ge, Tao Hu, Haoyu Zhao, Shu Liu, and Ying-Cong Chen.

Setup

Installation

This code is built with PyTorch 1.11.0. See requirements.txt for the required Python packages.

You can create an Anaconda environment called refneus with the required dependencies by running:

conda create -n refneus python=3.7 
conda activate refneus  
conda install pytorch==1.11.0 torchvision==0.12.0 cudatoolkit=11.3 -c pytorch
pip install -r requirements.txt

Data

Download the ShinyBlender dataset.

Download the GT dense point cloud for evaluation from Google Drive.

Make sure the data is organized as follows (shown here for the object helmet):

+-- ShinyBlender
|   +-- helmet
|       +-- test
|       +-- train
|       +-- dense_pcd.ply
|       +-- points_of_interest.ply
|       +-- test_info.json
|       +-- transforms_test.json
|       +-- transforms_train.json
|   +-- toaster
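The layout above can be verified with a short helper before launching any experiment. This is an illustrative sketch, not part of the repo; the path `ShinyBlender/helmet` in the usage example is hypothetical.

```python
from pathlib import Path

# Entries this README expects inside each object directory.
EXPECTED = [
    "test", "train",
    "dense_pcd.ply", "points_of_interest.ply",
    "test_info.json", "transforms_test.json", "transforms_train.json",
]

def missing_entries(object_dir):
    """Return the expected files/folders that are absent under object_dir."""
    root = Path(object_dir)
    return [name for name in EXPECTED if not (root / name).exists()]
```

For example, `missing_entries("ShinyBlender/helmet")` should return an empty list when the data is complete.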

Evaluation with pretrained model

Download the pretrained models: one set for reconstruction evaluation and one for PSNR evaluation.

Run the evaluation script with

python exp_runner.py --mode validate_mesh --conf ./confs/womask.conf --ckpt_path ckpt_path

ckpt_path is the path to the pretrained model.

Make sure that data_dir in the configuration file ./confs/womask.conf points to the same object as the pretrained model.

The output mesh will be in base_exp_dir/meshes. You can specify the path base_exp_dir in the configuration file.
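A minimal sketch of the two settings this README references, assuming the NeuS-style HOCON layout of ./confs/womask.conf (the actual file contains many more fields, which are not shown here; the paths are placeholders):

```
general {
    base_exp_dir = ./exp/helmet/womask   # where meshes, result.txt, and videos are written
}

dataset {
    data_dir = ./ShinyBlender/helmet     # must match the object the checkpoint was trained on
}
```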

The evaluation metrics will be written to base_exp_dir/result.txt.

The error visualization is written to base_exp_dir/vis_d2s.ply. Points with large errors are marked in red.
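The red marking described above can be reproduced with a small helper like the following. This is an illustrative sketch, not code from the repo; the function name and the error threshold are hypothetical.

```python
def error_colors(distances, threshold=0.005):
    """Map per-point distance-to-surface errors to RGB colors:
    red for points whose error exceeds the threshold, gray otherwise."""
    return [
        (255, 0, 0) if d > threshold else (128, 128, 128)
        for d in distances
    ]
```

The resulting colors could then be attached to the evaluated point cloud and saved as a colored PLY with a mesh library such as trimesh or open3d.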

Our final mesh results can also be downloaded here.

We also provide a function to render videos of surface normals and novel view synthesis. Run the visualization script with

python exp_runner.py --mode visualize_video --conf ./confs/womask.conf --ckpt_path ckpt_path

The output videos will be in base_exp_dir/normals.mp4 and base_exp_dir/video.mp4.

Train a model from scratch

Run the training script with

python exp_runner.py --mode train --conf ./confs/womask.conf

Citation

If you find our work useful in your research, please consider citing:

@article{ge2023ref,
  title={Ref-NeuS: Ambiguity-Reduced Neural Implicit Surface Learning for Multi-View Reconstruction with Reflection},
  author={Ge, Wenhang and Hu, Tao and Zhao, Haoyu and Liu, Shu and Chen, Ying-Cong},
  journal={arXiv preprint arXiv:2303.10840},
  year={2023}
}

Acknowledgments

Our code is partially based on the NeuS project, and some code snippets are borrowed from NeuralWarp. Thanks to the authors of these great projects.
