GenS is a generalizable neural surface reconstruction method that recovers high-frequency details while maintaining global smoothness through generalized multi-scale volumes. It leverages multi-scale feature-metric consistency to enforce multi-view consistency in the more robust feature space. Meanwhile, using the more accurate geometry from dense inputs to teach the model with sparse inputs further improves performance. Details are described in our paper:
GenS: Generalizable Neural Surface Reconstruction from Multi-View Images
Rui Peng, Xiaodong Gu, Luyang Tang, Shihe Shen, Fanqi Yu, Ronggang Wang
NeurIPS 2023 (arXiv)
The reconstruction results on DTU dataset can be downloaded here.
Code will be released soon!
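Until the official code is released, here is a minimal, illustrative sketch of the core idea behind feature-metric consistency: features sampled from multiple views at the same surface points should agree, so their cross-view variance can serve as a loss, averaged over scales. The function name, array shapes, and variance-based formulation are our assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np

def feature_consistency_loss(features_per_scale):
    """Hypothetical multi-scale feature-metric consistency loss.

    features_per_scale: list over scales; each entry has shape (V, N, C):
    V views' feature vectors sampled at N surface points, C channels.
    Returns the mean cross-view feature variance, averaged over scales.
    """
    loss = 0.0
    for feat in features_per_scale:
        mean = feat.mean(axis=0, keepdims=True)   # (1, N, C): per-point mean over views
        var = ((feat - mean) ** 2).mean(axis=0)   # (N, C): per-point cross-view variance
        loss += var.mean()                        # scalar for this scale
    return loss / len(features_per_scale)
```

When all views observe identical features at a point, the loss is zero; inconsistent features across views are penalized in feature space rather than raw pixel space, which is typically more robust to lighting and exposure changes.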