
Decorate3D: Text-Driven High-Quality Texture Generation for Mesh Decoration in the Wild

Introduction

This paper presents Decorate3D, a versatile and user-friendly method for creating and editing the textures of real-world 3D objects captured in images. Decorate3D models a real-world object of interest with a neural radiance field (NeRF) and decomposes the NeRF representation into an explicit mesh, a view-dependent texture, and a diffuse UV texture. Users can then either manually edit the UV texture or provide a text prompt for the automatic generation of a new 3D-consistent texture. To achieve high-quality 3D texture generation, we propose a structure-aware score distillation sampling (SDS) method that optimizes a neural UV texture from the user-provided text and empowers an image diffusion model with 3D-consistent generation capability. Furthermore, we introduce a few-view resampling training method and utilize a super-resolution model to obtain refined high-resolution UV textures (2048$\times$2048) for 3D texturing. Extensive experiments collectively validate the superior performance of Decorate3D in retexturing real-world 3D objects. Project page: https://decorate3d.github.io/Decorate3D/.
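The decomposition above makes the diffuse UV texture an explicit, optimizable asset. As a rough illustration (not the paper's code), a learnable UV texture map can be queried differentiably with the per-pixel UV coordinates produced by rasterizing the extracted mesh; the `DiffuseUVTexture` class and its defaults below are assumptions made for this sketch:

```python
# Minimal PyTorch sketch of a learnable diffuse UV texture, assuming UV
# coordinates come from rasterizing the extracted mesh. Illustrative only.
import torch
import torch.nn.functional as F

class DiffuseUVTexture(torch.nn.Module):
    def __init__(self, resolution=2048, channels=3):
        super().__init__()
        # Learnable texture map, later optimized by the texture-generation loss.
        self.texture = torch.nn.Parameter(
            torch.rand(1, channels, resolution, resolution))

    def forward(self, uv):
        # uv: (N, 2) coordinates in [0, 1] from the rasterizer.
        # grid_sample expects coordinates in [-1, 1] and a (1, N, 1, 2) grid.
        grid = (uv * 2.0 - 1.0).view(1, -1, 1, 2)
        rgb = F.grid_sample(self.texture, grid,
                            mode="bilinear", align_corners=True)
        return rgb.view(self.texture.shape[1], -1).t()  # (N, 3) diffuse colors

# Usage: sample colors for the UVs of visible pixels in one rendered view.
tex = DiffuseUVTexture()
uv = torch.rand(4096, 2)   # stand-in for rasterizer output
colors = tex(uv)           # (4096, 3), differentiable w.r.t. the texture map
```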

Dataset

The datasets collected for our experiments are available via Google Drive. They consist of images, meshes, and UV textures of 14 real-world objects.

Results

We provide sample results under `docs/samples/`, including 4 textured meshes and the 360-degree view videos shown on our project page.

Code

The pipeline can be reproduced from existing open-source components. The following references may help.

  • 3D scene reconstruction with NeRF (extracting the mesh and UV parameterization): NeuS
  • Depth-conditioned diffusion model: Diffusion
  • SDS sample code for 3D generation: Threestudio (a minimal SDS sketch follows this list)
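For orientation, here is a hedged sketch of one SDS update in the spirit of the structure-aware variant: a depth map rendered from the fixed mesh conditions the diffusion model, so the generated texture respects the object's geometry. `render`, `encode_latents`, and `unet` are hypothetical stand-ins (a differentiable renderer over the UV texture, a VAE encoder, and a depth-conditioned U-Net), not functions from the paper or Threestudio:

```python
# One score distillation sampling (SDS) step on a renderable UV texture.
# A sketch under stated assumptions, not the paper's implementation.
import torch

def sds_step(render, encode_latents, unet, text_emb,
             alphas_cumprod, optimizer, num_train_timesteps=1000):
    rgb, depth = render()              # differentiable render of the current texture
    latents = encode_latents(rgb)      # e.g. (1, 4, 64, 64) VAE latents
    t = torch.randint(20, num_train_timesteps, (1,), device=latents.device)
    noise = torch.randn_like(latents)
    a_t = alphas_cumprod[t].view(-1, 1, 1, 1)  # cumulative alpha at step t
    noisy = a_t.sqrt() * latents + (1.0 - a_t).sqrt() * noise
    with torch.no_grad():
        # Depth conditioning keeps the structure fixed while the texture changes.
        eps_pred = unet(noisy, t, text_emb, depth)
    w = 1.0 - a_t                      # a common SDS weighting choice
    grad = w * (eps_pred - noise)
    # SDS trick: treat `grad` as d(loss)/d(latents) and skip the U-Net Jacobian.
    loss = (grad.detach() * latents).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Repeating this step over randomly sampled camera views, with the optimizer updating only the texture parameters inside `render`, drives the UV texture toward the text prompt while the depth conditioning preserves the mesh structure.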

For more details, please refer to the paper.

Citation

@inproceedings{guo2023decorate3d,
    title={Decorate3D: Text-Driven High-Quality Texture Generation for Mesh Decoration in the Wild},
    author={Guo, Yanhui and Zuo, Xinxin and Dai, Peng and Lu, Juwei and Wu, Xiaolin and Cheng, Li and Yan, Youliang and Xu, Songcen and Wu, Xiaofei},
    booktitle={Thirty-seventh Conference on Neural Information Processing Systems (NeurIPS)},
    year={2023},
}
