(Demo video: `transfer_city.mp4`)
This repository contains the official implementation (source code and pre-trained models) of our paper: "Urban Architect: Steerable 3D Urban Scene Generation with Layout Prior", a novel pipeline for large-scale 3D urban scene generation!
- 2024.04.26: We release samples of layout in `dataset/data` and pretrained ControlNet weights!
- 2024.04.11: The 🔥🔥🔥 pre-print 🔥🔥🔥 is released! Refer to it for more details!
- 2024.04.10: The project page is created. Check it out for an overview of our work!
- Main requirements:
  - Download the pretrained model weights of Stable Diffusion
  - Download the pretrained model weights of CLIP
  - Download the pretrained model weights of ControlNet
- Other requirements are provided in `requirements.txt`.
Please refer to `train.sh` for training, `refine.sh` for refinement, and `render.sh` for rendering.
- Release 3D layout data
- Technical Report
- Project page
- (ICCV 2023) Urban Radiance Field Representation with Deformable Neural Mesh Primitives, Fan Lu et al. [Paper], [Project Page]
If you find this project useful for your work, please consider citing:
```bibtex
@article{lu2024urban,
  title={Urban Architect: Steerable 3D Urban Scene Generation with Layout Prior},
  author={Lu, Fan and Lin, Kwan-Yee and Xu, Yan and Li, Hongsheng and Chen, Guang and Jiang, Changjun},
  journal={arXiv preprint arXiv:2404.06780},
  year={2024}
}
```