
ClimateNeRF: Extreme Weather Synthesis in Neural Radiance Field

This is the official repository for the PyTorch implementation of the paper "ClimateNeRF: Extreme Weather Synthesis in Neural Radiance Field", ICCV 2023.

teaser.mp4

🌦️ Prerequisites

This project has been tested on:

  • Ubuntu 18.04.6 with CUDA 11.3
  • NVIDIA RTX 3090
  • Python package manager conda

🌦️ Setup

Environment

  • First clone this repository: git clone --recursive https://github.com/y-u-a-n-l-i/Climate_NeRF.git
  • Create and activate the environment with conda create -n climatenerf python=3.8 and conda activate climatenerf.
  • Install torch and torchvision with pip install torch==1.11.0 torchvision==0.12.0 --extra-index-url https://download.pytorch.org/whl/cu113
  • Install torch-scatter with pip install torch-scatter -f https://data.pyg.org/whl/torch-1.11.0+cu113.html
  • Install the PyTorch extension from tinycudann.
    • Installing with float32 precision, as described in #51, is recommended.
  • Install mmsegmentation and download the config and checkpoint files according to their instructions.
    • The segmentation model segformer_mit-b5_8xb1-160k_cityscapes-1024x1024 is recommended.
  • Install the dependencies of the shadow predictor MTMT with pip3 install --no-build-isolation git+https://github.com/lucasb-eyer/pydensecrf.git.
    • Download the MTMT checkpoint from its official repo.
  • Install the remaining dependencies with pip install -r requirements.txt
  • Install the CUDA extension with pip install models/csrc (pip >= 22.1 is required). Recompile the CUDA extension after any modification. A consolidated shell sketch of these steps follows below.
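
The steps above, collected into a single shell sketch. The tiny-cuda-nn install command is an assumption based on its upstream documentation, and mmsegmentation setup is only noted as a comment; follow the bullets above if they disagree.

git clone --recursive https://github.com/y-u-a-n-l-i/Climate_NeRF.git
cd Climate_NeRF
conda create -n climatenerf python=3.8 && conda activate climatenerf
pip install torch==1.11.0 torchvision==0.12.0 --extra-index-url https://download.pytorch.org/whl/cu113
pip install torch-scatter -f https://data.pyg.org/whl/torch-1.11.0+cu113.html
# tiny-cuda-nn PyTorch bindings; build with float32 precision as recommended in #51
pip install git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
# install mmsegmentation per its documentation and download the SegFormer config/checkpoint
pip3 install --no-build-isolation git+https://github.com/lucasb-eyer/pydensecrf.git
pip install -r requirements.txt
pip install models/csrc    # requires pip >= 22.1
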
Potential bugs
  1. Bug: when installing tinycudann
...
{PATH_TO}/tiny-cuda-nn/dependencies/json/json.hpp:3954:14: fatal error: filesystem: No such file or directory
    #include <filesystem>
            ^~~~~~~~~~~~

The solution in NVlabs/tiny-cuda-nn#352 is recommended. If CUDA 11.3 is used, gcc-9 is recommended.

  2. Bug: when installing pydensecrf
'MatrixXf' is not a type identifier

The solution in lucasb-eyer/pydensecrf#123 (comment) is recommended.

  3. Bug: when using the shadow predictor:
No such file or directory: '/media/data/chenzhihao/code/MTMT/backbone_pth/resnext_101_32x4d.pth'

Download the ResNeXt model from this link provided by MTMT and change resnext_101_32_path in datasets/shadow_tools/MTMT/networks/resnext/config.py to the location of the ResNeXt checkpoint.

Dataset

  1. TanksAndTemple dataset:

We use the download link from ARF. Download and extract it with:

pip install gdown
gdown 10Tj-0uh_zIIXf0FZ6vT7_te90VsDnfCU
unzip TanksAndTempleBG.zip && mv TanksAndTempleBG tnt
  2. KITTI-360 dataset:

We use the download link from Panoptic NeRF and the same data folder structure as Panoptic NeRF.

  3. COLMAP dataset:

We mainly test our project on the garden scene from the mip-NeRF 360 dataset (a download sketch follows).
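
A download sketch, assuming the archive from the official mip-NeRF 360 project page (URL and layout may change; the garden scene ends up under 360_v2/garden):

wget http://storage.googleapis.com/gresearch/refraw360/360_v2.zip
unzip 360_v2.zip -d 360_v2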

Train

Scene reconstruction with semantic predictions.

To train our model with semantic predictions, set render_semantic to True in the config files. In addition, set $SEM_CONF and $SEM_CKPT to the paths of the semantic config file and the predictor checkpoint downloaded from mmsegmentation.
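
For example, in the scene's config file (a sketch, assuming the key = value style of the .txt configs under configs/):

render_semantic = True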

Model Parameters

  • Download the plane parameters here; they are used in the flood simulation. Put the scene-specific plane.npy in the dataset folder (e.g. TanksAndTempleBG/Playground/plane.npy).
  • To estimate plane parameters for new scenes, run the following script (an example invocation is sketched below):
python -m utility.vanishing_point --dataset <DATASET_TYPE> -root_dir <DATA_ROOT> -output plane.npy
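
For example, for the Playground scene (the dataset-type value tnt is an assumption; pass whichever dataset key you use for training):

python -m utility.vanishing_point --dataset tnt -root_dir ${PATH_TO}/TanksAndTempleBG/Playground -output plane.npy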

🌦️ Usage

The configuration of each scene can be adjusted in the config files under configs/, and we provide training/rendering/simulation scripts for some of the scenes under scripts/.

In the following we use the TanksAndTemple Playground scene as an example; please edit the paths and experiment names accordingly. You can also run all of the following steps together with bash scripts/tanks/playground.sh; the output images and videos are written under results/.

Train

DATA_ROOT=${PATH_TO}/TanksAndTempleBG/Playground
SEM_CONF=${PATH_TO}/mmsegmentation/ckpts/segformer_mit-b5_8xb1-160k_cityscapes-1024x1024.py
SEM_CKPT=${PATH_TO}/mmsegmentation/ckpts/segformer_mit-b5_8x1_1024x1024_160k_cityscapes_20211206_072934-87a052ec.pth

python train.py --config configs/Playground.txt --exp_name playground \
    --root_dir $DATA_ROOT --sem_conf_path $SEM_CONF --sem_ckpt_path $SEM_CKPT

Novel View Synthesis

python render.py --config configs/Playground.txt --exp_name playground \
    --root_dir $DATA_ROOT \
    --weight_path $CKPT \
    --render_depth --render_depth_raw --render_normal --render_semantic

Stylize

python stylize.py --config configs/Playground.txt \
    --weight_path $CKPT --num_epochs 10

🌫️ Smog Simulation

python render.py --config configs/Playground.txt --exp_name playground-smog \
    --root_dir $DATA_ROOT \
    --weight_path $CKPT \
    --simulate smog --chunk_size -1 

🌊 Flood Simulation

python render.py --config configs/Playground.txt --exp_name playground-flood \
    --root_dir $DATA_ROOT \
    --weight_path $CKPT \
    --simulate water \
    --plane_path $DATA_ROOT/plane.npy \
    --anti_aliasing_factor 2 --chunk_size 600000

❄️ Snow Simulation

First make snow by:

python make_snow.py --config configs/Playground.txt --exp_name playground-snow \
    --weight_path $STYLIZED_CKPT \
    --weight_path_origin_scene $CKPT \
    --mb_size 5.e-3 --num_epochs 20
  • Enable the --shadow_hint flag if there are strong shadows in the scene, e.g. the KITTI-360 dataset.
  • Set $STYLIZED_CKPT to the checkpoint generated by stylize.py, or to the same checkpoint as $CKPT if no stylization is needed.

Set $CKPT_SNOW to the checkpoint generated by make_snow.py and render:

python render.py --config configs/Playground.txt \
    --weight_path $CKPT_SNOW \
    --simulate snow --exp_name playground-snow --chunk_size 65535 --mb_size 5.e-3

Acknowledgement

The code was built on ngp_pl. Thanks to kwea123 for the great project!
