
Official implementation of arxiv paper "4K-NeRF: High Fidelity Neural Radiance Fields at Ultra High Resolutions"


4K-NeRF: High Fidelity Neural Radiance Fields at Ultra High Resolutions

Zhongshu Wang, Lingzhi Li, Zhen Shen, Li Shen, Liefeng Bo

Alibaba Group

4K-Results

Due to webpage video-size limits, the videos below are down-sampled. For the original 4K comparisons, please see the 4K_results folder.

fern_compare_1k.mp4
horn_compare_1k.mp4

Setup

Dependencies

pip install -r requirements.txt

PyTorch and torch_scatter builds are machine-dependent; please install the versions that match your CUDA toolkit and Python environment.
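As a sketch, an install on a CUDA 11.3 machine might look like the following. The version numbers are assumptions for illustration; adjust them to your own CUDA toolkit.

```shell
# Example install for a CUDA 11.3 machine -- the version numbers are
# assumptions; pick the ones matching your CUDA toolkit and Python version.
pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 \
    --extra-index-url https://download.pytorch.org/whl/cu113
# torch_scatter wheels are indexed per torch+CUDA combination.
pip install torch-scatter -f https://data.pyg.org/whl/torch-1.12.1+cu113.html
```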

Datasets

nerf_llff_data is the primary dataset, as it is the only one that provides 4K-resolution images.

nerf_synthetic was used in some ablation studies.

Put them in the ./datasets sub-folder.

Pretrained Model

We partially initialize the VC-Decoder with a pretrained model to speed up convergence. Put the downloaded pretrained weight in the ./pretrained sub-folder. Note that 4K-NeRF can also be trained without pretrained weights.

Directory structure:

4K-NeRF
│ 
│
├──pretrained
│   └──RealESRNet_x4plus.pth
│ 
│ 
└── datasets
    ├── nerf_synthetic     # Link: https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1
    │   └── [chair|drums|ficus|hotdog|lego|materials|mic|ship]
    │       ├── [train|val|test]
    │       │   └── r_*.png
    │       └── transforms_[train|val|test].json
    │
    │
    └── nerf_llff_data     # Link: https://drive.google.com/drive/folders/128yBriW1IG_3NJ5Rp7APSTZsJqdJdfc1
        └── [fern|flower|fortress|horns|leaves|orchids|room|trex]

Training

Our method can be trained from scratch for any given scene, but we recommend pre-training the VC-Encoder for faster convergence:

python run.py --config configs/llff/fern_lg_pretrain.py --render_test

After pre-training, use the following commands to train the full 4K-NeRF with different configs:

  • training at 4K resolution with L1 loss:

    python run_sr.py --config configs/llff/fern_lg_joint_l1.py --render_test --ftdv_path logs/llff/pretrain_fern_l1/fine_last.tar --ftsr_path ./pretrained/RealESRNet_x4plus.pth

  • training at 4K resolution with L1+GAN loss:

    python run_sr.py --config configs/llff/fern_lg_joint_l1+gan.py --render_test --ftdv_path logs/llff/pretrain_fern_l1/fine_last.tar --ftsr_path ./pretrained/RealESRNet_x4plus.pth

  • training at 1K resolution with L1+GAN loss:

    python run_sr.py --config configs/llff/1x_fern_lg_joint_l1+gan.py --render_test --ftdv_path logs/llff/pretrain_fern_l1/fine_last.tar --ftsr_path ./pretrained/RealESRNet_x4plus.pth
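When scripting over several scenes, the commands above can be assembled programmatically. A minimal sketch follows; the config-file naming for scenes other than fern is an assumption extrapolated from the fern examples.

```python
# Build the joint fine-tuning command for a given LLFF scene, mirroring the
# fern examples above. Config names for scenes other than fern are assumptions.
def joint_train_cmd(scene: str, loss: str = "l1+gan") -> str:
    return (
        f"python run_sr.py --config configs/llff/{scene}_lg_joint_{loss}.py "
        "--render_test "
        f"--ftdv_path logs/llff/pretrain_{scene}_l1/fine_last.tar "
        "--ftsr_path ./pretrained/RealESRNet_x4plus.pth"
    )

print(joint_train_cmd("fern"))
```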

Evaluation

Evaluate at 4K resolution:

python run_sr.py --config configs/fern_lg_joint_l1+gan.py --render_test --render_only --dv_path logs/llff/<eval_dir>/render_val/lpips_dvgo.tar --sr_path logs/llff/<eval_dir>/render_val/sresrnet_latest.pth

Replace <eval_dir> with the corresponding experiment name.
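The substitution can also be scripted; a hypothetical helper is sketched below, where the experiment name passed in is an illustrative assumption, not a directory the repo guarantees.

```python
# Fill <eval_dir> into the 4K evaluation command above; the example
# experiment name "fern_lg_joint_l1+gan" is an assumption for illustration.
def eval_cmd(eval_dir: str) -> str:
    base = f"logs/llff/{eval_dir}/render_val"
    return (
        "python run_sr.py --config configs/fern_lg_joint_l1+gan.py "
        "--render_test --render_only "
        f"--dv_path {base}/lpips_dvgo.tar "
        f"--sr_path {base}/sresrnet_latest.pth"
    )

print(eval_cmd("fern_lg_joint_l1+gan"))
```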
