
nerfview


nerfview is a minimal* web viewer for interactive NeRF rendering. It is largely inspired by nerfstudio's viewer, but comes as a standalone package with a simple API that is easy to integrate into your own research projects.

*The whole package contains two files and is less than 400 lines of code.

Installation

This package requires Python 3.8+.

For an existing project, you can install it via pip:

pip install nerfview

To run our examples, clone this repository and install it locally:

git clone https://github.com/hangg7/nerfview
cd nerfview
# Install torch first.
pip install torch
# Then install this repo and the dependencies for the examples. Note that
# `gsplat` requires compilation, which will take some time on first install.
pip install -e .
pip install -r examples/requirements.txt

Usage

nerfview is built on viser and provides a simple API for interactive viewing.

The canonical usage is as follows:

from typing import Tuple

import numpy as np
import viser

import nerfview


def render_fn(
    camera_state: nerfview.CameraState, img_wh: Tuple[int, int]
) -> np.ndarray:
    # Parse camera state for camera-to-world matrix (c2w) and intrinsic (K) as
    # float64 numpy arrays.
    c2w = camera_state.c2w
    K = camera_state.get_K(img_wh)
    # Do your things and get an image as a uint8 numpy array.
    img = your_rendering_logic(...)
    return img

# Initialize a viser server and our viewer.
server = viser.ViserServer(verbose=False)
viewer = nerfview.Viewer(server=server, render_fn=render_fn, mode='rendering')

This will start a viser server and render images from a camera that you can interact with in your browser.
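To make the render_fn contract concrete, here is a self-contained sketch of a toy renderer. It takes the camera-to-world matrix and intrinsics directly (rather than a nerfview.CameraState) so it runs without nerfview installed; the gradient logic is purely illustrative and not part of nerfview.

```python
from typing import Tuple

import numpy as np


def dummy_render_fn(
    c2w: np.ndarray, K: np.ndarray, img_wh: Tuple[int, int]
) -> np.ndarray:
    """Illustrative stand-in for real rendering logic: returns an (H, W, 3)
    uint8 image whose colors shift with the camera position. K is unused in
    this toy but kept to mirror the render_fn signature."""
    W, H = img_wh
    # Normalized pixel grid in [0, 1]; u, v have shape (H, W).
    u, v = np.meshgrid(np.linspace(0, 1, W), np.linspace(0, 1, H))
    # Fold the camera translation into the color so moving the camera
    # visibly changes the image.
    t = c2w[:3, 3]
    r = (u + t[0]) % 1.0
    g = (v + t[1]) % 1.0
    b = np.full_like(u, t[2] % 1.0)
    img = np.stack([r, g, b], axis=-1)
    return (img * 255).astype(np.uint8)
```

In a real render_fn you would instead call your model with c2w and K and return its rendered frame.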

Examples

We provide a few examples ranging from toy rendering to real-world NeRF training applications. Click on each dropdown to see more details. You can always print a help message with the -h flag.

Rendering a dummy scene.

This example is the best starting point to understand the basic API.

python examples/00_dummy_rendering.py
Rendering a dummy training process.

This example is the best starting point to understand the API for training time update.

python examples/01_dummy_training.py
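The training examples all follow the same pattern: between optimization steps, hand control to the viewer so it can re-render with the latest parameters. The sketch below shows that pattern with a hypothetical viewer_update stub standing in for nerfview's real training-time API; see examples/01_dummy_training.py for the actual calls.

```python
import time


def viewer_update(step: int) -> None:
    """Hypothetical stand-in for nerfview's training-time viewer update.
    In the real examples, this is where the viewer re-renders using the
    freshly updated model parameters."""
    print(f"viewer refreshed at step {step}")


def train(num_steps: int = 3) -> int:
    for step in range(num_steps):
        # ... one optimization step on your model ...
        time.sleep(0.01)  # stand-in for real work
        # Let the viewer render with the updated parameters.
        viewer_update(step)
    return num_steps
```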
Rendering a mesh scene.

This example showcases how to interactively render a mesh by directly serving rendering results from nvdiffrast.

# Only needed the first time.
bash examples/assets/download_dragon_mesh.sh
CUDA_VISIBLE_DEVICES=0 python examples/02_mesh_rendering.py
Rendering a pretrained 3DGS scene.

This example showcases how to render a pretrained 3DGS model using gsplat. The scene is cropped to make the download smaller. It is essentially the simple viewer example, included here to be self-contained.

# Only needed the first time.
bash examples/assets/download_gsplat_ckpt.sh
CUDA_VISIBLE_DEVICES=0 python examples/03_gsplat_rendering.py \
    --ckpt results/garden/ckpts/ckpt_6999_crop.pt
Rendering a 3DGS training process.

This example showcases how to render while training 3DGS on the mip-NeRF 360 garden scene using gsplat. It is essentially the simple trainer example, included here to be self-contained.

# Only needed the first time.
bash examples/assets/download_colmap_garden.sh
CUDA_VISIBLE_DEVICES=0 python examples/04_gsplat_training.py \
    --data_dir examples/assets/colmap_garden/ \
    --data_factor 8 \
    --result_dir results/garden/

Acknowledgement

This project would not exist without the great work of nerfstudio and viser. We rely on nvdiffrast for the mesh example and gsplat for the 3DGS examples. We thank the authors for their great work and open-source spirit.