yushiangw/factorednerf

Factored Neural Representation for Scene Understanding

[Project Website] [arXiv] [Dataset (6GB)]



Installation:

  1. cd into the unzipped directory

  2. Build our Docker image:

     docker build -t factnerf -f Dockerfile .

  3. Download our dataset and place it at $FACTNERF_ROOT/data:

    $FACTNERF_ROOT/data/SYN
    $FACTNERF_ROOT/data/SYN/sce_a_train
    ...
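After step 3, the layout can be sanity-checked with a small script. This is a hypothetical convenience helper, not part of the repo; the folder names follow the listing above:

```python
from pathlib import Path

def check_dataset(root: str, scenes=("sce_a_train",)) -> list:
    """Return the expected SYN dataset paths that are missing under root/data."""
    data = Path(root) / "data" / "SYN"
    expected = [data] + [data / s for s in scenes]
    return [str(p) for p in expected if not p.exists()]

# Example: report anything missing under the current checkout.
# print("missing:", check_dataset("."))
```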

Run in a Docker container:

export FACTNERF_ROOT=$(pwd)

# check if input data exists
ls $FACTNERF_ROOT/data

# set GPU
export CUDA_VISIBLE_DEVICES=0
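With the image built and the environment set, an interactive container launch might look like the sketch below. The mount point `/workspace`, the `--gpus all` flag, and passing `CUDA_VISIBLE_DEVICES` through are assumptions; adjust them to match the Dockerfile's actual WORKDIR and your GPU setup:

```shell
# Sketch only: compose the launch command, then run it once the image exists.
FACTNERF_ROOT=${FACTNERF_ROOT:-$(pwd)}
launch_cmd="docker run --rm -it --gpus all \
  -e CUDA_VISIBLE_DEVICES \
  -v ${FACTNERF_ROOT}:/workspace -w /workspace \
  factnerf bash"
echo "${launch_cmd}"
```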

Training

cd $FACTNERF_ROOT 
python framework/run_main.py -f configs/SYN/factorednerf/sce_a.yaml --mode train 

Rendering

# faster rendering at a lower resolution
python framework/run_main.py -f configs/SYN/factorednerf/sce_a.yaml --mode render_valid_q  -c map__final --dw 4 --fnum 4 

# rendering (no downsampling)
python framework/run_main.py -f configs/SYN/factorednerf/sce_a.yaml --mode render_valid_q  -c map__final --dw 1 
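The `--dw` flag appears to set an integer downsampling factor for the render resolution (so `--dw 4` renders at a quarter of the full width and height, while `--dw 1` keeps the original size). This interpretation is an assumption based on the flag name and the comments above; a minimal sketch of the arithmetic:

```python
def downsampled_resolution(width: int, height: int, dw: int) -> tuple:
    """Integer-divide a full-frame resolution by downsampling factor dw."""
    if dw < 1:
        raise ValueError("dw must be >= 1")
    return width // dw, height // dw

# e.g. a 640x480 frame rendered with --dw 4 becomes 160x120
```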

Checkpoints

Please download the checkpoint file output-syn.zip and unzip it to $FACTNERF_ROOT.

Acknowledgement and Licenses

Some code is adapted from these awesome repositories: NiceSlam and Neural Scene Graphs. We appreciate their efforts in open-sourcing their implementations. We also thank the authors of DeformingThings4D for allowing us to upload our synthetic dataset. Please be aware of all corresponding licenses.

Citation

@misc{wong2023factored,
      title={Factored Neural Representation for Scene Understanding}, 
      author={Yu-Shiang Wong and Niloy J. Mitra},
      year={2023},
      eprint={2304.10950},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}