ParticleNeRF

ParticleNeRF: Particle Based Encoding for Online Neural Radiance Fields
Jad Abou-Chakra, Feras Dayoub, Niko Sünderhauf
Project page / Paper / Dataset 

ParticleNeRF is a fork of instant-ngp that uses a particle-based encoding to enable quick adaptation to dynamic objects. By associating features with particles in space, we can backpropagate the photometric reconstruction loss into the particle positions; the resulting position gradients can be interpreted as velocity vectors. A lightweight physics system handles collisions and governs the movement of the particles, allowing the features to move freely with the changing scene geometry. We demonstrate ParticleNeRF on dynamic scenes containing translating, rotating, articulated, and deformable objects.
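As a rough illustration of this idea, the sketch below (a toy PyTorch example, not the fork's CUDA implementation) scales the position gradient left by backpropagation into a velocity and applies it as a position update; the naive pairwise separation at the end only stands in for the paper's collision handling, which is more involved.

```python
import torch

# Toy particle state: positions receive gradients from the rendering loss.
positions = torch.rand(500, 3, requires_grad=True)

# Placeholder loss purely for illustration; in ParticleNeRF this would be the
# photometric reconstruction loss backpropagated through the NeRF network.
loss = positions.square().sum()
loss.backward()

dt, lr = 0.05, 0.1
with torch.no_grad():
    # Interpret the negative position gradient as a per-particle velocity.
    velocity = -lr * positions.grad
    positions += dt * velocity

    # Naive collision handling (assumption): push apart particle pairs that are
    # closer than a minimum separation.
    min_sep = 0.02
    diff = positions[:, None, :] - positions[None, :, :]            # (N, N, 3)
    dist = diff.norm(dim=-1) + torch.eye(len(positions)) * 1e9      # ignore self-pairs
    too_close = dist < min_sep
    push = (diff / dist[..., None]) * (min_sep - dist)[..., None] * too_close[..., None]
    positions += 0.5 * push.sum(dim=1)

positions.grad.zero_()
```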

Method

A query point is sampled in space, and the features and positions of the particles within a search radius are retrieved. The retrieved features are interpolated, weighted by their distance from the query point, to produce the feature at the query point. This feature is passed through the neural network to yield color and density. To train the encoding, the loss gradients are backpropagated through the network, into the query feature, and finally into the positions and features of the particles.
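A minimal PyTorch sketch of this query path is shown below, assuming inverse-distance weighting within the search radius; the weighting kernel is an assumption, and the actual fork implements this step in CUDA.

```python
import torch

def query_feature(query, positions, features, radius):
    # Distances from the query point to every particle.
    d = torch.linalg.norm(positions - query, dim=-1)       # (N,)
    mask = d < radius                                       # particles inside the search radius
    if not mask.any():
        return torch.zeros(features.shape[-1])
    # Inverse-distance weights (assumed kernel; the paper's weighting may differ).
    w = 1.0 / (d[mask] + 1e-8)
    w = w / w.sum()
    # Interpolated feature at the query point.
    return (w[:, None] * features[mask]).sum(dim=0)

# Particle state: both positions and features are learnable.
positions = torch.rand(1000, 3, requires_grad=True)
features = torch.randn(1000, 8, requires_grad=True)

feat = query_feature(torch.full((3,), 0.5), positions, features, radius=0.2)
loss = feat.square().sum()   # stand-in for the photometric loss after the MLP
loss.backward()              # gradients flow into both the features and the positions
```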

Instructions

After building the project as in instant-ngp, run it on a scene with the particle encoding config:

instant-ngp$ ./instant-ngp data/nerf/fox --config configs/nerf/particle.json

Examples

License and Citation

@article{abou2022particlenerf,
  title={ParticleNeRF: Particle Based Encoding for Online Neural Radiance Fields},
  author={Abou-Chakra, Jad and Dayoub, Feras and S{\"u}nderhauf, Niko},
  journal={arXiv preprint arXiv:2211.04041},
  year={2022}
}

This work is made available under the Nvidia Source Code License-NC. A copy of the license is included in the repository.