The Complete Repository for Autonomous Robot Navigation and Control using Neural Radiance Fields (NeRFs). With quick and easy Docker setup.
Use torch_ngp_container to transform captured photos or a video of an environment into an accurately scaled digital model.
Then, use nerf_ws to set up collision-free path planning and control with ROS2 for real-time autonomous navigation of your robot in the environment.
This section covers how to install and set up nerf_autonomy. To create your own NeRF environment and configure the ROS2 nodes on it, please refer to the help folder
- System with an NVIDIA GPU (CUDA Capable System)
- At Least 80 GB of free space (For Docker Containers and Data)
- Install and setup Docker
- Install the NVIDIA Container Toolkit for containers to have access to GPU
- Install ROS2 Humble
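As a quick sanity check for the GPU prerequisites above, you can confirm that containers can actually see the GPU before building anything. This is a standard verification pattern from the NVIDIA Container Toolkit docs (the base image used here is an assumption; any image works since the toolkit injects the driver utilities):

```shell
# Should print your GPU table from inside a container.
# If this fails, revisit the NVIDIA Container Toolkit setup.
docker run --rm --gpus all ubuntu nvidia-smi
```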
Begin by cloning this repository
$ git clone https://github.com/mustafahamoody/nerf_autonomy
$ cd nerf_autonomy
Build and start the Docker containers using Docker Compose. Make sure the containers run detached (-d flag)
This process may take a while depending on your system (15 - 30+ mins)
nerf_autonomy$ cd docker
nerf_autonomy/docker$ docker compose up -d # This will build and start both containers
Note for Linux systems: If you receive a permission-denied error, you may need to add your user to the docker group so it can access the Docker daemon. -- For more information see this
$ sudo groupadd docker
$ sudo usermod -aG docker $USER
# Restart your system for changes to take effect. Then run docker compose up -d again
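After restarting, you can verify the group change took effect with Docker's standard smoke test:

```shell
# If the group change worked, this runs without sudo and without a
# permission-denied error on /var/run/docker.sock
docker run --rm hello-world
```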
Enter the running container:
# To enter torch_ngp_container: For environment creation
nerf_autonomy/docker$ docker compose exec torch_ngp_container bash
# To enter nerf_ws: For starting ROS2 navigation nodes
nerf_autonomy/docker$ docker compose exec nerf_ws bash
You are now ready to create your own NeRF environments and use them for autonomous navigation!
To visualize the created NeRF environment using the torch-ngp GUI, you must connect your host machine's display to the Docker container using an X11 server
For Linux Systems:
# 1. Install X11 Server
sudo apt-get install xorg openbox
# 2. Allow Docker to access your X server
xhost +local:docker
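If the container was not started with display access, one way to wire it up is a Compose override. The sketch below assumes the service name torch_ngp_container and the usual X11 socket path; check the repo's compose file for the actual service names before using it:

```yaml
# docker-compose.override.yml (hypothetical) -- forwards the host X display
services:
  torch_ngp_container:
    environment:
      - DISPLAY=${DISPLAY}                 # pass through the host display, e.g. :0
    volumes:
      - /tmp/.X11-unix:/tmp/.X11-unix:rw   # X11 socket shared with the host
```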
For Windows Systems -- UNTESTED:
- Download and install VcXsrv
- Run XLaunch and configure:
- Select "Multiple windows" mode.
- Select "Start no client".
- Enable "Disable access control" (optional, but helps with debugging).
- Click "Finish" to start the server.
To view the voxel grid, occupancy grid, and path plan generated by nerf_ws (ROS2 nodes) for your NeRF environment, set up a Foxglove account and complete the steps below
- Install ROS Foxglove bridge
sudo apt install ros-humble-foxglove-bridge
- On the Foxglove website, open a connection using WebSocket. Please ensure your WebSocket URL matches: ws://localhost:8765
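If the repo's launch files do not already start the bridge, it needs to be launched inside the ROS2 environment before Foxglove can connect. The package installed above ships a standard launch file for this:

```shell
# Start the Foxglove bridge (it listens on port 8765 by default)
ros2 launch foxglove_bridge foxglove_bridge_launch.xml port:=8765
```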
Run the nerf_autonomy demo and ensure everything is working
- Download the demo dataset from here
- Give your user access to the data folder
nerf_autonomy$ sudo chown -R $UID:$GID data
- Move it into nerf_autonomy/data
- Open env_create.py (in the torch_ngp_container folder) and make sure that under DATA SETUP, content_path = "data/demo" and input_type = "image"
- Start the containers and enter torch_ngp_container
- To create the demo NeRF environment, run python env_create.py and name the environment "demo"
This may take 1+ hours. Exiting the container will not stop this process. The container will stay running until you stop it or turn off your machine.
- To view the NeRF environment once env_create finishes, run python env_create.py --view and enter your environment name (demo)
- In the docker compose file, under nerf_ws environment, make sure MODEL_WEIGHTS_PATH=/nerf_ws/data/demo_nerf and MODEL_DATA_PATH=/nerf_ws/data/demo
- Start the container
- Open the WebSocket connection in Foxglove in your browser.
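The first step above edits the compose environment; for reference, the relevant entry would look roughly like the sketch below (the exact service layout is an assumption; check the repo's docker compose file):

```yaml
services:
  nerf_ws:
    environment:
      - MODEL_WEIGHTS_PATH=/nerf_ws/data/demo_nerf   # trained NeRF weights
      - MODEL_DATA_PATH=/nerf_ws/data/demo           # source images and poses
```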
You should see a 2D occupancy grid with a path populated in the 3D viewport.
You have just created an autonomous navigation path for a real-life environment using nerf_autonomy!
If you use this work, please include the following citations
nerf_autonomy:
@misc{nerf_autonomy,
Author = {Mustafa Hamoody and Kishore Yogaraj},
Year = {2025},
Note = {https://github.com/mustafahamoody/nerf_autonomy},
Title = {nerf_autonomy: Autonomous Robot Navigation Using Neural Radiance Fields}
}
nerf-navigation:
@article{nerf-nav,
author={Adamkiewicz, Michal and Chen, Timothy and Caccavale, Adam and Gardner, Rachel and Culbertson, Preston and Bohg, Jeannette and Schwager, Mac},
journal={IEEE Robotics and Automation Letters},
title={Vision-Only Robot Navigation in a Neural Radiance World},
year={2022},
volume={7},
number={2},
pages={4606-4613},
doi={10.1109/LRA.2022.3150497}}
torch-ngp:
@misc{torch-ngp,
Author = {Jiaxiang Tang},
Year = {2022},
Note = {https://github.com/ashawkey/torch-ngp},
Title = {Torch-ngp: a PyTorch implementation of instant-ngp}
}
Instant-NGP:
@article{mueller2022instant,
title = {Instant Neural Graphics Primitives with a Multiresolution Hash Encoding},
author = {Thomas M\"uller and Alex Evans and Christoph Schied and Alexander Keller},
journal = {arXiv:2201.05989},
year = {2022},
month = jan
}
original NeRF authors:
@article{mildenhall2020nerf,
title={NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis},
author={Ben Mildenhall and Pratul P. Srinivasan and Matthew Tancik and Jonathan T. Barron and Ravi Ramamoorthi and Ren Ng},
year={2020},
eprint={2003.08934},
archivePrefix={arXiv},
primaryClass={cs.CV}
}