
FrontierNet: Learning Visual Cues to Explore

Boyang Sun · Hanzhi Chen · Stefan Leutenegger · Cesar Cadena · Marc Pollefeys · Hermann Blum

RA-L 2025
ArXiv | IEEE | Video | Webpage

(example figure)
FrontierNet learns to detect frontiers (the boundary between known and unknown space) and to predict their information gain from visual appearance, enabling highly efficient autonomous exploration of unknown environments.
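The frontier concept above can be sketched for a classical 2D occupancy grid: a frontier cell is a known-free cell bordering unknown space. This is only a generic illustration of the definition, not FrontierNet's learned detector, which predicts frontiers and their information gain directly from the image.

```python
import numpy as np

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def frontier_cells(grid):
    """Return (row, col) pairs of free cells adjacent to at least one unknown cell."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            # 4-connected neighbours: a free cell touching unknown space is a frontier
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr, nc] == UNKNOWN:
                    frontiers.append((r, c))
                    break
    return frontiers

grid = np.array([
    [FREE, FREE,     UNKNOWN],
    [FREE, OCCUPIED, UNKNOWN],
    [FREE, FREE,     FREE],
])
print(frontier_cells(grid))  # → [(0, 1), (2, 2)]
```

Classical planners extract these cells geometrically from a map built so far; FrontierNet instead learns the cue from appearance, so it can flag frontiers before the map geometry reveals them.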

Quick Start

  • 🔧 Setup — Install dependencies and prepare the environment.
  • 🚀 Run the Demo — Try FrontierNet on example data (single-image demo for now).
  • 🛠️ Pipeline Configurations — Customize your pipeline.

Setup

First, clone the repository, install the dependencies, and download the model weights:

git clone --recursive https://github.com/cvg/FrontierNet && cd FrontierNet
conda create -n frontiernet python=3.11 -y && conda activate frontiernet
pip install -r requirements.txt
bash download_weights.sh

Alternatively, download the checkpoint manually.

Optional: build and install UniK3D to use it as a depth prior (its dependencies should already be installed):
cd third_party/UniK3D/ && pip install -e .

Execution

Single Image Inference

Image from HM3D:

python demo_single_image.py --input_img examples/hm3d_1.jpg --out_dir output/ --config config/hm3d.yaml

Image from ScanNet++:

python demo_single_image.py --input_img examples/scannetpp_1.jpg --out_dir output/ --config config/scannetpp.yaml

Random Image (unknown camera):

python demo_single_image.py --input_img examples/internet_1.jpg --out_dir output/ --config config/any.yaml

By default, the pipeline uses Metric3Dv2 for depth estimation. You can switch to UniK3D by appending:

... --depth_source UniK3D

Visualization

Visualize the output using:

python demo_plot.py --result_path output/<file_name>.npz

This first plots the 2D result:

(figure: 2D result)

Then press any key to see the 3D frontiers in the RGB-D point cloud:

(figure: 3D frontiers)
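If you prefer to inspect a saved result programmatically rather than through demo_plot.py, the .npz archive can be opened with NumPy. The keys stored inside depend on the pipeline, so this sketch simply enumerates whatever is there; the file name `output/hm3d_1.npz` is hypothetical.

```python
import numpy as np

def summarize_npz(path):
    """Map each array key in an .npz archive to its (shape, dtype)."""
    with np.load(path, allow_pickle=True) as data:
        return {k: (data[k].shape, data[k].dtype) for k in data.files}

if __name__ == "__main__":
    # Hypothetical output file produced by demo_single_image.py
    for key, (shape, dtype) in summarize_npz("output/hm3d_1.npz").items():
        print(f"{key}: shape={shape}, dtype={dtype}")
```

This is handy for checking what a given pipeline configuration actually saved before wiring the output into your own tooling.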

Full-Scene Exploration

Instructions and a demo for full-scene exploration will be released soon.

Pipeline Configurations

Configuration options for whole-scene exploration will be released soon.

✅ TODO

  • Add planning pipeline by August.
  • Add support for UniK3D
  • Add support for Metric3D

⚠️ Known Limitations

  • Performance may degrade in outdoor scenes or highly cluttered indoor environments.
  • Predictions are less reliable when objects are very close to the camera.

📖 Citation

If you use any ideas from the paper or code from this repo, please consider citing:

@article{boysun2025frontiernet,
  author={Sun, Boyang and Chen, Hanzhi and Leutenegger, Stefan
          and Cadena, Cesar and Pollefeys, Marc and Blum, Hermann},
  journal={IEEE Robotics and Automation Letters},
  title={FrontierNet: Learning Visual Cues to Explore},
  year={2025},
  volume={10},
  number={7},
  pages={6576-6583},
  doi={10.1109/LRA.2025.3569122}
}

📬 Contact

For questions, feedback, or collaboration, feel free to reach out to Boyang Sun:
📧 boysun@ethz.ch 🌐 boysun045.github.io
