A high-performance, GPU-accelerated multi-camera capture, streaming and recording application for Emergent Vision GigE cameras.
Orange is built for high-throughput, time-synchronized multi-camera recording. Encoding is GPU-accelerated and scales with the number of GPUs in the host. PTP keeps cameras aligned to sub-frame precision, and a multi-host architecture (one GUI host coordinating any number of headless cam_server nodes over ENet) lets a recording rig scale beyond what a single machine can drive, in both camera count and aggregate pixel rate. Optional TensorRT-based YOLO detection runs on the live streams when a model is provided.
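PTP time sync of this kind is commonly driven by the linuxptp tools on the capture NIC. The snippet below is a generic illustration, not Orange's own setup; the interface name enp1s0f0 is a placeholder for your camera network interface.

```shell
# Illustrative only: run a PTP clock on the capture NIC with linuxptp.
# Replace enp1s0f0 with your camera network interface; ptp4l needs root.
if command -v ptp4l >/dev/null 2>&1; then
    # -i selects the interface, -m prints messages to stdout;
    # timeout keeps this demo from running forever in the foreground.
    timeout 5 ptp4l -i enp1s0f0 -m || echo "ptp4l exited (root privileges needed?)" >&2
else
    echo "linuxptp not installed (e.g. apt install linuxptp)" >&2
fi
```

See the PTP section of the docs for how Orange itself expects clocks to be configured.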
Full documentation — installation, system requirements, configuration, network mode, real-time detection, PTP — lives at the moments-behavior docs site.
Linux-only. Requires an NVIDIA GPU with NVENC. See the docs for full system requirements and the dependency install walkthrough.
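Before walking through the dependency install, it can help to confirm the driver stack is in place. The check below is a generic sketch, not part of Orange: it looks for the NVIDIA driver and for the NVENC runtime library.

```shell
# Generic sanity check for the NVENC prerequisite (not part of Orange itself).
if command -v nvidia-smi >/dev/null 2>&1; then
    # Driver present: list GPU model and driver version.
    nvidia-smi --query-gpu=name,driver_version --format=csv,noheader
else
    echo "nvidia-smi not found: install the NVIDIA driver first" >&2
fi
# NVENC is exposed through libnvidia-encode; look for it in the linker cache.
ldconfig -p | grep -i nvidia-encode || echo "libnvidia-encode not found" >&2
```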
git clone --recursive https://github.com/moments-behavior/orange.git
cd orange
./build.sh # builds release/orange, release/cam_server, release/yolo_offline
./run.sh # sudo release/orange

Orange is developed by Jinyao Yan, with contributions from Diptodip Deb, Wilson Chen, Ratan Othayoth, Jeremy Delahanty, and Rob Johnson.
Contact Jinyao Yan with questions about the software.
If you use Orange, please cite the software:
@software{moments_behavior_orange_2026,
  author    = {Yan, Jinyao and
               Deb, Diptodip and
               Chen, Wilson and
               Othayoth, Ratan and
               Delahanty, Jeremy and
               Johnson, Rob},
  title     = {moments-behavior/orange: v2.1.0},
  month     = apr,
  year      = 2026,
  publisher = {Zenodo},
  version   = {v2.1.0},
  doi       = {10.5281/zenodo.19688150},
  url       = {https://doi.org/10.5281/zenodo.19688150},
}

Please open an issue for bug fixes or feature requests. If you wish to make changes to the source code, fork the repo and open a pull request.
