# yolor-edge

An implementation of YOLOR running successfully on the NVIDIA Jetson Xavier NX edge computing device.

`yolor-edge` was part of an engineering Honours thesis. The aim was to utilise state-of-the-art object detection on off-the-shelf hardware to assist in urban search and rescue.
At the time (2021), YOLOR was one of the state-of-the-art real-time object detection models, achieving 55.4 mAP, 73.3 AP50 and 60.6 AP75 on the COCO dataset. See the COCO test-dev benchmark (object detection) on Papers with Code for more details.
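As a quick aside on what those figures mean: AP50 counts a detection as correct when its intersection-over-union (IoU) with a ground-truth box is at least 0.5, while AP75 uses the stricter 0.75 threshold, and mAP averages over thresholds from 0.5 to 0.95. A minimal IoU computation (an illustrative sketch, not code from this repository) looks like:

```python
def iou(box_a, box_b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A detection shifted 20px against a 100x100 ground-truth box:
# IoU = 8000 / 12000 = 2/3, so it counts at the 0.5 threshold (AP50)
# but not at the 0.75 threshold (AP75).
print(iou((0, 0, 100, 100), (20, 0, 120, 100)))
```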
Here, YOLOR is successfully implemented to run on an edge device, achieving real-time object detection. The following steps are carried out on the NVIDIA Jetson Xavier NX:
```shell
# Clone the repo
git clone https://github.com/ewth/yolor-edge.git

# Change into the repo directory
cd yolor-edge

# Build the Docker image
cd docker
./build.sh

# Run the Docker container
./run.sh
```
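For orientation, the image built by `build.sh` on a Jetson is broadly a PyTorch-enabled L4T container. The following is a hedged sketch of what such a Dockerfile commonly looks like — the base-image tag and dependency handling are assumptions, not the repository's actual Dockerfile:

```dockerfile
# Illustrative sketch only -- not the repository's actual Dockerfile.
# NVIDIA's l4t-pytorch images ship PyTorch built for Jetson (aarch64 + CUDA).
FROM nvcr.io/nvidia/l4t-pytorch:r32.6.1-pth1.9-py3

WORKDIR /yolor-edge
COPY . .

# Install the project's Python dependencies (assumed requirements file).
RUN pip3 install -r requirements.txt

CMD ["python3", "yoloredge.py"]
```

Building on an L4T base image is what lets the container use the Xavier NX's GPU through the NVIDIA container runtime.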
Any time you want to run the Docker container, execute `run.sh` from the `docker` directory.
From within the Docker container:

```shell
# Run yolor-edge
python3 yoloredge.py
```
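The "real-time" claim above comes down to per-frame throughput. While `yoloredge.py` itself belongs to the repository, the usual way to measure frames per second around an inference loop can be sketched as follows (illustrative only; `process_frame` is a hypothetical stand-in for the detector):

```python
import time

def measure_fps(process_frame, frames):
    """Return average frames per second for a per-frame processing function."""
    start = time.perf_counter()
    for frame in frames:
        process_frame(frame)  # stand-in for model inference on one frame
    elapsed = time.perf_counter() - start
    return len(frames) / elapsed if elapsed > 0 else float("inf")

# Simulate inference with a tiny fixed delay per "frame".
fps = measure_fps(lambda frame: time.sleep(0.001), range(50))
print(f"{fps:.1f} FPS")
```

Timing over a batch of frames with `time.perf_counter()` smooths out per-frame jitter, which matters on an edge device where thermal throttling can vary inference time.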
If the Bash scripts will not run, try adding the `+x` permission:

```shell
chmod +x ./run.sh
```

Alternatively, use `bash` to execute them directly:

```shell
bash run.sh
```
You Only Learn One Representation (YOLOR) is a novel object detection algorithm published in May 2021, producing world-leading performance results at the time. YOLOR was released with an official implementation built on PyTorch, developed and tested on a PC. The code here was originally forked from that implementation: YOLOR in PyTorch.