ROS_NCNN

This is a ROS package for NCNN, a high-performance neural network inference framework by Tencent, optimized for mobile platforms:

  • ARM NEON assembly level optimization
  • Sophisticated memory management and data structure design, very low memory footprint
  • Supports multi-core parallel computing acceleration
  • Supports GPU acceleration via the next-generation low-overhead Vulkan API
  • The overall library size is less than 700K, and can be easily reduced to less than 300K
  • Extensible model design, supports 8-bit quantization and half-precision floating point storage
  • Can import Caffe/PyTorch/MXNet/ONNX models

Setting up

Library

Build and install the ncnn library on your system first (with Vulkan support if you want GPU acceleration); see the ncnn project for build instructions.

ROS package

  • Clone this repository into your catkin workspace.
  • Initialize and update the ncnn-assets submodule (a collection of popular models).
  • Compile the workspace.
  • The CMake script autodetects whether the ncnn library was built with Vulkan; all nodes will use the GPU if Vulkan support is enabled.

General launch parameters

<node name="yolact_node" pkg="ros_ncnn" type="yolact_node" output="screen">
  <param name="display_output" value="$(arg display_output)"/>
  <remap from="/camera/image_raw" to="$(arg camera_topic)"/>
  <!-- Selects the GPU device; in any other case the node falls back to the first discrete GPU. -->
  <param name="gpu_device" value="0"/>
  <!-- Number of CPU threads to use; all available cores are used if not provided. -->
  <param name="num_threads" value="8"/>
  <!-- Toggles engine.neuralnet.opt.use_vulkan_compute, regardless of whether the library was built with GPU support. -->
  <param name="enable_gpu" value="true"/>
</node>
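
For reference, below is a minimal sketch of how a node might read these parameters with roscpp and apply them to an ncnn::Net. The node name, variable names and include path are illustrative, not the package's actual engine code.

#include <ros/ros.h>
#include "net.h"  // ncnn; adjust the include path to your install

int main(int argc, char** argv)
{
  ros::init(argc, argv, "ncnn_example_node");
  ros::NodeHandle nh_private("~");

  int gpu_device = 0, num_threads = 0;
  bool enable_gpu = false;
  nh_private.param("gpu_device", gpu_device, 0);    // passed to ncnn's Vulkan device selection in GPU builds
  nh_private.param("num_threads", num_threads, 0);  // 0: let ncnn pick the thread count
  nh_private.param("enable_gpu", enable_gpu, false);

  ncnn::Net net;
  net.opt.use_vulkan_compute = enable_gpu;          // only effective when ncnn was built with Vulkan
  if (num_threads > 0)
    net.opt.num_threads = num_threads;

  // ... load the model, subscribe to the camera topic, run inference ...
  ros::spin();
  return 0;
}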

YOLACT

Publisher

# Object message
Header header
Rectangle boundingbox # Vector2D position and size
string label
float32 probability

Params

  • probability_threshold (default 0.5) - objects above this probability are published
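
The YOLO nodes below publish the same Object message. A minimal roscpp consumer might look like the sketch below; the topic name "objects" and the Rectangle/Vector2D field names (position, size, x, y) are assumptions for illustration, not confirmed package API.

#include <ros/ros.h>
#include <ros_ncnn/Object.h>

void objectCallback(const ros_ncnn::Object::ConstPtr& msg)
{
  ROS_INFO("%s (%.2f) at %.0f,%.0f size %.0fx%.0f",
           msg->label.c_str(), msg->probability,
           msg->boundingbox.position.x, msg->boundingbox.position.y,
           msg->boundingbox.size.x, msg->boundingbox.size.y);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "object_listener");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("objects", 10, objectCallback);
  ros::spin();
  return 0;
}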

YOLO v2 / v3

The assets repository contains multiple YOLO networks; choose the parameter and model files before launch (the default is YOLOv3 on MobileNetV2).

Publisher

# Object message
Header header
Rectangle boundingbox # Vector2D position and size
string label
float32 probability

Params

  • model_file - YOLO network model file
  • param_file - YOLO network parameter file
  • probability_threshold (default 0.5) - objects above this probability are published
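
Internally the node has to load these two files into an ncnn network (the same applies to the YOLO v5 node below). A minimal sketch, assuming a plain ncnn::Net and illustrative default file names:

#include <ros/ros.h>
#include <string>
#include "net.h"  // ncnn

bool loadNetwork(ncnn::Net& net, const ros::NodeHandle& nh_private)
{
  std::string param_file, model_file;
  nh_private.param("param_file", param_file, std::string("yolov3.param"));  // illustrative defaults
  nh_private.param("model_file", model_file, std::string("yolov3.bin"));

  // ncnn loads the network structure first, then the weights.
  if (net.load_param(param_file.c_str()) != 0) return false;
  if (net.load_model(model_file.c_str()) != 0) return false;
  return true;
}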

YOLO v5

Publisher

# Object message
Header header
Rectangle boundingbox # Vector2D position and size
string label
float32 probability

Params

  • model_file - YOLO network model file
  • param_file - YOLO network parameter file
  • probability_threshold (default 0.5) - objects above this probability are published

RetinaFace

Publisher

# FaceObject message
Header header
Rectangle boundingbox # Vector2D position and size
Vector2D[5] landmark # 5x 2x float32
float32 probability

Params

  • probability_threshold (default 0.5) - face objects above this probability are published
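
A consumer of the face messages can iterate the fixed-size landmark array directly; as before, the topic name "faces" is an assumption for illustration.

#include <ros/ros.h>
#include <ros_ncnn/FaceObject.h>

void faceCallback(const ros_ncnn::FaceObject::ConstPtr& msg)
{
  ROS_INFO("face with probability %.2f", msg->probability);
  for (size_t i = 0; i < msg->landmark.size(); ++i)  // five facial landmark points
    ROS_INFO("  landmark %zu: %.1f, %.1f", i, msg->landmark[i].x, msg->landmark[i].y);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "face_listener");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("faces", 10, faceCallback);
  ros::spin();
  return 0;
}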

HopeNet

Using RetinaFace as the face detector:

Publisher

# Euler angles
float32 roll
float32 pitch
float32 yaw
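
If the head pose is needed as a quaternion (e.g. for TF or RViz), the three angles can be converted with tf2. A minimal sketch, assuming the angles are in radians; check the node's output units before reusing it.

#include <tf2/LinearMath/Quaternion.h>
#include <cstdio>

int main()
{
  float roll = 0.05f, pitch = -0.10f, yaw = 0.30f;  // example values taken from the message fields
  // If the node publishes degrees, convert to radians first.
  tf2::Quaternion q;
  q.setRPY(roll, pitch, yaw);
  std::printf("quaternion: %.3f %.3f %.3f %.3f\n", q.x(), q.y(), q.z(), q.w());
  return 0;
}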

PoseNet

Faster R-CNN

Don't forget to uncompress ZF_faster_rcnn_final.bin.zip in the assets directory first. (But again, R-CNN is the past, and that's neither a cat nor a bird right there... that's my best friend.)

🚧 To do

  • General model loader node (with layer-to-topic mapping through an NDS file)
  • Dynamic reconfiguration for some params (e.g. probability thresholds)

✌️ Acknowledgements

Special thanks to Nihui for her wonderful work.
