
Single Camera Person Tracking


Launch Tracking

To run only the person tracking module on a single camera, run the following:

roslaunch tracking single_camera_tracking_node.launch enable_people_tracking:=true enable_object:=false enable_pose:=false sensor_type:=<SENSOR_TYPE>

where you should replace <SENSOR_TYPE> with kinect2, zed, realsense, or azure depending on your sensor (see below for additional notes on the Zed and Azure). Omitting the sensor_type parameter defaults to kinect2. Please note that the Zed, RealSense, and Azure are supported only in the newest version of OpenPTrack.
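For example, to run person tracking only on a RealSense camera:

roslaunch tracking single_camera_tracking_node.launch enable_people_tracking:=true enable_object:=false enable_pose:=false sensor_type:=realsense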

By default, all the modules are set to true. For example, you can launch all the modules with a Kinect v2 by running:

roslaunch tracking single_camera_tracking_node.launch enable_pose:=true enable_people_tracking:=true enable_object:=true

or simply:

roslaunch tracking single_camera_tracking_node.launch

You can activate or deactivate modules by setting the corresponding flags to true or false.

Note: Depending on your graphics processor, running pose annotation alongside the other two modules may be too demanding for your node.
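If pose annotation overloads your GPU, you can disable it while keeping the other two modules enabled:

roslaunch tracking single_camera_tracking_node.launch enable_pose:=false enable_people_tracking:=true enable_object:=true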

Notes on the Zed

Initial tests have found that our default person detection algorithm does not work well with the Zed camera. While we work on this issue, we have added an option to perform YOLO-based person tracking, which filters object detections to look for people. To use YOLO-based person tracking with a single Zed camera, set yolo_based_people_tracking:=true:

roslaunch tracking single_camera_tracking_node.launch enable_people_tracking:=true enable_object:=false enable_pose:=false yolo_based_people_tracking:=true sensor_type:=zed

Note that enable_people_tracking should also be set to true.

Notes on the Azure

OpenPTrack v2.2 provides preliminary support for the Azure Kinect. Currently, only single camera person tracking is available; the object and pose modules are not.
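For example, to run person tracking on an Azure Kinect, launch with only the people tracking module enabled:

roslaunch tracking single_camera_tracking_node.launch enable_people_tracking:=true enable_object:=false enable_pose:=false sensor_type:=azure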

Manual Ground Plane Detection

When running person tracking, the ground plane is detected automatically. A reflective or dark colored floor can sometimes cause tracking to work poorly or not at all. This can be solved by setting the ground plane manually; see the Manual Ground Plane page for instructions.

Once the command is running, an RViz window will open automatically. RViz is used for visualizing data in OPT; see here for instructions on visualizing data in OPT. You can choose your topics and then save the configuration. OPT v2 comes with a default RViz configuration: SingleCameraTracking.rviz.
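If you need to reopen the visualization later, you can load the default configuration manually. The path below is an assumption that SingleCameraTracking.rviz ships inside the tracking package's conf directory; adjust it to wherever the file lives in your workspace:

rosrun rviz rviz -d $(rospack find tracking)/conf/SingleCameraTracking.rviz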

[Image: RViz configuration for single camera tracking]

You can also view the image captured by the sensor by checking the SimpleImage option.

[Image: Tracking + SimpleImage]
