Viewpoints optimization based on point cloud input to maximize an environment coverage objective

Perception-aware Trajectory Optimization

RA-L | Slides | Video

Perception-aware trajectory optimization based on point cloud visibility estimation in a camera frustum. The package is implemented as a ROS node. For more information, please have a look at the related project: https://github.com/tpet/rpz_planning.

Installation

Please follow the installation instructions in INSTALL.md

Before running the examples, download the data and place it in the package root: ./data

Running

Point cloud Visibility Estimation

Once the package is installed, run the launch file and specify the bag file location:

roslaunch trajectory_optimization pointcloud_processor.launch
rosbag play PATH_TO_BAG_FILE -r 5 --clock

Replace PATH_TO_BAG_FILE with the path to the bag file, for example: ./data/josef_2019-06-06-13-58-12_proc_0.1m.bag

In this example, the purple points are the whole cloud in the camera frame, while the grey points are only the visible ones (those not occluded by other points from the camera's perspective). The hidden point removal (HPR) implementation is based on the algorithm of Katz et al. The resulting point cloud is rendered on the image plane with the pytorch3d library.
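The core HPR idea can be sketched in a few lines: every point is reflected about a sphere centered at the camera ("spherical flipping"), and the points that end up on the convex hull of the flipped set, together with the camera itself, are the ones deemed visible. A minimal NumPy/SciPy sketch of that idea (the function name and radius_factor parameter are illustrative, not this package's API):

```python
import numpy as np
from scipy.spatial import ConvexHull

def hidden_point_removal(points, camera=np.zeros(3), radius_factor=100.0):
    """Return indices of points visible from `camera` (Katz et al. HPR).

    Spherical flipping: each point p is mapped to p + 2*(R - |p|)*p/|p|,
    a reflection about a sphere of radius R around the camera. Points on
    the convex hull of the flipped set (plus the camera) are visible.
    """
    p = points - camera                        # camera at the origin
    norms = np.linalg.norm(p, axis=1, keepdims=True)
    R = norms.max() * radius_factor            # sphere larger than the cloud
    flipped = p + 2.0 * (R - norms) * (p / norms)
    hull = ConvexHull(np.vstack([flipped, np.zeros(3)]))
    visible = set(hull.vertices)
    visible.discard(len(points))               # drop the camera vertex
    return sorted(visible)
```

A point directly behind a nearer point (as seen from the camera) flips to the interior of the hull and is correctly reported as occluded.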

In this example, the point cloud visibility is estimated separately for each individual camera (within its field of view and distance range). The combined point cloud is then visualized in the robot's base_link frame.
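The per-camera field-of-view and distance-range check can be sketched as a simple frustum mask; the combined cloud is then just the union of the per-camera masks after transforming each camera's points into a common frame. A hedged sketch (function name and default FOV/range values are assumptions, not the package's parameters); points are given in the camera frame with z pointing forward:

```python
import numpy as np

def in_frustum(pts_cam, hfov=np.deg2rad(90.0), vfov=np.deg2rad(60.0),
               d_min=0.5, d_max=10.0):
    """Boolean mask of points (N, 3) inside the camera frustum:
    in front of the camera, within the horizontal and vertical field
    of view, and within the distance range [d_min, d_max]."""
    x, y, z = pts_cam.T
    mask = z > 0                                   # in front of the camera
    mask &= np.abs(np.arctan2(x, z)) < hfov / 2    # horizontal FOV
    mask &= np.abs(np.arctan2(y, z)) < vfov / 2    # vertical FOV
    d = np.linalg.norm(pts_cam, axis=1)
    mask &= (d_min < d) & (d < d_max)              # distance range
    return mask
```

For several cameras, the combined visibility mask is the element-wise OR of each camera's mask computed on the cloud transformed into that camera's frame.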

Position Optimization

Ego-pose optimization based on visibility estimation of the point cloud observed in the camera frustum. In this example, the point colors encode a distance-based reward (relative to the camera frame). The white points are those currently observed by the camera.

roslaunch trajectory_optimization pose_optimization.launch
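One way to sketch such a pose optimization is to replace the hard visibility test with a smooth, distance-based reward and ascend its gradient. The toy version below is a stand-in, not the node's implementation: the Gaussian reward peaking at a preferred observation distance d_opt and the finite-difference gradient are illustrative assumptions (the package itself relies on differentiable rendering):

```python
import numpy as np

def smooth_visibility_reward(cam_xy, points, d_opt=3.0, sigma=1.0):
    """Smooth surrogate reward: each point contributes a Gaussian bump
    peaking at the preferred observation distance d_opt."""
    d = np.linalg.norm(points[:, :2] - cam_xy, axis=1)
    return np.exp(-(d - d_opt) ** 2 / (2.0 * sigma ** 2)).sum()

def optimize_pose(cam_xy, points, lr=0.1, steps=200, eps=1e-4):
    """Gradient ascent on the reward over the camera (x, y) position,
    using central finite differences for the gradient."""
    cam = np.asarray(cam_xy, dtype=float).copy()
    for _ in range(steps):
        g = np.zeros(2)
        for i in range(2):
            e = np.zeros(2)
            e[i] = eps
            g[i] = (smooth_visibility_reward(cam + e, points)
                    - smooth_visibility_reward(cam - e, points)) / (2 * eps)
        cam += lr * g            # ascend: move toward higher reward
    return cam
```

With a single point at the origin, the camera converges to the preferred observation distance d_opt from it.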

Waypoints Optimization

Camera pose (X, Y and yaw) optimization is applied sequentially to each sampled waypoint of an initial trajectory.
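The yaw component at a fixed (X, Y) can be illustrated with a simple exhaustive sweep: pick the heading that puts the most points inside the horizontal field of view. This brute-force sketch is an assumption for illustration, not the package's gradient-based method:

```python
import numpy as np

def best_yaw(cam_xy, points, hfov=np.deg2rad(90.0), n=360):
    """Sweep n candidate yaws and return the one that places the most
    points inside the horizontal field of view (X, Y held fixed)."""
    rel = points[:, :2] - cam_xy
    ang = np.arctan2(rel[:, 1], rel[:, 0])            # bearing to each point
    yaws = np.linspace(-np.pi, np.pi, n, endpoint=False)
    # wrapped angular difference between each candidate yaw and each bearing
    diff = np.abs((ang[None, :] - yaws[:, None] + np.pi) % (2 * np.pi) - np.pi)
    counts = (diff < hfov / 2).sum(axis=1)
    return yaws[np.argmax(counts)]
```

In the full (X, Y, yaw) problem this step would be repeated for each waypoint in turn, alternating with the position update.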

Trajectory Evaluation

A camera trajectory can be evaluated by the number of observed voxels (points in the cloud). Single-pose visibility estimates are combined using a log-odds representation, as is done in the OctoMap paper.
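The log-odds fusion works exactly like the OctoMap occupancy update: per-pose probabilities for a voxel are converted to log odds, summed, and converted back. A minimal sketch (function names are illustrative):

```python
import math

def logodds(p):
    """Probability -> log odds."""
    return math.log(p / (1.0 - p))

def prob(l):
    """Log odds -> probability (logistic function)."""
    return 1.0 / (1.0 + math.exp(-l))

def fuse(per_pose_probs):
    """Fuse single-pose visibility probabilities for one voxel by
    summing their log odds, as in the OctoMap occupancy update."""
    return prob(sum(logodds(p) for p in per_pose_probs))
```

Repeated weak observations of the same voxel reinforce each other (two 0.7 observations fuse to roughly 0.84), while a 0.5 observation is neutral and leaves the estimate unchanged.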

Trajectory Optimization

Based on the evaluation result, the trajectory (consisting of several waypoints) is optimized to increase the overall visibility score.

roslaunch trajectory_optimization trajectory_optimization.launch
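The evaluate-then-improve loop can be sketched with a deliberately simplified coverage score (a voxel counts as observed if any waypoint is within a sensing radius, standing in for the full frustum/visibility check) and random hill climbing over the waypoints. Both simplifications are assumptions for illustration; the package optimizes the differentiable visibility score by gradient ascent:

```python
import numpy as np

def trajectory_score(waypoints, voxels, d_max=4.0):
    """Number of voxels observed from at least one waypoint, using a
    simple sensing-radius check as a visibility stand-in."""
    seen = np.zeros(len(voxels), dtype=bool)
    for wp in waypoints:
        seen |= np.linalg.norm(voxels - wp, axis=1) < d_max
    return int(seen.sum())

def optimize_trajectory(waypoints, voxels, step=0.5, iters=50, seed=0):
    """Random hill climbing: perturb one waypoint at a time and keep
    the change whenever the coverage score does not decrease."""
    wps = np.array(waypoints, dtype=float)
    best = trajectory_score(wps, voxels)
    rng = np.random.default_rng(seed)
    for _ in range(iters):
        i = rng.integers(len(wps))
        cand = wps.copy()
        cand[i] += rng.normal(0.0, step, size=2)
        s = trajectory_score(cand, voxels)
        if s >= best:
            wps, best = cand, s
    return wps, best
```

By construction the score is monotonically non-decreasing over iterations, mirroring the idea that the optimized trajectory should never cover less of the environment than the initial one.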

Reference

Trajectory Optimization using Learned Robot-Terrain Interaction Model in Exploration of Large Subterranean Environments

@ARTICLE{9699042,
  author={Agishev, Ruslan and Petříček, Tomáš and Zimmermann, Karel},
  journal={IEEE Robotics and Automation Letters}, 
  title={Trajectory Optimization Using Learned Robot-Terrain Interaction Model in Exploration of Large Subterranean Environments}, 
  year={2022},
  volume={7},
  number={2},
  pages={3365-3371},
  doi={10.1109/LRA.2022.3147332}
}
