This ROS package presents a perception-to-manipulation system for picking thin objects from clutter. The manipulation consists of four steps: detecting an object instance, approaching the target from overhead, descending until the fingertip contacts the object, and tilting to adjust the final grasp pose. Object detection builds on Mask R-CNN, a deep neural network for instance segmentation, while descending and tilting are implemented with tactile sensors.
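As a rough illustration, the four steps can be read as a single pick routine. The function names below are hypothetical placeholders, not APIs provided by this package:

```python
# Hypothetical outline of the four-step pick pipeline; every helper here is a
# placeholder stub, not an actual function from this repository.

def detect_instances(rgb_image):
    """Step 1: run Mask R-CNN and return candidate instance masks (stub)."""
    return []

def approach_overhead(mask):
    """Step 2: move the gripper above the selected object (stub)."""

def descend_until_contact():
    """Step 3: lower the arm until the tactile sensors report contact (stub)."""

def tilt_and_grasp():
    """Step 4: tilt the fingertip to the final grasp pose and close the gripper (stub)."""

def pick_one(rgb_image):
    masks = detect_instances(rgb_image)
    if not masks:
        return False
    approach_overhead(masks[0])
    descend_until_contact()
    tilt_and_grasp()
    return True
```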
- Universal Robot UR10
- Robotiq 140mm Adaptive parallel-jaw gripper
- RightHand Labs' TakkStrip
- Arduino
- Realsense SR300
- ROS Kinetic
- Driver for UR10 robot arms from Universal Robots
- Universal Robot package for ROS Kinetic
- MoveIt!
- Robotiq ROS package
- Mask R-CNN
.ipynb files can be run in Jupyter Notebook. For other requirements, please check the Mask R-CNN repository carefully.
The following instructions will help you set up the software step by step.
- Follow the tutorials in the Universal Robot package for ROS Kinetic and the Robotiq ROS package to set up the hardware properly.
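  Once the driver and gripper nodes are up, a quick sanity check is to wait for a joint-state message from the arm. This is a minimal sketch and assumes the default /joint_states topic:

  ```python
  #!/usr/bin/env python
  # Minimal check that the UR10 driver is publishing joint states.
  # The /joint_states topic name assumes the default launch configuration.
  import rospy
  from sensor_msgs.msg import JointState

  rospy.init_node('ur10_check', anonymous=True)
  msg = rospy.wait_for_message('/joint_states', JointState, timeout=5.0)
  print('Joint names:', msg.name)
  print('Positions:', msg.position)
  ```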
- Run the Realsense SR300 camera in ROS. See link.
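  To confirm the color stream is available from Python, something like the following works. The /camera/color/image_raw topic name is an assumption and depends on the Realsense launch file you use:

  ```python
  #!/usr/bin/env python
  # Grab one color frame from the SR300 and convert it to an OpenCV image.
  # The topic name is an assumption; check `rostopic list` on your setup.
  import rospy
  from sensor_msgs.msg import Image
  from cv_bridge import CvBridge

  rospy.init_node('sr300_check', anonymous=True)
  msg = rospy.wait_for_message('/camera/color/image_raw', Image, timeout=5.0)
  frame = CvBridge().imgmsg_to_cv2(msg, desired_encoding='bgr8')
  print('Received a frame of shape:', frame.shape)
  ```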
- Connect the Arduino and the tactile sensor, and publish the sensor readings in ROS. See link.
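  The manipulation code only needs the readings to be available on a ROS topic. As an illustration, contact detection can be as simple as thresholding the published values; the topic name, message type, and threshold below are assumptions, not the ones used in this repository:

  ```python
  #!/usr/bin/env python
  # Flag fingertip contact when the tactile reading crosses a threshold.
  # Topic name, message type, and threshold are illustrative assumptions.
  import rospy
  from std_msgs.msg import Float32

  CONTACT_THRESHOLD = 0.5  # tune for your TakkStrip and mounting

  def on_reading(msg):
      if msg.data > CONTACT_THRESHOLD:
          rospy.loginfo('Fingertip contact detected: %.3f', msg.data)

  rospy.init_node('tactile_monitor')
  rospy.Subscriber('/takktile/reading', Float32, on_reading)
  rospy.spin()
  ```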
- Set up frames:
  cd scripts
  python frame_transform.py
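  For reference, a frame setup script of this kind typically broadcasts the camera-to-robot transform on /tf. The sketch below is only illustrative; the frame names and numbers are placeholders, and frame_transform.py holds the actual calibration:

  ```python
  #!/usr/bin/env python
  # Broadcast a camera-to-robot-base transform. Frame names and values are
  # placeholders; frame_transform.py contains the real calibration.
  import rospy
  import tf

  rospy.init_node('frame_broadcaster')
  br = tf.TransformBroadcaster()
  rate = rospy.Rate(50)
  while not rospy.is_shutdown():
      br.sendTransform((0.5, 0.0, 0.8),  # translation in metres
                       tf.transformations.quaternion_from_euler(0.0, 3.14, 0.0),
                       rospy.Time.now(),
                       'camera_link',    # child frame
                       'base_link')      # parent frame
      rate.sleep()
  ```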
- Open a terminal and run object detection:
  cd samples
  jupyter notebook
  Open instance_segmentation.ipynb. For loading BLISTER_MODEL_PATH, please refer to here.
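  Inference inside the notebook follows the usual Matterport Mask R-CNN pattern, roughly as sketched below. The config values, weights filename, and test image are placeholders; the notebook defines the real ones:

  ```python
  # Sketch of Mask R-CNN inference in the Matterport framework. The config
  # values, weights filename, and test image below are placeholders.
  import skimage.io
  from mrcnn.config import Config
  from mrcnn import model as modellib

  class InferenceConfig(Config):
      NAME = 'blister'
      GPU_COUNT = 1
      IMAGES_PER_GPU = 1
      NUM_CLASSES = 1 + 1  # background + thin object (assumption)

  BLISTER_MODEL_PATH = 'mask_rcnn_blister.h5'  # placeholder; see the link above

  model = modellib.MaskRCNN(mode='inference', config=InferenceConfig(), model_dir='./logs')
  model.load_weights(BLISTER_MODEL_PATH, by_name=True)

  image = skimage.io.imread('test_scene.png')  # placeholder image
  r = model.detect([image], verbose=0)[0]      # r['rois'], r['masks'], r['class_ids'], r['scores']
  ```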
- Open another terminal and run manipulation:
  cd scripts
  jupyter notebook
  Open thin_object_bin_pick_mani.ipynb.
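  The notebook drives the UR10 through MoveIt!. The descend-until-contact step can be pictured as commanding small downward motions while watching the tactile topic; the sketch below uses an assumed planning group name, tactile topic, threshold, and step size, not the notebook's actual values:

  ```python
  #!/usr/bin/env python
  # Sketch of tactile-guided descent with MoveIt!. Planning group, tactile
  # topic, threshold, and step size are illustrative assumptions.
  import sys
  import rospy
  import moveit_commander
  from std_msgs.msg import Float32

  moveit_commander.roscpp_initialize(sys.argv)
  rospy.init_node('descend_until_contact')
  group = moveit_commander.MoveGroupCommander('manipulator')

  latest = {'reading': 0.0}
  rospy.Subscriber('/takktile/reading', Float32,
                   lambda msg: latest.update(reading=msg.data))

  CONTACT_THRESHOLD = 0.5  # assumed tactile threshold
  STEP = 0.005             # descend 5 mm per iteration

  while not rospy.is_shutdown() and latest['reading'] < CONTACT_THRESHOLD:
      pose = group.get_current_pose().pose
      pose.position.z -= STEP
      group.set_pose_target(pose)
      group.go(wait=True)
      group.stop()

  rospy.loginfo('Contact detected; ready to tilt and grasp.')
  ```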
Zhekai Tong (ztong@connect.ust.hk) and Tierui He (theae@connect.ust.hk)