Our system consists of four machines connected over a local network:
- PC-A for robot trajectory planning & control
- PC-B for computer vision tasks
- UR5e robot arm (right side)
- UR5e robot arm (left side)
Clone this repository on both PC-A and PC-B:

```bash
cd ~
git clone https://github.com/VISTEC-IST-ROBOTICS-PUBLIC/PBD
```
Copy the computer vision package to your workspace on PC-B:

```bash
mkdir -p ~/dev_ws/src/                  # Create workspace (if it does not exist)
cp -r ~/PBD/dev_ws/src/vlib ~/dev_ws/src/
```
Since our system uses both ROS 1 and ROS 2, a middleware bridge is necessary for communication between the two sides. We chose the eProsima Integration Service for this task because of its simplicity.
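The actual configuration files ship with this repository. As a rough illustration only, an Integration Service config that bridges a topic between a ROS 1 and a ROS 2 system typically has the following shape (the topic name and message type below are placeholders, not the ones used by this project):

```yaml
systems:
  ros1: { type: ros1 }
  ros2: { type: ros2 }

routes:
  ros1_to_ros2: { from: ros1, to: ros2 }

topics:
  example_pose:                         # placeholder topic name
    type: "geometry_msgs/PoseStamped"   # placeholder message type
    route: ros1_to_ros2
```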
Use camera_calibration.yaml to launch the eProsima Integration Service during Robot ↔ Camera frame calibration.
```bash
# Open a new session
# Source ROS 1, ROS 2, and the Integration Service
cd ~/is-workspace
source /opt/ros/noetic/setup.bash
source /opt/ros/foxy/setup.bash
source ~/is-workspace/install/setup.bash
# Launch the Integration Service with the prepared config file
integration-service ${PATH_TO}/camera_calibration.yaml   # Replace ${PATH_TO} with the directory holding the config
```
After installation, we need to perform the Robot ↔ Camera frame calibration to obtain a precise transformation of the camera frame relative to the robot frame.
```bash
cd ~/dual_arm_driver/scripts/
python camera_frame_calibrate.py
```
Don't forget to set up ROS_MASTER_URI and ROS_IP before launching the Integration Service:

```bash
export ROS_MASTER_URI=http://192.168.1.11:11311   # Replace with PC-A's IP address
export ROS_IP=192.168.1.46                        # Replace with PC-B's IP address
```
```bash
# New session
cd ~/dev_ws/                          # Enter workspace
source /opt/ros/foxy/setup.bash       # Source ROS 2 installation
colcon build                          # Build the package
source install/setup.bash             # Source the package installation
ros2 launch vlib_core demo.launch.py  # Launch Camera & ArUco Cube Pose Estimator node
```
This node will command the robot arm to move to 25 random poses (configurable via the calibration_rounds node parameter).
```bash
# New session
ros2 run vlib camera_frame_calibration.py
```
During calibration, this node will gather both
We already know
For each pair of
We use the script calibration_postprocess.py for this step and compute the mean of
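The internals of calibration_postprocess.py are not shown here. As a minimal sketch of how a set of repeated camera-to-robot estimates can be averaged (the function name and the (x, y, z, w) quaternion convention are our assumptions, not the script's actual API): translations average component-wise, while quaternions must first be sign-aligned, since q and -q encode the same rotation.

```python
import numpy as np

def average_transforms(translations, quaternions):
    """Average repeated (translation, quaternion) calibration estimates.

    translations: (N, 3) array-like; quaternions: (N, 4) array-like in (x, y, z, w).
    Hypothetical helper illustrating one way to post-process calibration samples;
    not the actual implementation of calibration_postprocess.py.
    """
    t_mean = np.mean(np.asarray(translations, dtype=float), axis=0)

    q = np.asarray(quaternions, dtype=float)
    # Align signs to the first sample: q and -q represent the same rotation.
    signs = np.sign(q @ q[0])
    signs[signs == 0] = 1.0
    q = q * signs[:, None]

    # Normalized arithmetic mean is a reasonable approximation for
    # tightly clustered rotations, as repeated calibration samples are.
    q_mean = q.mean(axis=0)
    q_mean /= np.linalg.norm(q_mean)
    return t_mean, q_mean
```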
```bash
cd ~/PBD/hand_detector/
python hand_realsense_3D.py
```

```bash
cd ~/dual_arm_driver/scripts/
python A_dmp_hand.py
```
Detect the pre-defined obstacle and publish its 3D position:

```bash
ros2 run vlib_core cup_detector.py   # 3D object detector (cup)
```
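The detector's internals are not documented here. Conceptually, a 3D object detector like this combines a 2D image detection with the aligned depth image; the core geometric step is pinhole back-projection. A minimal sketch (function name and intrinsic values are illustrative, not the node's actual API):

```python
import numpy as np

def deproject_pixel(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth (metres) into a 3-D point
    in the camera frame, using the pinhole camera model with focal
    lengths (fx, fy) and principal point (cx, cy)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])
```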
Launch the robot bringup and run the interface files:

```bash
roslaunch dual_arm_driver dual_arm_bringup.launch
rosrun dual_arm_driver mux_joint_node.py
rosrun dual_arm_driver RobotVelocityInterface.py
```
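The interface of mux_joint_node.py is not documented here. As a sketch of what a joint-command multiplexer typically does (the priority scheme and 6-DoF assumption are ours, not the node's actual logic): it forwards the command from the highest-priority active source and falls back to a safe stop when no source is active.

```python
def select_command(sources):
    """Pick the joint-velocity command from the highest-priority active source.

    sources: list of (priority, velocities-or-None) tuples, where None means
    that source is currently silent. Hypothetical logic illustrating a typical
    command mux; not the actual implementation of mux_joint_node.py.
    """
    active = [(priority, vel) for priority, vel in sources if vel is not None]
    if not active:
        return [0.0] * 6  # safe stop: zero velocity for a 6-DoF UR5e
    return max(active, key=lambda pv: pv[0])[1]
```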
Launch the conveyor bringup and run the script to command its speed:

```bash
roslaunch ros_conveyor conveyor_bringup.launch
```

```bash
cd ~/dual_arm_driver/scripts/
python conveyor_test.py
```
Run this node to use ADAP-DMPs:

```bash
cd ~/dual_arm_driver/scripts/
python A_dmp_hand.py
```
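The ADAP-DMP formulation itself lives in A_dmp_hand.py. For orientation only, a minimal 1-D point-attractor DMP transformation system (without the learned forcing term or the adaptation extensions of ADAP-DMPs) can be integrated as below; the gains and step size are illustrative choices, not the values used by the script.

```python
import numpy as np

def dmp_rollout(y0, goal, T=1.0, dt=0.01, alpha=25.0, beta=25.0 / 4):
    """Integrate the DMP transformation system  dd_y = alpha*(beta*(goal - y) - d_y)
    with explicit Euler steps. With beta = alpha / 4 the system is critically
    damped, so y converges to the goal without overshoot."""
    y, dy = float(y0), 0.0
    traj = [y]
    for _ in range(int(T / dt)):
        ddy = alpha * (beta * (goal - y) - dy)
        dy += ddy * dt
        y += dy * dt
        traj.append(y)
    return np.array(traj)
```

A full DMP adds a phase-driven forcing term to this spring-damper so the shape of a demonstrated trajectory is reproduced while the attractor still guarantees convergence to the goal.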