The repository contains a simulation of a UR3 robot, prepared for training before the ERC 2022 competition. The simulation uses the UR3 arm model from the [ROS-Industrial Universal Robot repository](https://github.com/ros-industrial/universal_robot). A cell model, an IMU box, a box with a lid, buttons, and a gripper with a camera have been added. The gripper also uses a Gazebo plugin to create mimic joints ([roboticsgroup_gazebo_plugins](https://github.com/roboticsgroup/roboticsgroup_gazebo_plugins)). The files necessary for motion planning have also been modified to take the added elements into account. All changed and added files are located in this repository.
The gripper used was taken from the Printables website.
A block has been added to the gripper where the plugs are located, in order to avoid collisions with them.
The simulation is mainly developed and tested on Ubuntu 20.04 Focal Fossa with ROS Noetic Ninjemys, so this is the recommended setup.
For the simulation to work properly, install the dependencies and download the repositories by running the following commands:
```bash
source /opt/ros/noetic/setup.bash
sudo apt-get update && sudo apt-get upgrade -y && sudo apt-get install -y lsb-core g++
sudo apt-get install -y git
sudo rosdep init && rosdep update
sudo apt install -y ros-noetic-moveit
sudo apt install -y "ros-noetic-ros-controllers*"
mkdir -p /catkin_ws/src
cd /catkin_ws/src
git clone -b kinetic-devel https://github.com/ros-industrial/universal_robot.git
sudo rm -r universal_robot/ur_msgs
git clone https://github.com/roboticsgroup/roboticsgroup_gazebo_plugins
git clone https://github.com/Michal-Bidzinski/UR3_sim.git
cd /catkin_ws
catkin_make
source devel/setup.bash
```
To run the UR3 simulation in Gazebo with MoveIt! and the RViz GUI, including an example cell:

```bash
$ roslaunch ur3_sim simulation.launch
```
To run the UR3 simulation in Gazebo with MoveIt! and the RViz GUI, containing only the robot with a gripper and a camera (as in the real setup):

```bash
$ roslaunch ur3_sim real_station.launch
```
The arm can be controlled using MoveIt!. Motion planning can be performed using, for example, the Move Group Interface. The MoveIt! tutorials written in Python and C++ can serve as a starting point (note the robot group name and the number of joints).
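As a hint for getting started, the sketch below shows a minimal joint-space motion through the Move Group Python interface. The group name `"manipulator"` and the joint goal values are assumptions — check the SRDF in this repository for the actual group name, and mind the number of joints:

```python
import math

# A hypothetical joint goal for the 6-DOF UR3 arm, in radians.
# These values are illustrative only.
JOINT_GOAL = [0.0, -math.pi / 2, math.pi / 2, 0.0, math.pi / 2, 0.0]

def plan_and_execute(joint_goal):
    """Plan and execute a joint-space motion with MoveIt!.

    Requires a sourced workspace and a running simulation,
    so the ROS imports happen inside the function.
    """
    import sys
    import rospy
    import moveit_commander

    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node("ur3_moveit_demo")

    # "manipulator" is an assumed group name; verify it in the SRDF.
    group = moveit_commander.MoveGroupCommander("manipulator")
    group.go(joint_goal, wait=True)
    group.stop()
```

Call `plan_and_execute(JOINT_GOAL)` after launching the simulation; the planner will move the arm to the given joint configuration.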
The gripper is controlled by publishing commands on the topic `/gripper_command` (message type: `std_msgs/String`). To control the gripper, send one of the following commands:
- open
- semi_open (for catching the IMU box)
- semi_close (for catching the lid of the box)
- close
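The commands above can be sent from a small `rospy` script. The sketch below is a minimal example; the helper `make_command` is hypothetical and only validates the string before publishing:

```python
# Gripper command strings listed in this README.
VALID_COMMANDS = {"open", "semi_open", "semi_close", "close"}

def make_command(name):
    """Validate a gripper command string before publishing it."""
    if name not in VALID_COMMANDS:
        raise ValueError("unknown gripper command: %s" % name)
    return name

def send_gripper_command(name):
    """Publish a command on /gripper_command.

    Requires a running roscore and the simulation launched,
    so the ROS imports happen inside the function.
    """
    import rospy
    from std_msgs.msg import String

    rospy.init_node("gripper_demo")
    pub = rospy.Publisher("/gripper_command", String, queue_size=1)
    rospy.sleep(1.0)  # give the publisher time to register
    pub.publish(String(data=make_command(name)))
```

For example, `send_gripper_command("semi_open")` prepares the gripper for catching the IMU box.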
The simulation includes a camera placed on a gripper that detects aruco tags and other items. The following topics are published:
- /camera_image/camera_info
- /camera_image/image_raw
- /camera_image/image_raw/compressed
- /camera_image/image_raw/compressed/parameter_descriptions
- /camera_image/image_raw/compressed/parameter_updates
- /camera_image/image_raw/compressedDepth
- /camera_image/image_raw/compressedDepth/parameter_descriptions
- /camera_image/image_raw/compressedDepth/parameter_updates
- /camera_image/image_raw/theora
- /camera_image/image_raw/theora/parameter_descriptions
- /camera_image/image_raw/theora/parameter_updates
Although the topics with "Depth" in their name are published by the default ROS node, the camera does not capture any depth data.