
Grasping Cell


This repository contains the files of a project by the International Laboratory of Biomechatronics and Energy-Efficient Robotics at ITMO University to create a universal platform for sorting known and unknown objects into cells. The system detects and classifies objects in space using machine-vision data, then sorts them into the cells assigned to each object while moving through a workspace with obstacles. The system also supports automated retraining, which allows it to learn new objects and sort them without reconfiguring the software platform.

Overview

Hardware

The system is based on the KUKA LBR iiwa manipulator, a collaborative arm with 7 degrees of freedom that is safe for humans and can work next to a person without risk of injury. The vision system is based on an Intel RealSense D435i camera, whose stereo and depth sensors determine the shape, size, and distance of objects in space with high accuracy.

The hardware list

  • KUKA LBR iiwa 820
  • Intel RealSense D435
  • Gripper (this one looks pretty good)
  • PC with an RT kernel
  • PC with an NVIDIA GPU
  • Network switch
  • 5 m USB Type-C cable
  • Several Ethernet cables

Software

The software platform is based on the MoveIt! framework and consists of an object detection/classification module, a motion planning module, an object grasping module, and a retraining module. The modules interact through a finite state machine (FSM). The sorting-mode work cycle is:

  1. Go to the start position
  2. Detect and classify objects
  3. Detect and segment the nearest object
  4. Generate possible manipulator configurations for grasping the object
  5. Plan the robot's motion from the current configuration to the grasping configuration
  6. Move to the new configuration
  7. Grasp the object
  8. Plan the motion to the cell
  9. Lower the object into the cell
  10. Return to the start position

The block diagram of the system operation is shown in the figure below:
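The sorting cycle above can be sketched as a minimal state machine in plain Python. The step names and the stubbed module callables below are illustrative assumptions, not the project's actual API; in the real system the FSM dispatches requests to ROS nodes instead:

```python
# Minimal sketch of the sorting-mode cycle as a plain-Python state machine.
# All module calls are stubbed placeholders for illustration only.

SORTING_CYCLE = [
    "go_to_start_position",
    "detect_and_classify_objects",
    "segment_nearest_object",
    "generate_grasp_configurations",
    "plan_motion_to_grasp",
    "move_to_grasp_configuration",
    "grasp_object",
    "plan_motion_to_cell",
    "lower_object_into_cell",
    "return_to_start_position",
]


def run_sorting_cycle(modules):
    """Run one pass of the cycle; each module callable returns True on success."""
    trace = []
    for state in SORTING_CYCLE:
        ok = modules[state]()  # FSM request to the corresponding module
        trace.append((state, ok))
        if not ok:  # abort the pass on the first failed step
            break
    return trace


# Usage with trivially succeeding stubs:
stubs = {name: (lambda: True) for name in SORTING_CYCLE}
trace = run_sorting_cycle(stubs)
```

On failure the sketch simply aborts the pass; the real FSM decides the next action from the module responses.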

[Figure: flowchart of the system operation]

Interaction with the system occurs through a graphical user interface (GUI), which enables and disables the system and switches between automated-sorting mode and retraining mode for new objects. The finite state machine (FSM) sends requests to the modules and, based on their responses, determines the system's next actions. The general architecture of the system is shown in the figure below:

[Figure: general system architecture]

How to use

Prerequisites

Installation

For the PC with RT-kernel Linux

  1. Set up the network interface with the address 172.31.1.150/16
  2. Add to ~/.bashrc:

```sh
export ROS_HOSTNAME=172.31.1.150
export ROS_IP=172.31.1.150
export ROS_MASTER_URI=http://172.31.1.150:11311
```

  3. Create and set up a ROS workspace
  4. Install iiwa_stack
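The network variables above are easy to get subtly wrong. Below is a small hypothetical helper (not part of the repository) that sanity-checks a ROS network configuration like the one in step 2:

```python
# Hypothetical validator for the ROS networking variables set in ~/.bashrc.
from urllib.parse import urlparse


def check_ros_env(env):
    """Return a list of problems found in a ROS network configuration dict."""
    problems = []
    for var in ("ROS_HOSTNAME", "ROS_IP", "ROS_MASTER_URI"):
        if var not in env:
            problems.append(f"{var} is not set")
    uri = urlparse(env.get("ROS_MASTER_URI", ""))
    if uri.scheme != "http" or uri.port != 11311:
        problems.append("ROS_MASTER_URI should look like http://<master-ip>:11311")
    if env.get("ROS_HOSTNAME") != env.get("ROS_IP"):
        problems.append("ROS_HOSTNAME and ROS_IP are expected to match here")
    return problems


# Example: the rt-kernel PC configuration from step 2.
env = {
    "ROS_HOSTNAME": "172.31.1.150",
    "ROS_IP": "172.31.1.150",
    "ROS_MASTER_URI": "http://172.31.1.150:11311",
}
```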

For the PC with NVIDIA graphics

  1. Set up the network interface with the address 172.31.1.151/16
  2. Add to ~/.bashrc:

```sh
export ROS_HOSTNAME=172.31.1.151
export ROS_IP=172.31.1.151
export ROS_MASTER_URI=http://172.31.1.150:11311
```

  3. Create and set up a ROS workspace
  4. Install the Intel RealSense SDK
  5. Install this repository:

```sh
roscd
mkdir ../src
cd ../src
git clone --recursive git@github.com:be2rlab/grasping_cell.git
catkin build -j8
```
  6. (Optional) Set up a local network between the robot and the computer (not required to run the simulation on the computer).

  7. (Optional) Install the OMPL planner library with the Modified Intelligent Bidirectional Fast Exploring Random Tree planner (the OMPL library built into MoveIt! also works).

  8. Install the grasp generation and object recognition modules.
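The optional planner in step 7 is a bidirectional RRT variant. Its core idea, growing two trees from the start and goal configurations and repeatedly trying to connect them, can be sketched in plain Python. This is a generic textbook bidirectional RRT on an obstacle-free 2D plane, not the repository's modified algorithm; the parameters and search bounds are illustrative assumptions:

```python
# Generic bidirectional RRT on an obstacle-free 2D plane (illustrative only).
import math
import random


def bidirectional_rrt(start, goal, step=0.5, iters=4000, join_tol=0.5, seed=0):
    """Grow two trees from start and goal; return a path when they connect."""
    rng = random.Random(seed)
    # Each tree maps a node to its parent; the roots have parent None.
    trees = [{start: None}, {goal: None}]

    def nearest(tree, q):
        return min(tree, key=lambda n: math.dist(n, q))

    def steer(frm, to):
        d = math.dist(frm, to)
        if d <= step:
            return to
        t = step / d
        return (frm[0] + t * (to[0] - frm[0]), frm[1] + t * (to[1] - frm[1]))

    def path_to_root(tree, node):
        out = []
        while node is not None:
            out.append(node)
            node = tree[node]
        return out

    for _ in range(iters):
        q_rand = (rng.uniform(-10, 10), rng.uniform(-10, 10))
        a, b = trees
        q_near = nearest(a, q_rand)   # extend tree a toward the sample...
        q_new = steer(q_near, q_rand)
        a[q_new] = q_near
        q_conn = nearest(b, q_new)    # ...then try to connect tree b to it
        if math.dist(q_conn, q_new) <= join_tol:
            p1, p2 = path_to_root(a, q_new), path_to_root(b, q_conn)
            if start in a:            # orient the joined path from start to goal
                return p1[::-1] + p2
            return p2[::-1] + p1
        trees.reverse()               # swap tree roles (the "bidirectional" part)
    return None


path = bidirectional_rrt((0.0, 0.0), (5.0, 5.0))
```

MoveIt!'s built-in OMPL planners (e.g. RRTConnect) follow the same connect-two-trees scheme, which is why they are a drop-in substitute in step 7.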

Usage

It is recommended to execute each step in a separate terminal window so that the state of every module can be monitored.

  1. [for PC with rt-kernel] Start the KUKA control system (choose one of the following three commands (a, b, or c), then continue with step 2):

a. In simulation (RViz):

```sh
roslaunch iiwa_moveit demo.launch
```

b. For execution in Gazebo:

```sh
roslaunch iiwa_moveit moveit_planning_execution.launch sim:=true
```

c. For connecting to the real robot:

```sh
roslaunch iiwa_moveit moveit_planning_execution.launch sim:=false
```

  2. [for PC with rt-kernel] Start the pick-and-place node:

```sh
roslaunch iiwa_move_group_interface move_group_interface_iiwa.launch
```

  3. [for PC with nvidia graphics] Start the object recognition module according to this.
  4. [for PC with nvidia graphics] Start the grasp generation module according to this.
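After launching everything, one can verify that the expected nodes are up by inspecting the output of `rosnode list`. The helper and the node names below are hypothetical assumptions for illustration; substitute the node names your launch files actually advertise:

```python
# Hypothetical check that expected nodes appear in `rosnode list` output.
# The node names here are assumptions, not taken from the repository.
EXPECTED = ["/move_group", "/move_group_interface_iiwa"]


def missing_nodes(rosnode_list_output, expected=EXPECTED):
    """Return the expected node names absent from the `rosnode list` output."""
    running = set(rosnode_list_output.split())
    return [n for n in expected if n not in running]


# Example output captured from a terminal:
sample = "/rosout\n/move_group\n/move_group_interface_iiwa\n"
```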

The system is now ready to use!

Notes

Supported by

BE2RLab of ITMO University: the site and Github

Citation

@misc{grasping2022cell,
    author =   {},
    title =    {Universal platform for sorting known and unknown objects by cells},
    howpublished = {\url{https://github.com/be2rlab/grasping_cell}},
    year = {2022}
}
