
The source code of the paper "Learning High-DOF Reaching-and-Grasping via Dynamic Representation of Gripper-Object Interaction"


qijinshe/IBS-Grasping


This project is based on our SIGGRAPH 2022 paper, Learning High-DOF Reaching-and-Grasping via Dynamic Representation of Gripper-Object Interaction.

IBS-Grasping Teaser

Introduction

In this project, we adopt the Interaction Bisector Surface (IBS), a surface composed of points equidistant from two nearby objects, as the observation representation for learning high-DOF reach-and-grasp planning. We found that the IBS effectively guides the motion of the gripper.

Installation

This project partly depends on ROS, so you need to install ROS first (in our environment, we use ROS Melodic).

The necessary Python packages are listed in the file "requirement.txt". You can install them all with:

pip install -r requirement.txt

"deepdifferentiablegrasp" is also important in our project. Here we only provide compiled files. Note that specific versions of 'Boost'(1.58) and 'Mosek'(9.0) are necessary to run these files.

Finally, compile the ROS packages in the root directory:

catkin_make

Running

First, source devel/setup.bash to configure the environment:

source devel/setup.bash

Before testing or training, you need to start the IBS computation service:

rosrun ibs_grasping ibs_env

For testing:

rosrun ibs_grasping main.py --model_name [model_name]

For a quick test:

rosrun ibs_grasping main.py --model_name [model_name] --quick

For training:

rosrun ibs_grasping main.py --train_model

Pretrained models are also provided.

Visualization

You can use RViz to visualize the IBS. A configuration file (ibs_visualization.rviz) is provided in the root directory, e.g.:

rviz -d ibs_visualization.rviz

Data Preparation

The objects used in this work can be downloaded here; the set includes 500 watertight objects collected from four datasets (KIT, GD, YCB, BigBIRD) as well as their feasible grasps.

The data processing script is provided in the root directory. It removes objects unsuitable for grasping and generates the following files for you:

  • ".pcd": pointclouds sampled from origin meshes, used for IBS computation.
  • "_vhacd.obj" and ".urdf": approximate convex decomposition of objectsusing the VHACD algorithm and the URDF wrappers of objects, used for Pybullet Simulation.
  • ".bvh" (optional): precomputed files, used for grasp quality computation. If you don't want to train models or compute grasp quality, you don't need to generate these files.

Acknowledgments

Some of our code is modified from the following projects. Thanks for their excellent work:

pytorch-soft-actor-critic

AdaGrasp

deepdifferentiablegrasp (see "contact")

Citation

If you are interested in this work, please cite the following paper:

@article{she_sig22,
    title = {Learning High-DOF Reaching-and-Grasping via Dynamic Representation of Gripper-Object Interaction},
    author = {Qijin She and Ruizhen Hu and Juzhan Xu and Min Liu and Kai Xu and Hui Huang},
    journal = {ACM Transactions on Graphics (SIGGRAPH 2022)},
    volume = {41},
    number = {4},
    year = {2022}
}

License

The source code is released under the GPLv3 license.

Contact

If you have any questions, feel free to email Qijin She (qijinshe@outlook.com).

If you want the complete code of "deepdifferentiablegrasp", please contact Min Liu (gfsliumin@gmail.com).
