
Library for Shape Registration suited for Transfer of Grasping Skills





Shape Registration

Code accompanying the ICRA paper: "Transferring Grasping Skills to Novel Instances by Latent Space Non-Rigid Registration" [Paper]

Project page:!/ShapeSpaceRegistration

This repository provides a tool for category-based, shape (latent) space, non-rigid registration.

  • Starting with just a number of instances belonging to your category, this framework can deform a chosen canonical instance into novel observed instances of that category.
  • Inference can be performed from a single view of the object, and is thus suitable for on-line applications.
  • Novel instances can also be generated by interpolating and extrapolating between the training samples in the constructed shape space.
  • Integrates with ROS, so the observed model can be defined directly from a ROS topic.
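The shape-space idea behind the list above can be sketched in a few lines. This is a conceptual NumPy illustration under assumed shapes and names, not code from this repository: each training instance is represented as a deformation field of the canonical model, PCA over those fields yields a low-dimensional latent space, and novel instances are generated by decoding a latent vector back into a deformation of the canonical shape.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: N instances, each represented as a
# deformation field of a canonical model with M points (flattened
# to 3*M values per instance).
N, M = 11, 50
fields = rng.normal(size=(N, 3 * M))

# Build the latent shape space: PCA via SVD on the mean-centred fields.
mean = fields.mean(axis=0)
U, S, Vt = np.linalg.svd(fields - mean, full_matrices=False)
k = 3                      # chosen shape-space dimension
components = Vt[:k]        # principal deformation directions, (k, 3*M)

# Generate a novel instance: pick a latent vector z, decode it into a
# deformation field, and apply that field to the canonical model.
z = np.array([0.5, -0.2, 0.1])
novel_field = mean + z @ components
canonical = rng.normal(size=(M, 3))
novel_shape = canonical + novel_field.reshape(M, 3)
```

Interpolating or extrapolating between training samples then amounts to moving `z` between (or beyond) the latent coordinates of the training instances.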

The major contributors of this repository include Diego Rodriguez, Florian Huber and Corbin Cogswell.

(Figures: overview and grasping demonstration)

Citing Shape Registration

If you find Shape Registration useful in your research, please consider citing:

@inproceedings{rodriguez2018transferring,
  title     = {Transferring Grasping Skills to Novel Instances by Latent Space Non-Rigid Registration},
  author    = {D. Rodriguez and C. Cogswell and S. Koo and S. Behnke},
  booktitle = {IEEE Int. Conf. on Robotics and Automation (ICRA)},
  year      = {2018}
}



Installation

  1. Install dependencies (replace <distro> with kinetic or melodic):
$ sudo apt install ros-<distro>-pcl-ros libceres-dev libvtk6-dev libqwt-qt5-dev
  2. Clone this repository into your ROS workspace:
$ git clone
  3. Build it! We recommend using catkin tools (python-catkin-tools):
$ catkin config --cmake-args -DCMAKE_BUILD_TYPE=Release -DCMAKE_CXX_FLAGS="-Wall -std=c++11"
$ catkin build shape_registration


You can play the video shape_space_code_release.mp4 for a brief overview of the capabilities of the software.

General pipeline:

  1. Prepare the dataset of your category as Point Cloud Data (.pcd).

  2. Run the GUI launch file:

    $ roslaunch shape_registration gui.launch
  3. Calculate deformation fields: Create a new category by clicking the "Create new category" button and load your models with the "Load instances" button. Your instances should now be listed on the right side; the first and second are displayed as the canonical (red) and the observed (green) instances. To calculate the deformation fields, click "Calculate deformation fields". Save your progress with the "Save category" button.

  4. Load a category: Use the "Load category" button and select the folder named after the category you want to load. After this, all your instances should appear together with the results of the Coherent Point Drift (CPD) registration.

  5. Find the shape space: Change to "Testing". Select the dimension of the shape space, then press the "Calculate PCA" button. The object displayed in blue shows the inferred model. You can visualize the contribution of each dimension of the shape space using the interactive plot.

  6. Infer shape parameters of novel instances: Click on "Point Cloud File" or "Point Cloud Topic" to define an observed instance. After clicking "Fit to Observed", the latent space vector that best matches the observed instance is found.

  7. Example dataset: The dataset folder contains 11 point clouds of drills for training a category, plus 3 testing instances.
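The "Fit to Observed" step (6) can be illustrated conceptually with the same hypothetical PCA shape space sketched earlier. This is a minimal NumPy sketch, not this repository's solver: with an orthonormal PCA basis, the least-squares estimate of the latent vector is simply the projection of the mean-centred observed deformation onto the principal components.

```python
import numpy as np

rng = np.random.default_rng(1)

M, k = 50, 3
mean = rng.normal(size=3 * M)

# Orthonormal principal deformation directions, as PCA/SVD would return.
components, _ = np.linalg.qr(rng.normal(size=(3 * M, k)))
components = components.T                      # shape (k, 3*M)

# Simulated observation: a deformation decoded from some true latent
# vector, plus a little measurement noise.
z_true = np.array([0.8, -0.3, 0.2])
observed = mean + z_true @ components + 0.01 * rng.normal(size=3 * M)

# Least-squares fit of the latent vector: with orthonormal rows in
# `components`, this reduces to projecting the centred observation.
z_fit = components @ (observed - mean)         # z_fit is close to z_true
```

In the real system the observation is a partial, single-view point cloud without known correspondences, so the fit is an iterative non-rigid registration rather than a single projection, but the latent vector being optimized plays the same role.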


Code has been tested under:

  • Ubuntu 16.04, ROS Kinetic
  • Ubuntu 18.04, ROS Melodic

Stay tuned

Other works you might find interesting:
