
Welcome to the interactive_perception wiki!

# Interactive Perception

Interactive Perception is a holistic approach to autonomous manipulation. It shifts the emphasis away from individual subfields such as control, planning, reasoning, learning, and perception, and towards the implementation of task-specific capabilities.

In interactive perception, we understand the world in terms of functionality and behavior. As a result, manipulation becomes an integral part of perception; conversely, purposeful manipulation depends on the robot's ability to curiously explore its unknown environment through interaction.

Interest in interactive perception has been growing recently. Published results include interactive perception methods for object segmentation, modeling, and grasping, and even for learning manipulation skills through interaction.

Interestingly, these results are appearing independently in the different relevant communities: perception, grasping, manipulation, and learning. Our objective is to make the existing resources and knowledge available to researchers. We invite you to join us by contributing relevant videos and publications to our interactive perception webpage and, above all, by sharing your implementation with the community by committing to this repository!

# Interactive Perception Library (IPL)

## Installation

IPL works with ROS Fuerte. Check out the repo and type:

    rosmake plugin_manager push_point_example push_point_example2 static_segmentation_example

## How to run

    roslaunch plugin_manager plugin_manager.launch

You will see a 3D visualizer with a loaded point cloud. There is also a dynamic_reconfigure window, which serves as a GUI for picking the steps and the specific implementations you want to use. The first list selects the step you would like to run, while the second and third lists select the specific implementations of the steps.

It is worth mentioning that the instances of the push_point and static_segmentation implementations are loaded at run-time. In the terminal you should see a ROS_INFO message that specifies which implementation was run.

This software shows how different implementations of pipeline steps can be swapped at run-time, as long as they follow the same interfaces defined in the interactive_perception_interface package. It is a very simple example that demonstrates the main functionality and can be extended to implement actual steps of the interactive perception pipeline.

## Goal

The main goal of this library is to collect the code of the different laboratories involved in interactive perception. We came up with a framework that makes it possible to switch between different implementations of each step of the interactive perception pipeline at run-time.

## Architecture

The library is built around the abstract factory pattern and mainly uses pluginlib from ROS. The main executable can be found in the plugin_manager package. It provides a small user interface together with a visualization tool (the visualizer package). The user interface makes it possible to invoke the different steps, and the specific implementations of these steps, through the dynamic_reconfigure package.

To run the main application that demonstrates the usage, execute `rosrun plugin_manager plugin_manager` and open the dynamic_reconfigure GUI.
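
To make the mechanism concrete, here is a minimal sketch of the factory side, assuming a hypothetical step interface called `interactive_perception_interface::PushPointInterface` and a plugin lookup name `push_point_example/PushPointExample` (both names are illustrative assumptions, not the library's actual identifiers):

```cpp
// Minimal sketch of the pluginlib-based abstract factory (names assumed).
#include <ros/ros.h>
#include <pluginlib/class_loader.h>
#include <boost/shared_ptr.hpp>
#include <interactive_perception_interface/push_point_interface.h>  // assumed header

int main(int argc, char** argv)
{
  // The class loader acts as the factory: it knows every plugin that
  // declares PushPointInterface as its base class.
  pluginlib::ClassLoader<interactive_perception_interface::PushPointInterface>
      loader("interactive_perception_interface",
             "interactive_perception_interface::PushPointInterface");

  try
  {
    // The lookup name could come from dynamic_reconfigure, which is how an
    // implementation can be swapped without recompiling or restarting.
    // Note: createInstance() is the post-Fuerte API; Fuerte itself shipped
    // createClassInstance(), which returns a raw pointer instead.
    boost::shared_ptr<interactive_perception_interface::PushPointInterface>
        step = loader.createInstance("push_point_example/PushPointExample");
    // From here on, the caller only ever sees the common interface.
  }
  catch (const pluginlib::PluginlibException& ex)
  {
    ROS_ERROR("Failed to load plugin: %s", ex.what());
  }
  return 0;
}
```

Because only the base class is named at compile time, new implementations can be dropped in as separate packages without touching plugin_manager itself.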

## Interface

After surveying the existing implementations, we identified a number of recurring steps that became crucial for our library (a minimal sketch of one interface follows the list):

- Static Segmentation - to infer which parts of the scene are most likely segmented incorrectly
- Feature Extraction - to extract the features that will be tracked
- Push Point - to find the best point at which to interact with objects
- Manipulation - to move the robot in order to interact with objects in the scene
- Tracker - to track the previously extracted features during the robot's movement
- Trajectory Clustering - to cluster the trajectories of the features that moved in the same manner
- Full Reconstruction - to reconstruct a full model of the object: it takes the sparse representation (the clustered features) and produces a dense model of the object
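
As a hedged illustration of what such a step interface might look like, here is a minimal sketch; the class name, method signature, and point cloud types are our assumptions, and the actual definitions live in the `interactive_perception_interface` package:

```cpp
// Hypothetical sketch of a step interface (names and signature assumed).
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

namespace interactive_perception_interface
{
// A push-point step receives the current scene cloud and proposes a point
// the robot should poke to disambiguate the segmentation.
class PushPointInterface
{
public:
  virtual ~PushPointInterface() {}

  // Every implementation (e.g. push_point_example) must provide this.
  virtual pcl::PointXYZ findPushPoint(
      const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& scene) = 0;
};
}  // namespace interactive_perception_interface
```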

We created an interface for each of these steps in the `interactive_perception_interface` package. To implement one of the steps, please take a look at the provided examples:

- static_segmentation_example
- push_point_example
- push_point_example2

Of course, feel free to add additional functions in your implementations of the steps, but please stay consistent with the interface.
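
For a hedged picture of what such an example boils down to, the sketch below follows the standard pluginlib pattern and the hypothetical `PushPointInterface` sketched above; the class name `CentroidPushPoint` and its naive centroid strategy are ours, not the actual code of `push_point_example`:

```cpp
// Hypothetical plugin implementation (all names assumed; see the real
// push_point_example for the actual code).
#include <pluginlib/class_list_macros.h>
#include <ros/ros.h>
#include <interactive_perception_interface/push_point_interface.h>  // assumed header

namespace my_lab_plugins
{
class CentroidPushPoint : public interactive_perception_interface::PushPointInterface
{
public:
  pcl::PointXYZ findPushPoint(
      const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& scene)
  {
    // Deliberately naive strategy: push at the centroid of the cloud.
    pcl::PointXYZ p(0.0f, 0.0f, 0.0f);
    if (scene->points.empty())
      return p;
    for (size_t i = 0; i < scene->points.size(); ++i)
    {
      p.x += scene->points[i].x;
      p.y += scene->points[i].y;
      p.z += scene->points[i].z;
    }
    p.x /= scene->points.size();
    p.y /= scene->points.size();
    p.z /= scene->points.size();
    ROS_INFO("CentroidPushPoint: selected push point (%.3f, %.3f, %.3f)",
             p.x, p.y, p.z);
    return p;
  }
};
}  // namespace my_lab_plugins

// Fuerte-era registration macro; newer ROS distributions use
// PLUGINLIB_EXPORT_CLASS(type, base_type) instead.
PLUGINLIB_DECLARE_CLASS(my_lab_plugins, CentroidPushPoint,
                        my_lab_plugins::CentroidPushPoint,
                        interactive_perception_interface::PushPointInterface)
```

In addition to the registration macro, a pluginlib plugin has to be listed in a plugin description XML file that the package manifest exports, so that the class loader can discover it.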

## Communication between different steps/interfaces

To communicate between different steps, users can rely on the plugin_manager package, since all the steps share the same memory. The interfaces were designed so that the output of one step is the input of the next. However, feel free to extend the interfaces if you have a good reason for it!

If different communication between the steps is needed, users can also use the standard ROS communication tools by implementing publishers/subscribers or services in the respective implementations of the steps.
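
For example, a step implementation could expose its result on a topic with the standard ROS publisher pattern; the class name and topic below are assumptions for illustration:

```cpp
// Hypothetical sketch: a step that also publishes its result on a ROS topic
// so any node inside or outside IPL can subscribe to it (names assumed).
#include <ros/ros.h>
#include <geometry_msgs/PointStamped.h>

class PublishingPushPoint
{
public:
  PublishingPushPoint()
  {
    ros::NodeHandle nh;
    // Queue size of 1: subscribers only ever need the newest result.
    pub_ = nh.advertise<geometry_msgs::PointStamped>("push_point", 1);
  }

  void publishResult(const geometry_msgs::PointStamped& point)
  {
    pub_.publish(point);
  }

private:
  ros::Publisher pub_;
};
```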

## Questions?

In case of any problems or questions, please feel free to contact us: