NeuTouch Summer School 2021
Contour following with iCub

In this exercise, we challenge you to design a controller to follow a contour using the right index fingertip of the iCub humanoid robot in simulation. The simulation platform is Gazebo, where both the robot and suitable tactile sensors are simulated. We provide a full environment to execute the exercise via a Docker image.

Summary

  • How to install the software
  • How to use the simulation environment
  • How to pause the simulation
  • How to change the contour
  • How to code and build the controller
  • Useful tips

How to install the software

  1. Clone this repository:
    git clone https://github.com/event-driven-robotics/neutouch_summer_school_contour.git
  2. Pull the docker image:
    docker pull 2103simon/contour_following:latest
  3. Create a docker container (using an NVIDIA GPU):
    cd neutouch_summer_school_contour
    xhost +
    docker run -it --name contour_following \
               -e DISPLAY=$DISPLAY \
               -v /dev/dri:/dev/dri \
               -v $PWD:/neutouch_summer_school_contour \
               -v /tmp/.X11-unix:/tmp/.X11-unix \
               --runtime=nvidia \
               -e NVIDIA_DRIVER_CAPABILITIES=graphics \
               2103simon/contour_following:latest

    Note: it is important to cd inside the cloned repository neutouch_summer_school_contour in order to create the container successfully using the command above (otherwise $PWD will not contain the correct path).

If you have problems running the command due to NVIDIA-related issues, you might want to have a look at these instructions.

Should you need to attach to the container you have created at any time, you can always use the following command (this comes in handy if you need to open more than one terminal inside the container):

docker exec -it contour_following bash

If for any reason the container is not running (you will receive an error in that case), it can be started again using:

docker start contour_following

How to use the simulation environment

  1. Open one terminal inside the container and run the YARP server using yarpserver --write
  2. Open a second terminal inside the container and run the YARP manager using yarpmanager:
    1. Select the application iCub_Contour_following
    2. Press the green button Run all
  3. After a few seconds, you should see the simulation environment.

Should you need to stop the environment, you can use the red button Stop all. If Gazebo does not close after a while, you can kill it using killall -9 gzserver gzclient (even from outside the Docker container).

How to pause the simulation

If you need to pause the simulation, e.g. to save resources, press the backspace key (while the focus is on the Gazebo window) or press the pause button in the Gazebo window.

How to change the contour

We provide several contours that you can use in the exercise; they are listed here. To change the contour, proceed as follows:

  1. Visualize the contour to see its shape by opening the .STL mesh file in your browser. E.g. for the shape circle_2_5d, visualize circle_2_5d.stl

  2. Stop the simulation if running

  3. Run the following from within the container:

   cd /usr/local/src/icub_haptic_exploration_environment/build
   nano ../environment/worlds/he_scenario.sdf
  4. Change line 40 to model://cf_<name> where <name> is e.g. circle_2_5d

  5. Change line 51 to to_be_followed::<name>::<name>_root_link

  6. Save and close nano

  7. Run make install

After that, you can restart the simulation environment and play with the new shape.


How to code and build the controller


We provide a starting point for you in the C++ file contour_following.cpp. The code will initialize the iCub Cartesian controller that you can use to send 6D pose (or velocity) references (both Cartesian position and orientation) to the right index fingertip.

You can edit the source file locally from your OS using your favourite editor, as the repository has been cloned outside the Docker container. Building the code, instead, must be done inside the container, as follows.

Open a terminal inside the container and run:

cd /neutouch_summer_school_contour
mkdir build
cd build
cmake ../
make
  • The resulting executable contour_following can be run using ./contour_following. Please first run the simulation environment, otherwise the executable will not be able to connect to the robot.
  • The module will first close all the fingers, except the right index and thumb, and then will move the right hand to an initial pose. After that, the code within ContourFollowingModule::updateModule() will be executed periodically.

Useful tips


Code structure


The code is implemented as a standalone class ContourFollowingModule:

  • The module gets configured within the method ContourFollowingModule::configure().

  • A periodic method, ContourFollowingModule::updateModule(), is called automatically every ContourFollowingModule::getPeriod() seconds. This is the place where you can add your code.

Bear in mind that the method is called periodically and its local variables go out of scope at the end of each run. Should you need to keep any data across updates, you will need to store it in a class member variable.
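
For reference, below is a minimal sketch of this structure, assuming the standard yarp::os::RFModule API (the member variable update_counter is hypothetical and only illustrates persistent state; the actual contour_following.cpp may differ):

#include <yarp/os/RFModule.h>
#include <yarp/os/ResourceFinder.h>

class ContourFollowingModule : public yarp::os::RFModule
{
    // Data stored in member variables survives across calls to updateModule()
    int update_counter = 0;

public:
    bool configure(yarp::os::ResourceFinder &rf) override
    {
        // One-time setup (drivers, ports, initial pose) goes here
        return true;
    }

    double getPeriod() override
    {
        // updateModule() is called every 0.01 seconds in this sketch
        return 0.01;
    }

    bool updateModule() override
    {
        // Local variables are re-created at every call;
        // use member variables (e.g. update_counter) for persistent state
        update_counter++;
        return true;
    }
};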


Sensors input


The simulated sensors reproduce the behavior of the tactile sensors present on the fingertips of the iCub humanoid robot. Each fingertip is equipped with 12 taxels that provide a measure of the pressure exerted on them.

  • Each taxel is associated with an ID (see the fingertip taxel layout figure).

  • You can visualize the pressure of the taxel with ID <ID> as follows, provided that the contour_following module is running. From within the Docker container, run:

     yarpscope --remote /taxels_output:o --index <ID>

  • You can access the pressure of each taxel as follows:

    ...
    // This is an excerpt of the contour_following.cpp file
    ...
    bool updateModule()
    {
        iCub::skinDynLib::skinContactList *input = skinEventsPort.read(false);

        if (input != nullptr)
        {
            // input is a std::vector of iCub::skinDynLib::skinContact contacts
            for (const iCub::skinDynLib::skinContact& skin_contact : *input)
            {
                // Each contact might contain several taxel activations
                // However, in the current implementation a contact always contains a single taxel

                // To get the ID of the taxel use
                const int taxel_id = skin_contact.getTaxelList()[0];

                // To get the pressure associated with it use
                const double pressure = skin_contact.getPressure();
            }
        }

        return true;
    }
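
As a follow-up, here is a minimal sketch of a helper that extracts the most activated taxel from a skinContactList; the function name and signature are hypothetical and not part of the provided code:

    #include <iCub/skinDynLib/skinContactList.h>

    // Returns the ID of the taxel with the highest pressure, or -1 if there is no contact.
    // A hypothetical helper, not part of contour_following.cpp.
    int findMostActivatedTaxel(const iCub::skinDynLib::skinContactList &contacts)
    {
        int best_id = -1;
        double max_pressure = 0.0;

        for (const iCub::skinDynLib::skinContact &skin_contact : contacts)
        {
            if (!skin_contact.getTaxelList().empty() && skin_contact.getPressure() > max_pressure)
            {
                max_pressure = skin_contact.getPressure();
                best_id = skin_contact.getTaxelList()[0];
            }
        }

        return best_id;
    }

Such a helper could be called from updateModule() after reading from skinEventsPort, storing the result in a class member variable if it needs to persist across updates.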
    

Robot control


In order to move the fingertip of the robot's right index finger, you will be using the iCub Cartesian Interface (see its high-level description and API).

The controller is accessible within the ContourFollowingModule::updateModule() using the class member variable cartControl of type ICartesianControl*.

  • the initial pose of the hand is commanded within ContourFollowingModule::configure():
bool configure(ResourceFinder &rf)
{
   ...
   yarp::sig::Vector x0{-0.4, 0.13, -0.09};
   ...
   cartControl->goToPoseSync(x0, orientation_0, 3.0);
   cartControl->waitMotionDone(wait_ping, wait_tmo);
   ...
}
  • the responsiveness of the controller (i.e. how fast it will react to the references you send) is set using the method setTrajTime(). The smaller the trajectory time, the faster the response. The default trajectory time for the exercise is set within ContourFollowingModule::configure():
bool configure(ResourceFinder &rf)
{
  ...
  cartControl->setTrajTime(1.0);
  ...
}
  • the reference frame that is commanded by the controller is usually the iCub hand palm (check the figure here). We moved the reference frame to the right index fingertip, so that you can command its pose directly. Bear in mind, however, that the frame orientation has not been changed.

Useful methods we suggest checking out in the API are:

  • goToPoseSync which moves the end-effector to a given 6D pose (and does not return until the motion is completed)
  • goToPose same as above, but does not wait (useful for streaming commands to the controller)
  • getPose which gets the current position and orientation of the end-effector, i.e. the fingertip

The commands goToPose[Sync] require, as second argument, the orientation of the frame as a 4-dimensional yarp::sig::Vector containing the axis/angle representation of the rotation. We provide a suitable configuration for the orientation in the variable orientation_0 (a class member variable) so that you can focus on deciding the position of the fingertip.
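
As an illustration, here is a minimal sketch of how position references could be streamed from within ContourFollowingModule::updateModule(), assuming the cartControl and orientation_0 class members described above (the step size along y is hypothetical and only for illustration):

bool updateModule()
{
    yarp::sig::Vector x(3), o(4);

    // Read the current fingertip position and orientation
    if (cartControl->getPose(x, o))
    {
        // Shift the desired position slightly along +y, keeping the provided orientation
        x[1] += 0.001;

        // Stream the new reference without waiting for the motion to complete
        cartControl->goToPose(x, orientation_0);
    }

    return true;
}

A real controller would compute the position offset from the taxel pressures read on skinEventsPort rather than using a fixed step.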
