
Pepper Object Detection - Cognitive Robotics Project Work


This is a repository created for the project work related to the "Cognitive Robotics" subject at the University of Salerno.


Group Members


  • Salvatore Ventre
  • Vincenzo Russomanno
  • Giovanni Puzo
  • Vittorio Fina

Problem Description

Implement a ROS architecture that enables Pepper to recognize the objects in a scene and to describe the scene itself by listing the different objects it has seen. The goal is, therefore, to make the robot look around, perform object detection, and say aloud what it has managed to recognize.

Software Requirements

This project relies on the ROS (Robot Operating System) library, in particular the Melodic distribution. Moreover, you need the following Python packages, which can be installed through the command pip install 'package_name'==version:

tensorflow==2.2
pattern==3.6
opencv-python==4.4.0

Project Architecture

The implemented ROS architecture was designed to have 4 fundamental nodes to carry out the assigned task:

  • Node 1: Pepper acquires the video stream from its camera.
  • Node 2: Pepper moves its head to three different positions (front, right, and left).
  • Node 3: Object Recognition Module.
  • Node 4: Pepper says what it saw, specifying where it saw each object.
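As an illustration of how Node 4 might turn the collected detections into speech, here is a minimal sketch. The function name, input format, and sentence template are our own assumptions for this example, not the actual node interface:

```python
from collections import defaultdict

def describe_scene(detections):
    """Build the sentence Pepper speaks from (label, position) pairs.

    `detections` is a list of tuples such as ("chair", "left");
    the exact message format here is an illustrative assumption.
    """
    by_position = defaultdict(list)
    for label, position in detections:
        by_position[position].append(label)
    parts = []
    # Report positions in the order the head was moved: front, left, right
    for position in ("front", "left", "right"):
        if by_position[position]:
            objects = " and ".join(by_position[position])
            parts.append("on the %s I saw %s" % (position, objects))
    if not parts:
        return "I did not recognize any object."
    return "Looking around, " + ", ".join(parts) + "."

print(describe_scene([("chair", "left"), ("person", "front")]))
# → Looking around, on the front I saw person, on the left I saw chair.
```

Grouping detections by head position keeps the spoken description short, with one clause per direction.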

Object Detection & Recognition

We chose one of the models pre-trained on the COCO Dataset that are available for this kind of task. Among all those available here, we picked the one that offered the best trade-off between processing time and score in terms of COCO mAP. In the end, the model chosen was efficientdet_d1_coco17_tpu-32. Below we can see the details of the model:

Model name                      Speed (ms)   COCO mAP   Outputs
efficientdet_d1_coco17_tpu-32   54           38.4       Boxes
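The model outputs boxes together with class IDs and confidence scores, and low-confidence hits are typically discarded before the results are spoken. A minimal sketch of such post-processing, where the threshold value and the tiny label-map excerpt are illustrative assumptions rather than the project's actual values:

```python
def filter_detections(class_ids, scores, label_map, threshold=0.5):
    """Keep only detections whose score reaches the threshold,
    mapping numeric COCO class IDs to human-readable labels."""
    kept = []
    for class_id, score in zip(class_ids, scores):
        if score >= threshold and class_id in label_map:
            kept.append((label_map[class_id], score))
    return kept

# Tiny excerpt of the COCO label map, for illustration only
COCO_LABELS = {1: "person", 62: "chair", 64: "potted plant"}

print(filter_detections([1, 62, 64], [0.91, 0.42, 0.77], COCO_LABELS))
# → [('person', 0.91), ('potted plant', 0.77)]
```

Raising the threshold trades recall for fewer false positives in the spoken description.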

Clone the Repository

First of all, it is necessary to clone the repository; we advise doing it in the home directory, by running these commands:

cd ~
git clone https://github.com/dev-guys-unisa/ContestCognitiveRobotics2020

Getting Started as a Noob

We thought of everyone: anyone should be able to see Pepper in action. The utils/script folder contains a script called noobsetup.sh that must be launched with the command ./noobsetup.sh (after moving into the script folder). At this point, have a coffee and wait for the script to finish.

Once it has finished, go back to the project directory and re-source the repo so that the launch file can be run. The steps are:

pwd # ~/ContestCognitiveRobotics2020
cd utils/script
./noobsetup.sh
# Have a coffee now and wait 'til the end
cd ../..
source devel/setup.bash

We are now ready to call the launch file and connect to Pepper. Jump to the Pepper Launch section.

Please remember that the repository must be cloned in the home directory for the script to work; otherwise, follow the Pro Guide.


Getting Started as a Pro

To use this architecture, you need the Python pynaoqi libraries, which can be obtained by running a few simple commands from the terminal. Here are the startup operations to perform before launching Pepper via roslaunch.

Python NaoQi Download & Setup

First, add the repository key and install the required ROS Melodic packages:

sudo apt-key adv --keyserver 'hkp://keyserver.ubuntu.com:80' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654
sudo apt install -y ros-melodic-octomap ros-melodic-octomap-msgs ros-melodic-rgbd-launch ros-melodic-camera-info-manager-py

Now we can download the libraries needed to connect to Pepper in our workspace. So we need to download the Python package for NaoQi and export the necessary paths pointing to it.

cd ~ # Go to the home directory
wget https://community-static.aldebaran.com/resources/2.5.10/Python%20SDK/pynaoqi-python2.7-2.5.7.1-linux64.tar.gz
tar xf pynaoqi-python2.7-2.5.7.1-linux64.tar.gz
rm pynaoqi-python2.7-2.5.7.1-linux64.tar.gz
D=$(realpath pynaoqi-python2.7-2.5.7.1-linux64)

This operation allows us to use the PyNaoQi SDK to develop the nodes.
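The reason the SDK becomes importable is that Python resolves imports by scanning sys.path, which is seeded from PYTHONPATH (the variable extended in the configuration step below). A self-contained illustration with a throwaway module standing in for the real naoqi package:

```python
import os
import sys
import tempfile

# Create a throwaway directory containing a fake module,
# playing the role of the pynaoqi site-packages directory.
pkg_dir = tempfile.mkdtemp()
with open(os.path.join(pkg_dir, "fake_naoqi.py"), "w") as f:
    f.write("GREETING = 'hello from the SDK path'\n")

# Appending to sys.path at runtime has the same effect as
# extending PYTHONPATH before starting the interpreter.
sys.path.append(pkg_dir)
import fake_naoqi

print(fake_naoqi.GREETING)
```

Without the path extension, `import naoqi` would fail with an ImportError, which is the most common symptom of a mis-set PYTHONPATH at this stage.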

At this point we need to build the project, which is done with the following commands:

cd ~/ContestCognitiveRobotics2020
catkin build
# Now wait 'til the end
source devel/setup.bash

How to launch a Demo

Here are some fundamental indications to launch a demo of the developed architecture.

Config file setup

Once the build has completed, it is necessary to extend the setup.bash file before you can start the Pepper bringup. This can be done by launching the two commands below from the terminal, making sure that the temporary variable $D is correctly set and points to the previously downloaded pynaoqi package:

cd ~/ContestCognitiveRobotics2020
echo "export PYTHONPATH=\${PYTHONPATH}:$D/lib/python2.7/site-packages" >> devel/setup.bash  # double quotes: $D expands now, \${PYTHONPATH} when the file is sourced
echo "export LD_LIBRARY_PATH=\${LD_LIBRARY_PATH}:$D/lib" >> devel/setup.bash  # LD_LIBRARY_PATH on Linux (DYLD_LIBRARY_PATH is the macOS equivalent)
source devel/setup.bash

Recall that it is necessary to point these two variables at the downloaded NaoQi SDK.

Pepper Launch

Now it is possible to launch a live demo of Pepper's behavior. To make things easier to run, we decided to integrate the pepper_bringup command into a launch file built for our purpose, so as to avoid endlessly opening and spawning terminal windows. The only thing to do, in fact, is to run the following command:

roslaunch pepper_launch pepper.launch pip:=*ipaddress* nao_ip:=*ipaddress*

As described in the related launch package, this command takes care of launching all the nodes needed to run the demo without problems; every node of the architecture described above will then be started.

Default values set in the Pepper launch file:

Arg Name   Default Value   Used For
pip        10.0.1.230      animated_say
nao_ip     10.0.1.230      pepper_bringup

Be Careful: replace *ipaddress* with Pepper's IP address.

Hint: use the tab key to auto-complete commands


Group 18
