funny-things is a repository which contains, among other funny things, a set of modules for easy scripting and execution of non-interactive (pre-programmed) demos. These modules (e.g. `gaze.lua`) are devised to perform a set of actions, including moving, gazing, blinking, face expressions, etc., which can be easily handled through `rpc` commands. Moreover, these calls can be managed directly through bash (`.sh`) scripts, which allow easy synchronization, ordering, and testing of these calls so that they can be predefined for particular demos. Several examples are available in the repo.
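For example, before freezing a demo into a script, individual calls can be tried by hand with the `yarp rpc` client. This is only a sketch: the port names and joint values below follow common iCub conventions and may differ on your setup.

```bash
# Ask the face expression module to look happy
# (port name as on a typical iCub installation):
echo "set all hap" | yarp rpc /icub/face/emotions/in

# Queue a 1.5 s movement of the right arm through ctpService
# (joint values are placeholders):
echo "ctpq time 1.5 off 0 pos (-25.0 20.0 0.0 50.0 0.0 0.0 0.0)" | \
    yarp rpc /ctpservice/right_arm/rpc
```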
- Clone https://github.com/robotology/funny-things
- Compile the project
- On the terminal, `cd` into the repo folder
- Open the `icubDemoScripts` (or `icubDemoScriptsSIM` to work on the simulator) templates, adapt them, and save them as apps.
- Open or create a new script file (`.sh`) and modify or add any commands to suit your demo needs.
- Set the robot environment (start the robot, or run the `iCub_SIM` simulator)
- If they are not running yet, open the emotions and speech applications (e.g. `iCubSpeech`).
- Open the `icubDemoScripts` (`icubDemoScriptsSIM`) app and launch and connect modules.
- On the terminal, go to the scripts folder.
- Run any desired command from the command line as `./<scriptname>.sh <command>`, e.g. using the `tg2.sh` script provided in the repo.
- If new functionality is required, the easiest procedure is to copy an existing `.sh` under another name (`tg2.sh` can be a good starting point) and modify it to suit your needs; a minimal sketch of such a script follows this list.
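For orientation, such a script usually just maps a keyword given on the command line to a sequence of rpc calls. The skeleton below is a hedged sketch, not the contents of `tg2.sh`: the `greet` action, the helper names, and the port names are all illustrative.

```bash
#!/bin/bash
# mydemo.sh - hypothetical skeleton of a funny-things style demo script.

# Helpers wrapping single port writes (port names are assumptions):
speak() { echo "\"$1\"" | yarp write ... /iSpeak; }
smile() { echo "set all hap" | yarp rpc /icub/face/emotions/in; }
wave()  { echo "ctpq time 1.5 off 0 pos (-25.0 20.0 0.0 50.0 0.0 0.0 0.0)" | \
          yarp rpc /ctpservice/right_arm/rpc; }

# Dispatch on the first argument, e.g. ./mydemo.sh greet
case "$1" in
    greet)
        smile
        wave
        speak "Hello there!"
        ;;
    *)
        echo "usage: $0 <command>   (available commands: greet)"
        exit 1
        ;;
esac
```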
By enabling the `CREATE_FUNNYTHINGSAPP` CMake option at configure time, it is possible to build the funny-things Electron app, which allows you to quickly and easily design awesome demos.
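Assuming the standard CMake workflow, the option can be enabled at configure time like so (the `build` directory name is just an example):

```bash
# Configure with the app enabled, then build:
cmake -S . -B build -DCREATE_FUNNYTHINGSAPP=ON
cmake --build build
```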
Here is how to install the dependencies:
```bash
sudo apt update
sudo apt install -y curl
curl -fsSL https://deb.nodesource.com/setup_17.x | sudo -E bash -
sudo apt install -y nodejs
sudo npm install -g npm@latest
```
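Once built, the AppImage must be marked executable before the first launch (standard AppImage behavior; the file name matches the one mentioned below):

```bash
chmod +x funnyThingsApp.AppImage
./funnyThingsApp.AppImage
```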
And here is the `funnyThingsApp.AppImage` in action:
Moreover, it is possible to export the designed demo in the `.funnythings` format, which can be loaded and later modified through the app.
This app is just a higher-level layer that helps the user design the interaction with the robot; under the hood, each action requires specific executables to be running (an example of launching them by hand follows this list):
- Movements/posture -> `ctpService` for the parts involved
- Speak -> `iSpeak`
- Gaze movements -> `gaze.lua`
- Face expressions -> `emotionInterface`
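For instance, for a demo that moves both arms and speaks, the supporting executables could be started by hand roughly as follows. This is a sketch assuming a robot prefix `icub`; adjust parts and names to your setup.

```bash
# One ctpService instance per body part used by the demo:
ctpService --robot icub --part right_arm &
ctpService --robot icub --part left_arm &

# Speech synthesis (opens the /iSpeak input port):
iSpeak &
```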
For more info, see: