A PhD project (ending in 2011) on concept learning and human-robot interaction. Eventually this could be the basis for further community-driven personal robot systems (GPL).


Abstract:
=========

This software is part of the CONCEPT project at the University of Plymouth, under the supervision of Dr Tony Belpaeme.

The team involves:
- Frederic Delaunay, human-robot interaction
- Joachim De Greeff, conceptual modeling

The project is part of Frederic Delaunay's PhD, under the supervision of
Dr Tony Belpaeme. This PhD is expected to end in late 2011.

In the meantime, you are welcome to try the software and submit bug fixes,
as long as you use it for academic purposes. Other uses can only be
granted by Frederic Delaunay (frederic.delaunay at plymouth.ac.uk).
This license restriction applies in conjunction with the GPL license provided
with this software.

More information at http://www.tech.plym.ac.uk/SoCCE/CONCEPT/


Requirements (in this order):
=============================

- Python 2.6
- Blender 2.49b
- on Windows, readline (only if you want to use the readline client; installer:
  http://newcenturycomputers.net/projects/download.cgi/Readline-1.7.win32-py2.6.exe)
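
If you are unsure whether your setup matches, the short sketch below (a
hypothetical helper, not part of this repository) checks the interpreter
version and, on Windows, the availability of the readline module:

    # check_env.py -- hypothetical helper, not shipped with this software.
    import sys

    if sys.version_info[:2] != (2, 6):
        print("warning: Python 2.6 expected, found %d.%d" % sys.version_info[:2])

    if sys.platform == "win32":
        try:
            import readline  # only needed for the readline client
        except ImportError:
            print("warning: readline not found (see the installer URL above)")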


Running the software:
=====================

Windows:
--------
Follow step 2.2 below using Blender; the file to load is in HRI/face/blender/.
Then run start_face.bat


On Unix-like systems (Linux, etc.):
-----------------------------------

1) Open a shell and change into the directory containing this file.

2) Set the environment variables:

  2.1) Load the environment variables (mainly intended for development):

       $ . source_me_to_set_env.sh
       or, using bash:
       $ source source_me_to_set_env.sh

  2.2) Open the blend file to generate a standalone version for your platform:

       $ edit_face

       Use the Blender menu: File > Save Game as Runtime, and save it as: "lightHead"

3) Start the facial animation system with:

   $ ./start_face.sh

   By default the graphical interface runs full-screen; use the -w option for
   windowed mode.


You can now connect and start setting Action Units according to the protocol definition (see the sketch below).
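
As an illustration only, a minimal client might look like this; the host,
port and command syntax are placeholders, since the real values come from
the protocol definition shipped with this software:

    # au_client.py -- illustrative sketch; host, port and the wire format
    # below are assumptions, not the actual protocol of this software.
    import socket

    HOST, PORT = "localhost", 4242   # placeholder values

    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((HOST, PORT))
    # Hypothetical command setting Action Unit 12 to intensity 0.8;
    # replace with the real commands from the protocol definition.
    s.send("AU 12 0.8\n")
    s.close()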


Contact: frederic.delaunay@plymouth.ac.uk
=========================================