Additional instructions


These instructions are additional and not recommended for most users.

It is possible to adjust the robot setup as needed by modifying the underlying Ikaros framework (https://github.com/ikaros-project/ikaros), which includes modules for communication over YARP, MIDI, and sockets.

It is also possible to connect to the Wizard of Oz setup using an HTTP request.

Trigger a recorded motion using HTTP requests

This can be done by making an HTTP request to the Ikaros web server.

http://127.0.0.1:8000/command/SR.trig/X/0/0

X is the sequence number; 0 is the first sequence.

The local IP address 127.0.0.1 can be replaced by the robot's assigned network IP address.
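
As a minimal sketch of how this can be scripted (assuming Python with the requests library and the default Ikaros port 8000; the helper name trigger_sequence is hypothetical), the same request can be issued programmatically:

```python
import requests

IKAROS_HOST = "127.0.0.1"  # replace with the robot's assigned network IP if needed
IKAROS_PORT = 8000


def trigger_sequence(sequence: int) -> None:
    """Trigger recorded motion sequence `sequence` (0 is the first sequence)."""
    url = f"http://{IKAROS_HOST}:{IKAROS_PORT}/command/SR.trig/{sequence}/0/0"
    response = requests.get(url, timeout=5)
    response.raise_for_status()  # fail loudly if the Ikaros web server rejected the request


if __name__ == "__main__":
    trigger_sequence(0)  # play the first recorded sequence
```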

Controlling the motors/LEDs/sound directly from external software using HTTP requests

To get direct control of the motors, you need to adjust the internal_control parameter of the SequenceRecorder module in the /Robots/Epi/ExperimentSetup.ikg file. Setting internal_control to 1 disables feedback from the motors, and the sliders in the web UI become active instead. Changing the 6 channels to 1 will activate the sliders instead of the physical movements of the robot head.

You can change a slider value using an HTTP request:

http://localhost:8000/control/SR.positions/X/0/VALUE

X is the channel number:

	0 = Neck tilt
	1 = Neck pan
	2 = Left eye
	3 = Right eye
	4 = Pupil left
	5 = Pupil right
	6 = Left arm joint 1 (from body)
	7 = Left arm joint 2 (from body)
	8 = Left arm joint 3 (from body)
	9 = Left arm joint 4 (from body)
	10 = Left arm joint 5 (from body)
	11 = Left hand
	12 = Right arm joint 1 (from body)
	13 = Right arm joint 2 (from body)
	14 = Right arm joint 3 (from body)
	15 = Right arm joint 4 (from body)
	16 = Right arm joint 5 (from body)
	17 = Right hand
	18 = Body
	19 = Eyes (r)
	20 = Eyes (g)
	21 = Eyes (b)
	22 = Eyes (int)
	23 = Eyes (r)
	24 = Eyes (g)
	25 = Eyes (b)
	26 = Eyes (int)
	27 = Sound

For example, http://localhost:8000/control/SR.positions/0/0/30 will make the robot tilt its head 30 degrees.
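
A minimal Python sketch of the same thing (assuming the requests library; set_position is a hypothetical helper name, not part of Ikaros):

```python
import requests

IKAROS_HOST = "localhost"  # or the robot's assigned network IP
IKAROS_PORT = 8000


def set_position(channel: int, value: float) -> None:
    """Set the slider value of one SR.positions channel (see the channel list above)."""
    url = f"http://{IKAROS_HOST}:{IKAROS_PORT}/control/SR.positions/{channel}/0/{value}"
    requests.get(url, timeout=5).raise_for_status()


if __name__ == "__main__":
    set_position(0, 30)  # channel 0 = neck tilt: tilt the head 30 degrees
    set_position(1, 0)   # channel 1 = neck pan: center the head
```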

Video demonstration: Control.epi.using.http.mov

Using the cameras in the robot's eyes

The camera stream can also be grabbed from external software. There are two ways of doing this; Python sketches for both options follow below.

1. Get the camera stream from the Raspberry Pis mounted in the eyes. The server providing the stream is UV4L. More information about the stream can be found at https://www.linux-projects.org/

Left eye

http://192.168.10.2:8080

Right eye. To use the right eye, a second USB cable needs to be connected to the robot.

http://192.168.30.2:8080

2. Get a snapshot image from the Ikaros system using an HTTP request.

RGB image in data:image/jpeg;base64 format

http://localhost:8000/update?data=Epi.LeftEye%23EYE.RED%2BGREEN%2BBLUE%3Argb

Grayscale image in data:image/jpeg;base64 format

http://localhost:8000/update?data=Epi.LeftEye%23LOW_INTENSITY.OUTPUT%3Agray
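
To consume the UV4L stream from option 1 in code, the sketch below opens it with OpenCV. The stream path /stream/video.mjpeg is an assumption based on common UV4L defaults; check your UV4L configuration for the actual path.

```python
import cv2  # pip install opencv-python

# Left-eye Raspberry Pi (option 1 above). The /stream/video.mjpeg path is an
# assumption based on common UV4L defaults; adjust it to your UV4L configuration.
LEFT_EYE_STREAM = "http://192.168.10.2:8080/stream/video.mjpeg"

cap = cv2.VideoCapture(LEFT_EYE_STREAM)
if not cap.isOpened():
    raise RuntimeError("Could not open the left-eye camera stream")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("Epi left eye", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```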
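
For option 2, here is a hedged sketch of fetching a snapshot from the Ikaros web server and saving it as a JPEG. The exact wrapping of the response depends on the Ikaros version, so the sketch simply searches the response text for the data:image/jpeg;base64 URI instead of assuming a fixed structure:

```python
import base64
import re
import requests

# Left-eye RGB snapshot (option 2 above; URL taken from this page).
SNAPSHOT_URL = ("http://localhost:8000/update?"
                "data=Epi.LeftEye%23EYE.RED%2BGREEN%2BBLUE%3Argb")

resp = requests.get(SNAPSHOT_URL, timeout=5)
resp.raise_for_status()

# The image is embedded as a data:image/jpeg;base64 URI; search for it rather
# than assume a particular response structure.
match = re.search(r"data:image/jpeg;base64,([A-Za-z0-9+/=]+)", resp.text)
if match is None:
    raise RuntimeError("No JPEG data URI found in the response")

with open("left_eye_snapshot.jpg", "wb") as f:
    f.write(base64.b64decode(match.group(1)))
print("Saved left_eye_snapshot.jpg")
```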