Experiments used for the paper
presented at IJCNN 2016 / IEEE WCCI 2016
Diverse, Noisy and Parallel: a New Spiking Neural Network Approach for Humanoid Robot Control
How exactly our brain works is still an open question, but one thing seems clear: biological neural systems are computationally powerful, robust and noisy. Using the Reservoir Computing paradigm based on Spiking Neural Networks, also known as Liquid State Machines, we present results from a novel approach in which diverse and noisy parallel reservoirs, totalling 3,000 modelled neurons, work together while receiving the same averaged feedback. Inspired by the ideas of action learning and embodiment, we use the safe and flexible industrial robot BAXTER in our experiments. The robot was taught to draw three different 2D shapes on top of a desk using a total of four joints. For comparison, the same basic system was also implemented in a serial way. The results show that our parallel approach enables BAXTER to reproduce the trajectories of the learned shapes more accurately than the traditional serial one.
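The parallel scheme above can be sketched in a few lines using a rate-based echo-state reservoir as a lightweight stand-in for the spiking LSMs (the real experiments use the BEE simulator with 3,000 spiking neurons). Everything below is illustrative, not the paper's actual setup: each of several diverse reservoirs receives the same input plus its own independent noise, each trains a linear readout, and the ensemble output is the average of all readouts.

```python
import numpy as np

rng = np.random.default_rng(0)

N_RES, N_NEURONS, N_STEPS = 3, 100, 200  # illustrative sizes, not the paper's

def make_reservoir(n, spectral_radius=0.9):
    """Random recurrent weight matrix rescaled to a given spectral radius."""
    W = rng.standard_normal((n, n)) / np.sqrt(n)
    return W * (spectral_radius / np.max(np.abs(np.linalg.eigvals(W))))

# Diverse reservoirs: each one gets its own recurrent and input weights.
reservoirs = [(make_reservoir(N_NEURONS), rng.standard_normal(N_NEURONS))
              for _ in range(N_RES)]

# Target: a closed 2D trajectory (a circle), driven by a common phase input.
t = np.linspace(0.0, 2.0 * np.pi, N_STEPS)
u = np.sin(t)                                     # same input for every reservoir
target = np.column_stack([np.cos(t), np.sin(t)])  # x, y trajectory to reproduce

readouts, states_all = [], []
for W, w_in in reservoirs:
    x = np.zeros(N_NEURONS)
    states = np.empty((N_STEPS, N_NEURONS))
    for k in range(N_STEPS):
        noise = 0.01 * rng.standard_normal(N_NEURONS)  # independent noise
        x = np.tanh(W @ x + w_in * u[k] + noise)
        states[k] = x
    states_all.append(states)
    # Linear readout trained by ridge regression on the collected states.
    A = states.T @ states + 1e-4 * np.eye(N_NEURONS)
    readouts.append(np.linalg.solve(A, states.T @ target))

# The ensemble output is the average over all readouts -- the same averaged
# signal that would be fed back to every reservoir.
outputs = np.mean([s @ w for s, w in zip(states_all, readouts)], axis=0)
err = np.sqrt(np.mean((outputs - target) ** 2))
print(f"ensemble RMSE on the training trajectory: {err:.4f}")
```

The averaging is the key design point: because every reservoir is driven by the same input and the same averaged feedback, individual reservoirs can be noisy and diverse while the ensemble output stays smooth.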
The trajectories are always closed shapes (otherwise the initial and final values would differ and the signal conditioning would have to be changed).
- The trajectories are generated using a simulated BAXTER robot inside V-REP.
- /VREP_scenes/Baxter_IK_felt_pen_pick-and-place_learning_IJCNN2016.ttt (cell templates come from BEE_Simulator_ArmControl_VREP_trajectories-generator_v1-TEMPLATES.ipynb)
- Training data (output spikes) are generated using the notebook:
- BEE_Simulator_ArmControl_VREP_LSM_DATA-GENERATOR.ipynb (there's also a testing session at the end of the notebook)
- After generating the training data, it is necessary to train the readouts. This is done by:
- With all the readout weights defined, it is possible to verify the system using only the LSMs:
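The readout-training step above boils down to a linear regression from liquid states to target joint trajectories. A minimal sketch, assuming the output spikes are low-pass filtered into continuous states and the readout is fit by ridge regression; the spike matrix, time constants and sine target below are placeholders, not the notebook's real data:

```python
import numpy as np

rng = np.random.default_rng(1)

N_NEURONS, N_STEPS, DT, TAU = 50, 300, 0.001, 0.03  # illustrative values

# Stand-in for the generator notebook's output: a binary (time x neuron)
# matrix of spikes, here just random Poisson-like activity.
spikes = (rng.random((N_STEPS, N_NEURONS)) < 0.05).astype(float)

# Exponentially filter each spike train so the linear readout sees
# smooth liquid states instead of raw spikes.
decay = np.exp(-DT / TAU)
states = np.empty_like(spikes)
x = np.zeros(N_NEURONS)
for k in range(N_STEPS):
    x = decay * x + spikes[k]
    states[k] = x

# Placeholder target trajectory for one joint.
target = np.sin(np.linspace(0.0, 2.0 * np.pi, N_STEPS))

# Ridge regression: w = (S^T S + lambda I)^-1 S^T y
lam = 1e-3
w = np.linalg.solve(states.T @ states + lam * np.eye(N_NEURONS),
                    states.T @ target)

prediction = states @ w
rmse = np.sqrt(np.mean((prediction - target) ** 2))
print(f"readout RMSE: {rmse:.4f}")
```

With real liquid states (which actually encode the input history, unlike the random spikes used here), the same regression yields the readout weights that the verification step then uses in closed loop.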
BEE SNN simulator:
Dynamic Time Warping:
Python scripts in general:
Final IEEE Xplore version:
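Dynamic Time Warping, listed above, is presumably used to score how closely a drawn trajectory matches its target independently of timing differences. A minimal sketch of the classic dynamic-programming formulation (the circle/line trajectories are illustrative only):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(n*m) dynamic-programming DTW between two 2D trajectories."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible warping steps.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# A trajectory compared with a time-shifted copy of itself should score
# far lower than when compared with a different shape.
t = np.linspace(0.0, 2.0 * np.pi, 50)
circle = np.column_stack([np.cos(t), np.sin(t)])
shifted = np.roll(circle, 5, axis=0)
s = np.linspace(0.0, 1.0, 50)
line = np.column_stack([s, s])

print(dtw_distance(circle, shifted))
print(dtw_distance(circle, line))
```

Because DTW warps the time axis, a correctly drawn shape traced at a slightly different speed is not penalized the way a pointwise Euclidean error would penalize it.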