Motor control using inverse model learned via FOLLOW


FOLLOWControl

Code for
Non-Linear Motor Control by Local Learning in Spiking Neural Networks
Aditya Gilra, Wulfram Gerstner; Proceedings of the 35th International Conference on Machine Learning, PMLR 80:1768-1777, 2018.
Earlier preprint at https://arxiv.org/abs/1712.10158.

First, learn the inverse model using the FOLLOW learning scheme introduced earlier in Gilra and Gerstner, eLife 2017;6:e28295 -- see also https://github.com/adityagilra/FOLLOW.

Then use the inverse model to control an arm to reproduce a desired trajectory.

0. Installation

apt-get --assume-yes install python-numpy python-scipy python-matplotlib git python-pip
You also need to install OpenCL; the required packages are GPU-specific. For my GPUs, I need:
apt-get --assume-yes install ocl-icd-opencl-dev nvidia-361 nvidia-settings nvidia-opencl-icd-361 libffi-dev
nengo-ocl and its dependencies should get installed by:
pip install nengo-ocl
Otherwise, see the developer installation instructions at: https://github.com/nengo/nengo-ocl

The learning rate in inverse_diff_ff_robot_nengo_ocl.py is set to 2e-4 (valid for nengo versions >= 2.5.0 at the time of writing -- earlier versions require a higher learning rate of 1e-3; see my notes in the FOLLOW README).
PES_learning_rate_FF = 2e-4
PES_learning_rate_REC = 2e-4

1. Inverse model

Set the GPU using:
export PYOPENCL_CTX=':0'
Then learn the inverse model using FOLLOW learning via motor babbling; the differential feedforward network architecture is used.
nohup python inverse_diff_ff_robot_nengo_ocl.py &> nohup.out &
Other similarly named script files are for other architectures explored in the paper.

These scripts import sim_robot.py and arm*.py to simulate the 'true' arm dynamics.
They save the variables monitored during learning and the final weights in separate files.
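As a rough illustration of what such a 'true' arm plant looks like (the actual equations live in sim_robot.py and the arm*.py files; the parameters below are made up), a 1-link arm under gravity with viscous friction and a torque input, forward-Euler integrated:

```python
import numpy as np

# Illustrative 1-link arm dynamics (in the spirit of arm_1link_gravity.py,
# NOT its actual code): angle theta from the downward vertical, angular
# velocity dtheta, torque input, mass m, length l, friction b.
def arm_1link_step(theta, dtheta, torque, dt=1e-3,
                   m=1.0, l=1.0, b=0.5, g=9.81):
    I = m * l**2                                    # inertia about the pivot
    ddtheta = (torque - b * dtheta - m * g * l * np.sin(theta)) / I
    return theta + dt * dtheta, dtheta + dt * ddtheta

# Released near the bottom with zero torque, the arm swings and settles:
theta, dtheta = 0.2, 0.0
for _ in range(5000):                               # 5 s of simulation
    theta, dtheta = arm_1link_step(theta, dtheta, torque=0.0)
```

During motor babbling, the learning scripts drive such a plant with random torques and use its observed state to train the inverse model.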

2. Inverse model for motor control

Load the pre-learned weights file and a desired trajectory and use them to control the true arm.
First generate the desired trajectory (see settings for 'zigzag' and 'diamond' within the file):
python generate_arm_trajectory_v2.py
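Schematically, a 'zigzag' trajectory is just piecewise-linear interpolation between waypoints at the simulation time step. The waypoints, segment duration, and time step below are illustrative; the actual settings are inside generate_arm_trajectory_v2.py:

```python
import numpy as np

# Illustrative zigzag trajectory: linearly interpolate between 2-D
# waypoints at time step dt (values here are assumptions, not the
# script's actual settings).
dt = 1e-3
waypoints = np.array([[0.3, 0.3], [-0.3, 0.6], [0.3, 0.9], [-0.3, 1.2]])
seg_time = 0.5                                   # seconds per segment
traj = []
for p, q in zip(waypoints[:-1], waypoints[1:]):
    steps = int(seg_time / dt)
    s = np.linspace(0.0, 1.0, steps, endpoint=False)[:, None]
    traj.append(p + s * (q - p))                 # linear interpolation
traj = np.vstack(traj)                           # shape (1500, 2)
```

The resulting array of desired states per time step is what the control script later loads and feeds to the network.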

Then run the control simulation with the differential feedforward network architecture:
python control_inverse_diff_robot_nengo_ocl.py
This script loads the previously learned weights and the desired trajectory (both filenames are set inside the script) into the differential feedforward network, builds some extra feedback architecture to control the arm, then simulates the network together with the true arm and saves the simulation variables.
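The control structure can be caricatured as follows (an illustration of the feedforward-plus-feedback idea, not the network's actual wiring): the learned inverse model supplies a torque for the desired state, and an error-feedback term corrects residual deviations of the true arm:

```python
import numpy as np

# Schematic control step: inverse-model feedforward torque plus
# proportional-derivative error feedback. Gains are illustrative.
def control_step(theta, dtheta, theta_des, dtheta_des, inv_model,
                 k_p=20.0, k_d=2.0):
    u_ff = inv_model(theta_des, dtheta_des)         # inverse-model torque
    u_fb = k_p * (theta_des - theta) + k_d * (dtheta_des - dtheta)
    return u_ff + u_fb

# Exercise the loop on a stand-in 1-link plant with a perfect analytic
# "inverse model" (gravity + friction compensation); values illustrative.
m, l, g, b, dt = 1.0, 1.0, 9.81, 0.5, 1e-3
inv_model = lambda th, dth: m * g * l * np.sin(th) + b * dth
theta, dtheta = 0.0, 0.0
for _ in range(5000):                               # regulate to theta = 0.5
    u = control_step(theta, dtheta, 0.5, 0.0, inv_model)
    ddtheta = (u - b * dtheta - m * g * l * np.sin(theta)) / (m * l**2)
    theta, dtheta = theta + dt * dtheta, dtheta + dt * ddtheta
```

In the paper the feedforward torque comes from the learned spiking inverse model rather than an analytic one, and the feedback pathway is part of the built network.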

Some other files exist for using the forward model for control, but this direction didn't work out and was abandoned.

3. Figures

The figures in the paper require multiple simulation runs and collating the results together. Have a look at the end of input_rec_transform_nengo_plot_figs.py for the function calls that create the figures. Their arguments are the names of the data files to be loaded and plotted. Each filename encodes the architecture and parameters of its run, and must match the name of the data file created by the script for that architecture. The data files are not provided here as they are too large; they can be regenerated by running the various simulations with the parameters embedded in the filenames in input_rec_transform_nengo_plot_figs.py.
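To match figure calls to data files, it can help to parse the parameters back out of a filename. The naming pattern below is a hypothetical example for illustration only; check the actual names used at the end of input_rec_transform_nengo_plot_figs.py:

```python
import re

# Hypothetical helper: extract run parameters embedded in a data filename.
# The "<number>s" duration token and the architecture substring below are
# assumptions, not the scripts' actual naming scheme.
def params_from_name(name):
    params = {}
    m = re.search(r"(\d+(?:\.\d+)?)s(?:_|\b)", name)   # e.g. "..._10000s_..."
    if m:
        params["seconds"] = float(m.group(1))
    if "inverse_diff_ff" in name:
        params["architecture"] = "differential feedforward"
    return params
```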