This repository accompanies the paper "Ring Attractors as the Basis for a Biomimetic Navigation System" by Thomas C. Knowles, Anna Summerton, James G.H. Whiting and Martin J. Pearson.
Published in Biomimetics at https://doi.org/10.3390/biomimetics8050399
This animation illustrates the model presented in the paper. It consists of three spiking Ring Attractors that together track a robot's planar translation. Each ring attractor is sensitive to one component of velocity, and their combined ring states map to (x, y) coordinates to form the green trace. The goal of the system is to follow the ground-truth robot trajectory (blue trace) as closely as possible.
Tracking is supported by sensory data from the robot, synthesised into multisensory 'experiences' by a Predictive Coding Network (PCN). The uncorrected model (green error line) can track the trajectory, but is subject to drift. The PCN-corrected model compensates for drift by recalling these prior experiences and their locations: each Cartesian coordinate maps to a given ring state, and vice versa, providing targets on the rings for corrective input.
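The ring-to-Cartesian mapping described above can be sketched in a few lines. This is an illustrative reconstruction, not the paper's implementation: the ring size, the population-vector decoder, and the choice of three ring axes at 120-degree spacing are all assumptions made for this sketch.

```python
import numpy as np

N = 100  # neurons per ring (hypothetical size)
PREFS = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)  # preferred angles

def decode_angle(rates):
    """Population-vector decode: angular position of the activity bump."""
    return np.angle(np.sum(rates * np.exp(1j * PREFS)))

# Three rings, each sensitive to velocity along one axis; the 120-degree
# spacing of the axes is an assumption made for this sketch.
AXES = [np.array([np.cos(a), np.sin(a)])
        for a in (0.0, 2.0 * np.pi / 3.0, 4.0 * np.pi / 3.0)]

def rings_to_xy(bump_angles, scale=1.0):
    """Map each ring's bump angle to a displacement along its axis and sum,
    giving a Cartesian (x, y) estimate of the robot's position."""
    return sum(scale * theta * axis
               for theta, axis in zip(bump_angles, AXES))
```

With this axis choice the three unit vectors sum to zero, so equal angles on all three rings decode to the origin; displacement on a single ring moves the estimate along that ring's axis only.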
The robot trajectories and the PCN experiences used to trigger corrective input can be found here.
The rat trajectory data used to generate Figure 2 is taken from Sargolini et al. (2006).
Results generated in the course of this study can be found in the Results folder.
- Ubuntu 20.04.4 LTS (native or WSL)
- Python 3 installed
- Jupyter Lab installed
- Download the PCN experience files listed under Data
- Use the environment.yml to create a new virtual environment
- Run the following notebooks:
  - Uncorrected.ipynb
  - Multimodal.ipynb
  - Visual.ipynb
  - Tactile.ipynb
  - Power Consumption.ipynb
- Run Statistics.ipynb to generate the results and accompanying Figures 4, 5 and 6.
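The environment and notebook steps above can be run from a shell roughly as follows. This assumes conda is the environment manager and that Jupyter is provided by the environment; neither is stated explicitly here, and the environment name must be read from the `name:` field of environment.yml.

```shell
# Create the virtual environment from the provided file (assumes conda)
conda env create -f environment.yml

# Activate it -- substitute the "name:" field from environment.yml
conda activate <env-name>

# Launch Jupyter Lab, then open and run the notebooks listed above
jupyter lab
```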
Kyle McDonald, for his helpful Gist, which was built upon to form part of the Ring-to-Cartesian transformation.
Rachael Stentiford, who designed the original Ring Attractor model and virtual environment.