
Welcome to the tracking-plugin wiki!

The tracking-plugin repository contains everything you need to integrate animal tracking for open-field experiments into the Open Ephys system (http://www.open-ephys.org/). We also provide a Bonsai script (http://bonsai-rx.org/) to extract LED positions from a PointGrey camera and stream the tracking data into the Open Ephys GUI. Bonsai runs only on Windows computers, but you can use the Open Ephys GUI on Linux with a different tracking software.

System Overview

In the figure we show a block diagram of the system. The animal needs to be equipped with recording electrodes and two LEDs. The electrophysiology data are recorded with the Open Ephys acquisition board (via the Intan headstage) and sent to the Open Ephys GUI. The tracking data are measured with a PointGrey camera and processed with Bonsai, which sends positions to the Open Ephys GUI. For closed-loop experiments, the Open Ephys GUI controls the Pulse Pal stimulator, whose pulses are also recorded using the Open Ephys I/O board. Synchronization between tracking and electrophysiology is performed by recording the camera shutter events using the Open Ephys I/O board.

Installation

To install the plugin, copy the Tracking folder into the Sources/Plugins folder of the Open Ephys GUI and follow these steps: Installation - Open Ephys GUI

To build the Tracking plugin, follow these instructions (https://open-ephys.atlassian.net/wiki/spaces/OEW/pages/46596122/Plugin+build+files).

If you are using Visual Studio 2013, you might need to manually link two additional libraries in order to build the Tracking Port module, which uses the Windows Sockets library. From the Project Explorer, right-click on the Tracking plugin, select Properties->Linker->Input->Additional Dependencies, and add the following libraries:

Ws2_32.lib
Winmm.lib

Windows binaries

We also provide pre-compiled 64-bit binaries in the Binaries folder. Unzip Open-Ephys-Tracking_win_Releasex64.zip in the Binaries folder and run open-ephys.exe in the open-ephys folder. The Tracking plugin .dll is included, so you should be able to add the Tracking modules to the Open Ephys signal chain. The Bonsai scripts are in the bonsai folder.

Tracking plugin

Tracking Port

The Tracking Port plugin streams tracking data into the GUI. The data arrive as OSC (Open Sound Control) messages, and each packet contains 4 floats: the x and y positions and the width and height of the field. Add and delete sources using the + and - buttons. Set the port and address, and choose the color from the dropdown list of available colors.

Note that the Tracking Port only supports 'localhost' at the moment.
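To check that the Tracking Port is receiving data, you can send a test packet from Python with the python-osc package. This is only a minimal sketch: the port (27020) and address ('/red') match the osc.bonsai defaults described below, so adjust them to whatever you configured in the plugin.

import time
from pythonosc.udp_client import SimpleUDPClient

# Port and OSC address must match the Tracking Port settings (assumed here: 27020, '/red')
client = SimpleUDPClient("127.0.0.1", 27020)

# Each packet carries 4 floats: x, y, width, height
for i in range(100):
    x = 0.5 + 0.4 * (i % 10) / 10
    client.send_message("/red", [x, 0.5, 1.0, 1.0])
    time.sleep(0.02)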

Tracking Visualizer

The Tracking Visualizer displays, in real time, the tracking data received from the Tracking Port (or from any other plugin sending Tracking Data binary events). The available Tracking Data sources are shown in the Sources list box on the left, and multiple selection is allowed. The Clear button clears the path trajectories. Note that clicking Clear does not discard any data, as the data are saved by the Tracking Port plugin.

In the figure we show a simulated spiral-like trajectory.

Tracking Stimulator

The Tracking Stimulator plugin makes it possible to perform tracking-based closed-loop experiments. It was originally designed to stimulate place cells and grid cells inside or outside their firing field with electrical or optogenetic stimulation.

The user can manually draw, drag, resize (double click), copy (ctrl+c), paste (ctrl+v), and delete (del) circles, or can enter the circle parameters (x position, y position, and radius, all from 0 to 1) using the editable labels on the right. Each circle can be deactivated using the ON toggle button.

On the top right, the user can decide which source to track - Input source dropdown list - and which TTL output channel to trigger - Output channel dropdown list - when the current position is within the selected circles.

Stimulation is triggered ONLY when the toggle button in the editor (bottom left) is set to ON.

When the position is within the selected circles, the tracking cue turns red and TTL events are generated on the selected Output channel. There are two operation modes, controlled by the UNIFORM and GAUSS buttons:

  • UNIFORM stimulation: a TTL train with a constant, user-defined frequency fmax is generated while the position is within the selected regions. In this mode, the circles are uniformly colored orange/yellow.

  • GAUSSIAN stimulation: the frequency of the TTL train is Gaussian-modulated. When the position is at the center of a circle the frequency is fmax; when it is on the border of a circle the frequency is sd * fmax (see the sketch below). In the case displayed in the following figure, the frequency at the center is 2 Hz and the frequency on the borders is 0.5 * 2 = 1 Hz. In this mode, the colors of the circles are graded, darker at the center and lighter towards the borders.
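The exact falloff curve is not spelled out here, but the behavior described above can be reproduced with a Gaussian chosen so that the frequency equals fmax at the center of a circle and sd * fmax at its border. The Python sketch below is only an illustration of that idea; the function name and circle parameters are hypothetical and not taken from the plugin source.

import math

def stimulation_frequency(x, y, cx, cy, radius, fmax, sd):
    # Coordinates and radius are normalized to the 0-1 field, as in the GUI.
    d = math.hypot(x - cx, y - cy)
    if d > radius:
        return 0.0  # outside the circle: no stimulation
    # UNIFORM mode would simply return fmax here. For GAUSS mode, pick a
    # Gaussian width so that the frequency is fmax at the center and
    # sd * fmax (with 0 < sd < 1) at the border.
    return fmax * math.exp(math.log(sd) * (d / radius) ** 2)

# Example from the figure: fmax = 2 Hz, sd = 0.5
print(stimulation_frequency(0.5, 0.5, 0.5, 0.5, 0.2, 2.0, 0.5))  # center -> 2.0 Hz
print(stimulation_frequency(0.7, 0.5, 0.5, 0.5, 0.2, 2.0, 0.5))  # border -> 1.0 Hz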

Camera setup

The Flea3 PointGrey camera (https://www.ptgrey.com/) needs to be connected to the workstation via a USB port. (Other PointGrey cameras, including those supported through the Spinnaker SDK, can also be used with Bonsai.)

To download the drivers and set up the camera, download the FlyCapture (or Spinnaker) SDK from https://www.ptgrey.com/support/downloads (select your camera model and OS).

IMPORTANT: the Bonsai FlyCapture package has compatibility issues with FlyCapture SDK versions > 2.11. Install SDK 2.11 from here: https://groups.google.com/forum/#!msg/bonsai-users/Wq2Bo1DnCD8/jb0BfvIVAgAJ

From the FlyCapture software, you can activate GPIO triggers that can be recorded with the Open Ephys board for precise synchronization, and change the frame rate.

Bonsai

The Bonsai scripts that we provide are relatively plug-and-play. Install Bonsai and open the tracking.bonsai script. You will need to install the following libraries with the package manager:

  • Vision Library
  • Vision Design Library
  • PointGrey
  • Osc Library
  • Osc Design Library
  • Scripting Library

When you launch the tracking.bonsai script, you will see the workflow shown in the figure. It processes each camera frame to extract a green and a red LED; if your LEDs have different colors, you can change the Green and Red nodes. The centroids of the largest green and red regions are then packed together and sent as OSC (Open Sound Control) messages.
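For readers who prefer to see the processing as code, the following Python/OpenCV sketch does roughly what the Bonsai workflow does: threshold the frame on a color range and take the centroid of the largest blob. It is only an approximation of the workflow, and the color thresholds and file name are hypothetical values you would need to tune to your LEDs and lighting.

import cv2
import numpy as np

def led_centroid(frame_bgr, lower_bgr, upper_bgr):
    # Keep only the pixels inside the given BGR color range
    mask = cv2.inRange(frame_bgr, lower_bgr, upper_bgr)
    # Find connected regions (OpenCV >= 4 return signature)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    # Centroid of the largest blob, in pixel coordinates
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

frame = cv2.imread("frame.png")  # hypothetical test frame
red = led_centroid(frame, np.array([0, 0, 150]), np.array([100, 100, 255]))
green = led_centroid(frame, np.array([0, 150, 0]), np.array([100, 255, 100]))
print("red LED:", red, "green LED:", green)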

We provide a second script (osc.bonsai) which sets up two OSC ports:

  • RedPort: port=27020, address='/red', ip_address='localhost'
  • GreenPort: port=27021, address='/green', ip_address='localhost'

You can change, add, and remove ports from the right panel, as shown in the next figure.
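If you want to inspect what Bonsai is sending before involving the GUI (close the GUI first, since only one process can listen on a port), a minimal python-osc listener like the one below prints the incoming packets. The port and address match the RedPort defaults above.

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def print_packet(address, x, y, width, height):
    # Each message carries 4 floats: x, y, width, height
    print(address, x, y, width, height)

dispatcher = Dispatcher()
dispatcher.map("/red", print_packet)  # RedPort defaults: port 27020, address '/red'

server = BlockingOSCUDPServer(("127.0.0.1", 27020), dispatcher)
server.serve_forever()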

Press play to start acquiring camera frames and to send them to the Open Ephys GUI.

Pyopenephys python package

We suggest that you record your data using the binary RecordEngine in the Open Ephys GUI. To parse those recordings we provide a Python library called pyopenephys.

To install it, run these commands from the terminal (it requires the quantities, numpy, scipy, xmljson, and matplotlib packages):

cd py-open-ephys

python setup.py install

It takes only a few lines of code to parse a recorded Open Ephys binary file:

import pyopenephys
file = pyopenephys.File(r"path-to-recording-folder") 
# experiment 1 (0 in Python)
experiment1 = file.experiments[0]
# recording 1 
recording1 = experiment1.recordings[0]

analog_signals = recording1.analog_signals
events_data = recording1.events
tracking_data = recording1.tracking

# plot tracking source 1
import matplotlib.pyplot as plt
source_1 = tracking_data[0]
plt.plot(source_1.x, source_1.y)
plt.show()
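If you also need the positions against time (for example, to align them with the camera synchronization TTLs), recent pyopenephys versions expose sample timestamps on the tracking source; the times attribute used below is an assumption to check against your installed version.

# plot x position over time (assumes the tracking source exposes a `times` attribute)
plt.figure()
plt.plot(source_1.times, source_1.x)
plt.xlabel("time (s)")
plt.ylabel("x position")
plt.show()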

References

Our article is available here: http://iopscience.iop.org/article/10.1088/1741-2552/aacf45/meta


@article{1741-2552-15-5-055002,
  author={Alessio Paolo Buccino and Mikkel Elle Lepperød and Svenn-Arne Dragly and Philipp Häfliger and Marianne Fyhn and Torkel Hafting},
  title={Open source modules for tracking animal behavior and closed-loop stimulation based on Open Ephys and Bonsai},
  journal={Journal of Neural Engineering},
  volume={15},
  number={5},
  pages={055002},
  url={http://stacks.iop.org/1741-2552/15/i=5/a=055002},
  year={2018},
}