Ulrich Stern edited this page Sep 6, 2019 · 22 revisions


High-throughput behavior-dependent optogenetic stimulation of Drosophila.


SkinnerTrax tracks the positions of fruit flies (Drosophila) in real time and illuminates the flies depending on their positions and actions. Flies can be genetically manipulated so that such illumination is, e.g., a strong reward for them. SkinnerTrax can hence automatically perform reinforcement learning experiments (it is named after B. F. Skinner); see our paper Learning a Spatial Task by Trial and Error in Drosophila. For more engineering details on SkinnerTrax, see SkinnerTrax: high-throughput behavior-dependent optogenetic stimulation of Drosophila.

Below is a sample screenshot of SkinnerTrax in action, training 20 flies that get rewarded when they enter the circles. The right half of the image shows heatmaps, which are calculated in real time and show the flies' positional preferences.


The good

  • SkinnerTrax was written in pure Python, making the code relatively accessible.
  • SkinnerTrax is fast
    • this was achieved by the use of native libraries (OpenCV, etc.) and careful coding; SkinnerTrax's speed should be about the same as that of well-written native (e.g., C++) code.
    • this allows a large number of flies to be handled on a single machine (high throughput). E.g., one of our tracking machines uses 24 cameras tracking 2 flies per camera at 7.5 fps and 320x240 pixels, and another machine uses 4 cameras tracking 20 flies per camera at 7.5 fps and 1280x720 pixels. The load on our i7 machines during such experiments is typically quite low when writing MJPEG-encoded videos (as the experimental record) -- e.g., only about 13% total CPU on an i7-4930K machine in the above 24-camera case -- but the machines become more highly loaded when using the better-compressing H.264. So the throughput numbers given here could easily be increased by using MJPEG or by skipping video records.
    • this also makes the tracker more likely to actually achieve real-time response on "regular" (non-real-time) machines. E.g., we run it on Ubuntu machines and analyzed 84002 reward events from 36 flies in 12 videos recorded at 7.5 fps; for each reward event, we detected the LEDs to be on in the video in the frame immediately following the frame in which the stimulation rule was triggered. So for each of the 84002 reward events analyzed, SkinnerTrax reacted in less than about one frame time (1/7.5 s ≈ 133 ms) despite running on a non-real-time machine. (The corresponding experiments were performed using 4 cameras at 1280x720 pixels and H.264, which resulted in about 100% CPU for each of the 4 dual-thread-capable i7 cores on the machine.)
    • compared to Ctrax 0.5.6, a popular non-real-time tracker that is also Python- and OpenCV-based, SkinnerTrax was about 8x faster on one of our sample videos. (SkinnerTrax can also be used on prerecorded videos; unlike Ctrax, however, it was not designed to track more than one fly per arena.)
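The frame-level latency check described above can be sketched as follows; the list-of-frame-numbers layout is a hypothetical stand-in for the actual experiment records, not the real SkinnerTrax analysis code:

```python
def max_reaction_frames(trigger_frames, led_on_frames):
    """Worst-case delay (in frames) between a stimulation rule triggering
    and the LEDs being visibly on in the recorded video."""
    return max(on - trig for trig, on in zip(trigger_frames, led_on_frames))

def reaction_bound_ms(delay_frames, fps):
    """Latency upper bound implied by a delay of delay_frames at fps."""
    return delay_frames * 1000.0 / fps

# e.g., if the LEDs were on in the very next frame for every event ...
triggers, led_on = [100, 250, 401], [101, 251, 402]
assert max_reaction_frames(triggers, led_on) == 1
# ... the reaction latency is bounded by one frame time: ~133 ms at 7.5 fps
assert round(reaction_bound_ms(1, 7.5)) == 133
```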
  • SkinnerTrax has several features to make handling many cameras easier
    • the positions of our arenas (and to a lesser extent their sizes) varied by camera; instead of hard-coding this, SkinnerTrax uses template matching to determine arena positions and sizes.
    • SkinnerTrax allows adjustment of settings by camera (e.g., exposure, focus, zoom level, etc.), storing the settings, and automatically uses the saved settings upon startup. Due to the large number of cameras we have in the lab, we used relatively inexpensive USB webcams (Microsoft LifeCam). The code for settings adjustment should work or be close to working for any UVC-compatible webcam (most are), but addresses a few issues that may be LifeCam specific. The USB webcams are identified by the port they are plugged into ("bus info" in the code) since, e.g., camera serial numbers appeared difficult to determine.
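The arena-detection idea can be illustrated with a small pure-NumPy sketch of template matching; SkinnerTrax itself uses OpenCV (cv2.matchTemplate), and the function names and threshold below are assumptions for illustration only:

```python
import numpy as np

def ncc_map(image, template):
    """Normalized cross-correlation of template over image ('valid' positions);
    a slow pure-NumPy stand-in for cv2.matchTemplate with TM_CCOEFF_NORMED."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t * t).sum())
    out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            w = image[y:y + th, x:x + tw]
            wc = w - w.mean()
            denom = np.sqrt((wc * wc).sum()) * tnorm
            if denom > 0:
                out[y, x] = (wc * t).sum() / denom
    return out

def find_arenas(image, template, n_arenas, min_score=0.6):
    """Pick the n_arenas best matches, suppressing the neighborhood of each
    match so the same arena is not reported twice; returns (x, y, score)."""
    res = ncc_map(image, template)
    th, tw = template.shape
    matches = []
    for _ in range(n_arenas):
        y, x = np.unravel_index(np.argmax(res), res.shape)
        if res[y, x] < min_score:
            break
        matches.append((int(x), int(y), float(res[y, x])))
        res[max(0, y - th + 1):y + th, max(0, x - tw + 1):x + tw] = -1.0
    return matches
```

A single arena template can thus locate all arena copies in a camera frame without hard-coding their positions.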
  • SkinnerTrax allows the LED controller hardware to be shared among several trackers and provides low-latency and high-throughput communication with the hardware
    • our hardware is Arduino- or Teensy-based, and only one process can own the communication with such microcontrollers. Since we run a separate SkinnerTrax process for each camera, to share the hardware among cameras, SkinnerTrax includes a TCP server (code) that is run as a separate process, handles the requests from all tracker processes to turn LEDs on (or off), and owns the communication with the microcontroller.
    • low-latency and high-throughput communication is achieved using the following tricks (and careful coding):
      • while the minimum USB latency for the Arduino Uno is about 4 ms, the latency for the Teensy 3.2 is typically less than 0.1 ms (kudos to Paul Stoffregen).
      • we wrote a fast library for the LED controller chip we use in our latest hardware (TLC59711), which can set 8 TLC chips (up to 96 channels) in 0.24 ms using a Teensy 3.2. The TCP server, in turn, on average needs only 0.26 ms to execute a "LED change" request. For hardware under development, we switched to the TLC5971, which may halve these times.
      • TCP connections are kept alive.
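A toy version of the hardware-sharing idea -- one process alone owns the "hardware," and tracker processes send requests over persistent TCP connections -- might look like the sketch below. The request protocol and all names here are invented for illustration; the real server talks to an Arduino or Teensy over USB serial:

```python
import socket
import socketserver
import threading

class LedHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # the connection is kept alive: serve requests until the client disconnects
        for line in self.rfile:
            channel, value = line.decode().strip().split(",")
            with self.server.lock:
                self.server.led_state[int(channel)] = int(value)  # stand-in for hardware I/O
            self.wfile.write(b"ok\n")

class LedServer(socketserver.ThreadingTCPServer):
    """One server process 'owns' the microcontroller; tracker processes send
    newline-terminated 'channel,value' requests over persistent connections."""
    allow_reuse_address = True
    daemon_threads = True

    def __init__(self, addr):
        super().__init__(addr, LedHandler)
        self.led_state = {}           # channel -> value; stands in for the LED hardware
        self.lock = threading.Lock()  # serialize access to the "hardware"

def set_led(conn, channel, value):
    """Client helper: one request/response on an already-open connection."""
    conn.write(f"{channel},{value}\n".encode())
    conn.flush()
    return conn.readline().strip() == b"ok"

# demo: one server, one tracker client reusing its connection for two requests
srv = LedServer(("127.0.0.1", 0))
threading.Thread(target=srv.serve_forever, daemon=True).start()
with socket.create_connection(srv.server_address) as s:
    f = s.makefile("rwb")
    set_led(f, 3, 255)  # LEDs of channel 3 on ...
    set_led(f, 3, 0)    # ... and off again, on the same (kept-alive) connection
srv.shutdown()
```

Keeping the connection open avoids paying the TCP handshake on every LED change, which matters at millisecond-scale latencies.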
  • SkinnerTrax is multithreaded, which makes coding stimulation rules relatively easy
    • while multithreaded programming is hard in general, this complexity is almost fully hidden for the developer of SkinnerTrax stimulation rules. Furthermore, compared to other languages, multithreaded programming is relatively easy in Python.
    • each fly has its own thread, which makes writing a rule that, say, turns the fly's LEDs on after the fly performed a certain action and then turns the LEDs off 250 ms later relatively easy. We estimate that writing such rules without threads would result in much-harder-to-understand stimulation rule code.
    • for example, this code handles the stimulation rule for fly f for "area protocols," where the fly is rewarded with an onT-second pulse when it enters a certain area. The code is quite flexible, however, and can also deliver rewards at a certain rate while the fly is in the area; the required change is as simple as specifying a non-None offT.
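The per-fly-thread idea for an "area" rule can be sketched like this; onT/offT follow the description above, but the class and callback names are hypothetical and this is not the actual SkinnerTrax rule code:

```python
import threading
import time

class AreaRule(threading.Thread):
    """Per-fly rule thread, one instance per fly (sketch). On area entry,
    the fly's LEDs go on for onT seconds; with a non-None offT, pulses
    repeat (onT on, offT off) while the fly stays in the area."""

    def __init__(self, set_led, in_area, onT=0.25, offT=None):
        super().__init__(daemon=True)
        self.set_led, self.in_area = set_led, in_area  # callbacks into the tracker
        self.onT, self.offT = onT, offT
        self.entered = threading.Event()  # set by the tracking thread on area entry
        self.done = threading.Event()

    def run(self):
        while not self.done.is_set():
            if not self.entered.wait(timeout=0.05):
                continue  # no entry event yet; recheck the done flag
            self.entered.clear()
            while not self.done.is_set():
                self.set_led(True)
                time.sleep(self.onT)
                self.set_led(False)
                if self.offT is None or not self.in_area():
                    break  # single pulse per entry
                time.sleep(self.offT)
```

A tracking loop would call rule.entered.set() when the fly crosses into the area; because each fly's timing lives on its own thread, the rule can simply sleep between LED changes instead of maintaining explicit timer state.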
  • SkinnerTrax includes several stimulation rules ("protocols") and various helper functions to make writing stimulation rules easy
    • the "area" protocol (code) was already mentioned in the previous point. Areas can be circles or rectangles.
    • the "move" protocol (code) stimulates whenever the fly is above or below a certain speed.
    • the "choice" protocol (code) stimulates at a certain rate depending on fly position. Unlike the area protocol, the choice protocol does not process area "enter" and "exit" events; it does, however, support, e.g., exponential "interarrival" time distributions for stimulation.
    • the "open-loop" protocol (code) stimulates the two sides of an arena independently of fly behavior, e.g., alternating illumination between sides every 4 minutes.
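The exponential "interarrival" times mentioned for the choice protocol can be sketched as a generic Poisson-process sampler (not the actual SkinnerTrax code):

```python
import random

def next_stim_delay(rate_hz, rng=random):
    """Exponential 'interarrival' delay (seconds); stimulation events then
    form a Poisson process with the given mean rate."""
    return rng.expovariate(rate_hz)

def stim_times(rate_hz, duration_s, seed=0):
    """All stimulation times for one experiment of the given duration."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += next_stim_delay(rate_hz, rng)
        if t >= duration_s:
            return times
        times.append(t)
```

With exponentially distributed gaps, the time since the last stimulation carries no information about when the next one comes, which avoids flies entraining to a fixed stimulation rhythm.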
  • SkinnerTrax includes real-time heatmaps
    • see the screenshot above for an example.
    • while a "proper" analysis of experiments is done by running the analysis code (see Usage) on the files written by SkinnerTrax, real-time heatmaps are quite useful for getting some early results while the experiment is running.
    • the overhead for real-time heatmap calculation was minimized (code).
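One way to keep real-time heatmap overhead low is to accumulate counts on a coarse grid, making the per-frame tracking cost a single increment per fly; a sketch, where the downscale factor and function names are assumptions rather than the actual SkinnerTrax code:

```python
import numpy as np

def make_heatmap(frame_shape, downscale=4):
    """Allocate a downscaled count array; a coarse grid keeps both the
    per-frame update and the memory footprint small."""
    h, w = frame_shape
    return np.zeros((h // downscale, w // downscale), dtype=np.uint32)

def update_heatmap(hm, x, y, downscale=4):
    """Per-frame update: increment the cell containing the fly position."""
    hm[int(y) // downscale, int(x) // downscale] += 1

def heatmap_image(hm):
    """Normalize counts to 0..255 for display; only needed when the GUI
    actually refreshes, not on every tracked frame."""
    m = hm.max()
    return (hm * (255.0 / m)).astype(np.uint8) if m else hm.astype(np.uint8)
```

Deferring the normalization to display time keeps the expensive part out of the tracking loop.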
  • SkinnerTrax writes detailed experimental records comprising multiple files
    • video (.avi): not strictly required for a real-time tracker, but nice to have a full record of the experiment. Newer versions of OpenCV support H.264 (which is a great codec and our default); previously, we used MJPEG.
    • trajectories (.trx)
    • "meta"data (.data): includes info on the stimulation rule, the frame numbers of when the LEDs were turned on and off, etc.
    • heatmaps (.png)

The neutral

  • SkinnerTrax was not designed to track multiple flies per arena. In fact, to reduce spurious detections (due to noise, reflections, etc.), it picks only one fly per arena.
  • We have never tried to use SkinnerTrax for animals other than flies, but see no reason why it should not work well for other animals. Even though the name "fly" is used in the code, the code is not fly specific.

The bad

  • our focus was primarily on fast, correct, highly reliable, and feature-rich code (and on doing experiments), not on making SkinnerTrax easily usable without any Python coding skills
    • the stimulation rules are directly coded in Python, so adjustments require code changes. (We typically save the real-time tracker code with the experiment files to have a record of what code ran the experiment.)
    • while we have used the template matching code (see above) for 4 different chamber (arena) types, it requires changes for a new chamber type.
  • most of SkinnerTrax's features are accessible via command-line options only; the main purpose of the (minimal) GUI is to look at the ongoing experiment, including heatmaps. See Usage for how to run SkinnerTrax.