Trackastra - Tracking by Association with Transformers

Trackastra is a cell tracking approach that links already segmented cells in a microscopy timelapse by predicting associations with a transformer model that was trained on a diverse set of microscopy videos.

Overview

If you are using this code in your research, please cite our paper:

Benjamin Gallusser and Martin Weigert
Trackastra - Transformer-based cell tracking for live-cell microscopy
European Conference on Computer Vision, 2024

Examples

Example result videos: nuclei tracking (deepcell_results.mp4) and bacteria tracking (bacteria_results.mp4).

Installation

This repository contains the Python implementation of Trackastra.

Please first set up a Python environment (with Python version 3.10 or higher), preferably via conda or mamba.

Simple installation

Trackastra can then be installed from PyPI using pip:

pip install trackastra

With ILP support

For tracking with an integer linear program (ILP, which is optional), install Trackastra as follows:

conda create --name trackastra python=3.10 --no-default-packages
conda activate trackastra
conda install -c conda-forge -c gurobi -c funkelab ilpy
pip install "trackastra[ilp]"

Development installation

conda create --name trackastra python=3.10 --no-default-packages
conda activate trackastra
conda install -c conda-forge -c gurobi -c funkelab ilpy
git clone https://github.com/weigertlab/trackastra.git
pip install -e "./trackastra[ilp,dev]"

Notes/Troubleshooting

  • For the optional ILP linking, this will install motile and binaries for two discrete optimizers:

    1. The Gurobi Optimizer. This is a commercial solver, which requires a valid license. Academic licenses are provided for free, see here for how to obtain one.

    2. The SCIP Optimizer, a free and open source solver. If motile does not find a valid Gurobi license, it will fall back to using SCIP.

  • On macOS, installing packages into the conda environment before installing ilpy can cause problems.

  • 2024-06-07: On Apple M3 chips, you might have to use the nightly builds of torch and torchvision, or in the worst case build them from source yourself.

Usage: Tracking with a pretrained model

The input to Trackastra is a sequence of images and their corresponding cell (instance) segmentations.


The available pretrained models are described in detail here.

Tracking with Trackastra can be done via:

Napari plugin

For a quick try of Trackastra on your data, please use our napari plugin, which already comes with pretrained models included.

Python API

All you need are the following two numpy arrays:

  • imgs: a microscopy time lapse of shape time,(z),y,x.
  • masks: corresponding instance segmentation of shape time,(z),y,x.
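
If your images and masks live on disk as one TIF file per timepoint, a minimal sketch for assembling these two arrays (assuming the tifffile package and placeholder imgs/ and masks/ folders) could look like this:

from pathlib import Path
import numpy as np
import tifffile

# Stack per-timepoint TIFs into arrays of shape time,(z),y,x; folder names are placeholders.
imgs = np.stack([tifffile.imread(p) for p in sorted(Path("imgs").glob("*.tif"))])
masks = np.stack([tifffile.imread(p) for p in sorted(Path("masks").glob("*.tif"))])
print(imgs.shape, masks.shape)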

The predicted associations can then be used for linking with several modes:

  • greedy_nodiv (greedy linking with no division) - fast, no additional dependencies
  • greedy (greedy linking with division) - fast, no additional dependencies
  • ilp (ILP based linking) - slower but more accurate, needs motile

Apart from that, there are no hyperparameters to choose :)

Python example:
import torch
from trackastra.model import Trackastra
from trackastra.tracking import graph_to_ctc, graph_to_napari_tracks
from trackastra.data import example_data_bacteria

device = "automatic" # explicit choices: [cuda, mps, cpu]

# load some test data images and masks
imgs, masks = example_data_bacteria()

# Load a pretrained model
model = Trackastra.from_pretrained("general_2d", device=device)

# or from a local folder
# model = Trackastra.from_folder('path/my_model_folder/', device=device)

# Track the cells
track_graph = model.track(imgs, masks, mode="greedy")  # or mode="ilp", or "greedy_nodiv"


# Write to cell tracking challenge format
ctc_tracks, masks_tracked = graph_to_ctc(
      track_graph,
      masks,
      outdir="tracked",
)

You can then visualize the tracks with napari:

# Visualise in napari
napari_tracks, napari_tracks_graph, _ = graph_to_napari_tracks(track_graph)

import napari
v = napari.Viewer()
v.add_image(imgs)
v.add_labels(masks_tracked)
v.add_tracks(data=napari_tracks, graph=napari_tracks_graph)

Fiji (via TrackMate)

Trackastra is one of the available trackers in TrackMate. For installation and usage instructions take a look at this tutorial.

Docker images

Some of our models are available as docker images on Docker Hub. Currently, we only provide CPU-based docker images.

Track within a docker container with the following command, filling in the <VARIABLES>:

docker run -it -v <LOCAL_DATA_DIR>:/data -v <LOCAL_RESULTS_DIR>:/results bentaculum/trackastra-track:<MODEL_TAG> --input_test /data/<DATASET_IN_CTC_FORMAT> --detection_folder <TRA/SEG/ETC>
Example with a Cell Tracking Challenge model:
wget http://data.celltrackingchallenge.net/training-datasets/Fluo-N2DH-GOWT1.zip
unzip Fluo-N2DH-GOWT1.zip
chmod -R 775 Fluo-N2DH-GOWT1
docker pull bentaculum/trackastra-track:model.ctc-linking.ilp
docker run -it -v ./:/data -v ./:/results bentaculum/trackastra-track:model.ctc-linking.ilp --input_test data/Fluo-N2DH-GOWT1/01 --detection_folder TRA

Command Line Interface

After installing Trackastra, simply run in your terminal
trackastra track --help

to build a command for tracking directly from images and corresponding instance segmentation masks saved on disk as two series of TIF files.

Usage: Training a model on your own data

To run an example:

  • clone this repository and go into the scripts directory with cd trackastra/scripts.
  • download the Fluo-N2DL-HeLa dataset from the Cell Tracking Challenge into data/ctc.

Now, run

python train.py --config example_config.yaml

Generally, training data needs to be provided in the Cell Tracking Challenge (CTC) format, i.e. annotations are located in a folder containing one or several subfolders named TRA, with masks and tracklet information (see the sketch below).
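
As a rough orientation, a training folder following the CTC naming convention could look like this (dataset and file names are just examples):

data/ctc/Fluo-N2DL-HeLa/
    01/          # raw images: t000.tif, t001.tif, ...
    01_GT/
        TRA/     # man_track000.tif, ... plus man_track.txt with tracklet information
    02/
    02_GT/
        TRA/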
