Interactive desktop application for labeling eye movement events (saccades, microsaccades, post-saccadic oscillations/PSO) and training U'n'Eye deep learning models for automatic saccade detection.
Built with PyQt6 and pyqtgraph.
- Load eye position data from CSV, MAT (v7.3), or NumPy files
- Interactive time series viewer with synchronized eye position (X+Y overlaid) and velocity plots
- Click-drag to select time regions, press Space/Enter to apply labels
- User-definable event classes with custom names, colors, and keyboard shortcuts
- Right-click on labeled segments to change class
- Double-click to erase a labeled segment
- Undo/redo for all label edits
- Labels auto-save to CSV alongside your data files
- Gaussian smoothing and 5-point median filter toggles
- Full Training: train on all labeled data, save weights
- K-fold Cross-Validation: honest generalization metrics with per-fold loss curves
- Live loss curve visualization (overlaid per fold in CV mode)
- Per-class metrics: F1, precision, recall, and support for each event class
- Option to collapse multi-class labels to binary (saccade vs fixation) for training
- Configurable: learning rate, max epochs, data augmentation, min saccade duration/distance
- "Retrain on All Data" button after CV for final deployment model
- Run predictions with any trained weights file
- Per-class metrics table (F1, precision, recall per class)
- Overall metrics: Cohen's kappa, accuracy, weighted F1
- Onset/offset error histograms
- Confusion browser: navigate trials with prediction errors
- Sidebar with three views: Label, Train, Evaluate
- Keyboard shortcuts: Left/Right arrows for trial navigation, 1-9 for class selection
- Trackpad support: horizontal swipe to pan, vertical scroll to zoom time axis
- Auto-loads last dataset on startup
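The smoothing toggles and the velocity plot rest on standard signal-processing steps. A minimal sketch with NumPy and SciPy (the function name and signature are illustrative, not GazeAnnot's actual API):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d, median_filter

def velocity_trace(x, y, fs=1000.0, gaussian_sigma=None, use_median=False):
    """Compute 2D eye speed (deg/s) from position traces, with optional smoothing."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if use_median:
        # 5-point median filter suppresses impulse noise without blurring saccade edges
        x = median_filter(x, size=5)
        y = median_filter(y, size=5)
    if gaussian_sigma is not None:
        x = gaussian_filter1d(x, gaussian_sigma)
        y = gaussian_filter1d(y, gaussian_sigma)
    # Central-difference derivative, scaled to deg/s by the sampling rate
    vx = np.gradient(x) * fs
    vy = np.gradient(y) * fs
    return np.hypot(vx, vy)
```

A position ramp of 0.001 deg per sample at 1000 Hz yields a constant speed of 1 deg/s, which is a quick sanity check for the scaling.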
```bash
# Install dependencies
conda create -n gazeannot python=3.11 -y
conda activate gazeannot
pip install PyQt6 pyqtgraph numpy scipy pandas mat73 torch scikit-learn scikit-image matplotlib

# Get the code and run
git clone https://github.com/yourusername/gazeannot.git
cd gazeannot
python main.py
```

- Launch: `python main.py`
- Load data: File > Open Data (Ctrl+O) — select your eye X file; the Y file is auto-detected
- Label: Drag on the plot to select a time range, press Space to apply the active class
- Switch classes: Press 1 (Saccade), 2 (Microsaccade), 3 (PSO), or click the buttons
- Navigate: Left/Right arrow keys to browse trials
- Train: Click "Train Model" or switch to Train view via the sidebar
GazeAnnot expects eye position data as:
| Format | Structure |
|---|---|
| CSV | Rows = trials, columns = timepoints. Provide separate X and Y files (e.g., eye_x.csv, eye_y.csv). Y file is auto-detected from X filename. |
| MAT | MATLAB v7.3 files. A variable picker dialog lets you select which variables are X and Y. |
| NPY/NPZ | NumPy arrays. Same structure as CSV (trials x timepoints). |
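For example, a matching pair of X/Y CSV files in the expected layout can be produced with pandas. This is a toy sketch; the filenames are illustrative, and whether GazeAnnot expects header rows is not specified here, so headerless files are written:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Toy data: 3 trials x 500 timepoints of random-walk gaze positions (degrees)
eye_x = rng.standard_normal((3, 500)).cumsum(axis=1) * 0.01
eye_y = rng.standard_normal((3, 500)).cumsum(axis=1) * 0.01

# Rows = trials, columns = timepoints; paired filenames so Y can be auto-detected
pd.DataFrame(eye_x).to_csv("eye_x.csv", index=False, header=False)
pd.DataFrame(eye_y).to_csv("eye_y.csv", index=False, header=False)

loaded = pd.read_csv("eye_x.csv", header=None).to_numpy()
assert loaded.shape == (3, 500)
```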
- Sampling frequency: 1000 Hz (default)
- Units: degrees of visual angle
- Labels: integer per timepoint (0 = unlabeled/fixation, 1 = saccade, 2 = microsaccade, etc.)
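Concretely, a labels array for three trials could be built and saved like this (a sketch; the class indices follow the convention above, and the filename is illustrative):

```python
import numpy as np

n_trials, n_timepoints = 3, 500
labels = np.zeros((n_trials, n_timepoints), dtype=int)  # 0 = unlabeled/fixation

labels[0, 120:160] = 1   # 40 ms saccade (at 1000 Hz, 1 timepoint = 1 ms)
labels[1, 300:312] = 2   # 12 ms microsaccade
labels[0, 160:180] = 3   # PSO immediately after the saccade offset

# One row per trial, one integer per timepoint
np.savetxt("session_labels.csv", labels, fmt="%d", delimiter=",")
```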
When loading data, GazeAnnot searches for label files in the same directory:
- `<prefix>_labels.csv` or `<prefix>_binary_labels.csv`
- `labels.csv` or `binary_labels.csv`
Labels are auto-saved to CSV whenever you navigate between trials.
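That search order could be implemented roughly as follows (an illustrative sketch, not GazeAnnot's actual loader; it assumes `<prefix>` is the data file's stem):

```python
from pathlib import Path

def find_label_file(data_path):
    """Return the first existing label CSV next to a data file, or None."""
    data_path = Path(data_path)
    prefix = data_path.stem
    # Prefix-specific names take priority over the generic fallbacks
    candidates = [
        data_path.with_name(f"{prefix}_labels.csv"),
        data_path.with_name(f"{prefix}_binary_labels.csv"),
        data_path.with_name("labels.csv"),
        data_path.with_name("binary_labels.csv"),
    ]
    for candidate in candidates:
        if candidate.exists():
            return candidate
    return None
```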
| Key | Action |
|---|---|
| Ctrl+O | Open data |
| Ctrl+S | Save labels |
| Ctrl+Z / Ctrl+Shift+Z | Undo / Redo |
| Left / Right | Previous / Next trial |
| 1-9 | Select event class |
| Space / Enter | Apply label to selection |
| Escape | Clear selection |
| G | Toggle Gaussian smoothing |
| M | Toggle median filter |
| Ctrl+1 / Ctrl+2 / Ctrl+3 | Switch to Label / Train / Evaluate view |
| Action | Effect |
|---|---|
| Left-drag | Create selection region |
| Single click | Clear selection |
| Double-click on label | Erase that labeled segment |
| Right-click on label | Change class (context menu) |
| Scroll up/down | Zoom time axis |
| Two-finger swipe left/right | Pan in time |
The `data/` directory contains sample eye-tracking data (54 trials at 1000 Hz) for testing.
If you use GazeAnnot in your research, please cite the U'n'Eye model:
Bellet, M.E., Bellet, J., Nienborg, H., Hafed, Z.M., & Berens, P. (2019). Human-level saccade detection performance using deep neural networks. Journal of Neurophysiology, 121(2), 646-661.
MIT License. See LICENSE.
The bundled U'n'Eye model code is by Bellet et al. — see uneye/LICENSE_NOTICE.md.

