On metrics for measuring scanpath similarity
Ramin Fahimi, Neil D. B. Bruce
https://link.springer.com/article/10.3758/s13428-020-01441-0
This repository contains an API for saliency prediction datasets along with the most common evaluation metrics. The code downloads the required files from the websites of the original dataset publishers.
- Python (2.7, 3.4+)
- Python package manager (pip)
- MATLAB (optional; required for some of the metrics)
- Clone the repository:
  `git clone git@github.com:rAm1n/saliency.git`
  or download a zip of the master branch (master.zip).
- Install the required packages with pip:
  `pip install -r requirements`
- Follow the tutorial in help.ipynb.
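For a quick feel of the workflow before opening the notebook, here is a minimal usage sketch. The module, class, and accessor names (`saliency.dataset`, `SaliencyDataset`, `get()`) and the `'TORONTO'` key are assumptions made for illustration only; help.ipynb documents the actual interface.

```python
# Minimal sketch of loading a dataset (assumed API; see help.ipynb for the real one).
from saliency.dataset import SaliencyDataset  # assumed module/class names

# Instantiating a dataset is assumed to fetch the original publisher's
# files on first use and cache them locally.
dataset = SaliencyDataset('TORONTO')  # assumed dataset key

# Hypothetical accessors for stimuli and recorded scanpaths.
stimuli = dataset.get('stimuli')
scanpaths = dataset.get('sequence')

print(len(stimuli), 'images loaded')
```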
At the moment, the following datasets are covered. I plan to add more and complete this list. Some of these datasets come with other very useful annotations, but given the variety of annotation types, I have decided not to include that external information at this point.
Dataset | Authors | Paper | Extra notes |
---|---|---|---|
TORONTO | Neil Bruce, John K. Tsotsos. | Attention based on information maximization | |
CAT2000 | Ali Borji, Laurent Itti. | CAT2000: A Large Scale Fixation Dataset for Boosting Saliency Research | |
CROWD | Ming Jiang, Juan Xu, et al. | Saliency in Crowd | |
KTH | Gert Kootstra, Bart de Boer, et al. | Predicting Eye Fixations on Complex Visual Stimuli using Local Symmetry | |
OSIE | Juan Xu, Ming Jiang, et al. | Predicting Human Gaze Beyond Pixels | Object level attributes - mouse tracking |
MIT1003 | Tilke Judd, Krista Ehinger, et al. | Learning to Predict where Humans Look | |
LOWRES | Tilke Judd, Fredo Durand, et al. | Fixations on Low-Resolution Images | |
PASCAL-S | Yin Li, Xiaodi Hou, et al. | The Secrets of Salient Object Segmentation | Segmentation masks from VOC10 |
PASCAL-KYUN | Kiwon Yun, Yifan Peng, et al. | Studying Relationships Between Human Gaze, Description, and Computer Vision | Segmentation masks from VOC10 |
SUN09 | Kiwon Yun, Yifan Peng, et al. | Studying Relationships Between Human Gaze, Description, and Computer Vision | Segmentation masks from VOC10 |
SALICON | Ming Jiang, Shengsheng Huang, et al. | SALICON: Saliency in Context | Subset of MSCOCO |
EMOD | S. Fan, Z. Shen, et al. | Emotional Attention | Emotion, object semantic categories, and high-level perceptual attributes |
POET | Dim P. Papadopoulos, et al. | Pascal Objects Eye Tracking (POET) | Segmentation masks from VOC10(TODO) |
Metric | Citation |
---|---|
AUC | Saliency and Human Fixations: State-of-the-Art and Study of Comparison Metrics |
SAUC | SUN: A Bayesian framework for saliency using natural statistics |
NSS | Components of bottom-up gaze allocation in natural scenes |
CC | Pearson's linear correlation coefficient |
KLdiv | |
SIM | |
IG | Information-theoretic model comparison unifies saliency metrics |
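As an illustration of what the fixation-map metrics above compute, the snippet below is a small self-contained sketch of NSS written against plain NumPy arrays. It follows the standard definition (z-score the predicted map, then average it at the fixated pixels) and is not taken from this package's implementation.

```python
import numpy as np

def nss(saliency_map, fixations):
    """Normalized Scanpath Saliency.

    saliency_map : 2-D array of predicted saliency values.
    fixations    : iterable of (row, col) fixation coordinates.
    """
    s = np.asarray(saliency_map, dtype=np.float64)
    # Z-score the prediction so scores are comparable across maps.
    s = (s - s.mean()) / (s.std() + 1e-12)
    rows, cols = zip(*fixations)
    # NSS is the mean normalized saliency at the fixated locations.
    return float(s[np.array(rows), np.array(cols)].mean())

# Toy example: a 2x2 map with one fixation on the brightest pixel.
print(nss([[0.1, 0.9], [0.2, 0.3]], [(0, 1)]))
```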
# | Metric | Origin |
---|---|---|
1 | Euclidean distance | |
2 | Mannan distance | The relationship between the locations of spatial features. |
3 | Eyeanalysis | A simple way to estimate similarity between pairs of eye movement sequences |
4 | Levenshtein distance | Algorithms for defining visual regions-of-interest |
5 | ScanMatch | ScanMatch: A Novel Method for Comparing Fixation Sequences. |
6 | Hausdorff distance | Comparing images using the Hausdorff distance |
7 | Fréchet distance | Computing discrete Fréchet distance |
8 | Dynamic time warp | Using dynamic time warping to find patterns in time series |
9 | Time delay embedding | Simulating human saccadic scanpaths on natural images |
10 | MultiMatch (5 measures) | A Vector-based, Multidimensional Scanpath Similarity Measure. |
11 | Recurrence | Recurrence quantification analysis of eye movements |
12 | Determinism | Recurrence quantification analysis of eye movements |
13 | Laminarity | Recurrence quantification analysis of eye movements |
14 | CORM | Recurrence quantification analysis of eye movements |
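To make the scanpath side concrete, here is a stand-alone sketch of the discrete Fréchet distance (metric 7 above), treating each scanpath as an ordered sequence of (x, y) fixations. It implements the usual Eiter and Mannila dynamic-programming recurrence and is independent of the package's own code.

```python
import numpy as np

def discrete_frechet(p, q):
    """Discrete Fréchet distance between two scanpaths.

    p, q : sequences of (x, y) fixation coordinates.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    n, m = len(p), len(q)
    ca = np.zeros((n, m))

    def dist(i, j):
        return np.linalg.norm(p[i] - q[j])

    # Fill the coupling table row by row (Eiter & Mannila recurrence).
    ca[0, 0] = dist(0, 0)
    for i in range(1, n):
        ca[i, 0] = max(ca[i - 1, 0], dist(i, 0))
    for j in range(1, m):
        ca[0, j] = max(ca[0, j - 1], dist(0, j))
    for i in range(1, n):
        for j in range(1, m):
            ca[i, j] = max(min(ca[i - 1, j], ca[i - 1, j - 1], ca[i, j - 1]),
                           dist(i, j))
    return float(ca[-1, -1])

# Toy example: two three-fixation scanpaths on the same image.
print(discrete_frechet([(0, 0), (5, 5), (10, 0)],
                       [(0, 1), (6, 5), (10, 2)]))
```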
Note: To make things run more smoothly, the scanpaths have already been preprocessed and stored on Dropbox. If you own one of the datasets and do not want your data to be included in this package, please send a short message to fahimi72 At gmail and it will be taken care of. We do not own any of the data; the rights belong to the original publishers of the datasets. Please make sure to cite the appropriate paper if you use them.