In this project I work with eye-tracking data (eye movements) and compare two approaches to detecting patterns in movement trajectories:
- convert the data into strings and detect patterns in the strings,
- represent the data as a set of time sequences, train a convolutional network, and detect patterns using the filters of the first convolutional layer.
Required libraries:
- numpy
- pandas
- keras
prepare_data.ipynb
→ Preprocess the raw data and compute basic features for detecting patterns in trajectories (a minimal sketch follows the feature list):
- movement direction
- movement length
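For orientation, here is a minimal sketch of how these two features could be computed with pandas; the column names `x` and `y` follow the data description below, and the function name is illustrative:

```python
import numpy as np
import pandas as pd

def add_movement_features(df: pd.DataFrame) -> pd.DataFrame:
    """Add direction and length of the movement between consecutive fixations."""
    dx = df["x"].diff()
    dy = df["y"].diff()
    # Direction in radians, measured from the positive x-axis.
    df["direction"] = np.arctan2(dy, dx)
    # Euclidean length of the movement in screen units.
    df["length"] = np.sqrt(dx**2 + dy**2)
    return df
```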
patterns_strings.ipynb
→ Factorize the features and convert the factors into character sequences. Find patterns in the sequences and visualize them (a rough sketch of the idea follows).
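A rough sketch of the idea, assuming the direction feature is binned into eight sectors that are mapped to letters, with a simple n-gram count as the pattern search; the notebook's actual factorization and search may differ:

```python
import numpy as np
from collections import Counter

def to_string(directions, n_bins=8):
    """Factorize directions (radians in [-pi, pi]) into bins encoded as letters."""
    bins = np.linspace(-np.pi, np.pi, n_bins + 1)
    codes = np.digitize(directions, bins[1:-1])  # values 0 .. n_bins-1
    return "".join(chr(ord("a") + c) for c in codes)

def count_ngrams(s, n=3):
    """Count all substrings of length n; frequent ones are pattern candidates."""
    return Counter(s[i:i + n] for i in range(len(s) - n + 1))
```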
patterns_conv.ipynb
→ Convert the data into time sequences and train a network with 1D convolutional layers to detect which task the user is performing. Extract the weights of the first layer and inspect what kinds of patterns they detect (a minimal model sketch follows).
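A minimal Keras sketch of such a network; the layer sizes, the two input features, and the six-class output (one per task below) are assumptions:

```python
from keras.models import Sequential
from keras.layers import Conv1D, GlobalMaxPooling1D, Dense

# Input: variable-length windows of eye-movement features, shape (timesteps, 2).
model = Sequential([
    Conv1D(32, kernel_size=3, activation="relu",
           input_shape=(None, 2), name="conv_1"),
    Conv1D(64, kernel_size=3, activation="relu"),
    GlobalMaxPooling1D(),
    Dense(6, activation="softmax"),  # one class per web-surfing task
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# Kernels of the first layer, shape (kernel_size, n_features, n_filters).
conv1_weights = model.get_layer("conv_1").get_weights()[0]
```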
The demo data consists of eye-tracker observations of users performing routine web-surfing tasks such as:
- reading Wikipedia,
- checking news,
- searching for a route on maps,
- watching YouTube,
- checking a social network (vk),
- or going through Pinterest search results.
While the users performed the tasks, their eye movements were recorded: the `x` and `y` variables denote the gaze position on the screen, and `time` is the time elapsed from the beginning of the recording to the current observation. Position `(0, 0)` is the upper-left corner of the screen.
The data was cleaned: only fixations were kept, and observations where a blink was detected were removed.
A slice of the data:
Examples of patterns detected from the strings:
Comparison of a pattern's coverage in real and permuted data:
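One way such a baseline could be computed: count the sequence positions covered by a pattern, then compare against the average coverage in shuffled copies of the same sequence. This is a simple permutation baseline; the notebook's exact procedure may differ:

```python
import random

def pattern_coverage(s, pattern):
    """Fraction of sequence positions covered by occurrences of the pattern."""
    covered = set()
    for i in range(len(s) - len(pattern) + 1):
        if s[i:i + len(pattern)] == pattern:
            covered.update(range(i, i + len(pattern)))
    return len(covered) / len(s)

def permuted_coverage(s, pattern, n_perm=1000, seed=0):
    """Average coverage over randomly permuted copies of the sequence."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_perm):
        chars = list(s)
        rng.shuffle(chars)
        total += pattern_coverage("".join(chars), pattern)
    return total / n_perm
```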
Highlighting the areas of a sequence that activate a filter in the conv_1 layer:
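A sketch of how those areas could be found, building on the `model` defined above: run the first convolutional layer on a sequence and mark the timesteps where the chosen filter's activation exceeds a threshold (the threshold value here is arbitrary):

```python
import numpy as np
from keras.models import Model

# Sub-model that outputs conv_1 activations; `model` is the network above.
activation_model = Model(inputs=model.input,
                         outputs=model.get_layer("conv_1").output)

def activated_regions(sequence, filter_idx, threshold=0.5):
    """Return timestep indices where the chosen filter responds strongly."""
    acts = activation_model.predict(sequence[np.newaxis])[0, :, filter_idx]
    return np.where(acts > threshold)[0]
```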
Simulating data and plotting heatmaps of gaze positions to understand the filter's properties (filter size = 3):
Plotting the best stimuli for the filter from the simulated data:
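A sketch of the best-stimuli search, reusing the `activation_model` from the previous snippet: generate random feature windows, score each by the filter's peak activation, and keep the strongest ones. The uniform sampling scheme is an assumption:

```python
import numpy as np

def best_stimuli(activation_model, filter_idx, n_samples=10_000,
                 window=3, n_features=2, top_k=20, seed=0):
    """Return the simulated windows that most strongly activate a filter."""
    rng = np.random.default_rng(seed)
    # Random feature windows; real code would sample in the training ranges.
    samples = rng.uniform(0.0, 1.0, size=(n_samples, window, n_features))
    acts = activation_model.predict(samples)        # (n_samples, t, filters)
    scores = acts[:, :, filter_idx].max(axis=1)     # peak response per window
    return samples[np.argsort(scores)[-top_k:]]     # the strongest stimuli
```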
- Katerina Malakhova - taneta
This project is licensed under the MIT License - see the LICENSE.md file for details