[ENH] Blink detection based on pupil size #12808
CC'ing devs who have been involved in eye-tracking: @larsoner @britta-wstnr @mscheltienne @sappelhoff @drammock What do you think about adding an eye-blink detection algorithm for eye-tracking signals?
Yes, if no license is listed then the author would need to re-license their work under BSD-3 or MIT, or give explicit permission for us to translate and re-license the code under BSD-3: #11747 (comment). Another idea 💡: the pypillometry package is MIT licensed and has implemented a blink detection method based on this paper by @smathot. There are also a couple of other blink detection algorithms in there, one of which is based on pupil dropout (same as EyeLink's).
In any case, if folks are up for this, we should probably demonstrate the ability of this algorithm on an open-access dataset. I can recommend a few and share code to read them into MNE.
+1 sounds like a useful tool!
That would be great! Especially considering eye-tracking data from non-SR-Research devices, which may not ship automatic blink locations within the data.
+1
+1
Based on the devs who were tagged, it seems like we are +3 for adding this feature (I am not counting myself in this tally). Those who have not responded, please feel welcome to chime in if you have any votes/comments/concerns. @qian-chu how do you propose we (tentatively) move forward from here?
Hello everyone, we can start with the simplest implementation: adding the function find_blinks with the method 'by_dropout', in which missing-data segments in the pupil channel are considered blinks. This would be useful for other eye trackers that do not have any online parsers. Then we can build up more methods (by slope, the Mathôt method, ...). Does that sound like a good plan to everyone? Note that we will also probably need to add a parameter called dropout_value, because for some eye trackers a 0 indicates missing data, while for others it is NaN.
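To make the 'by_dropout' idea concrete, here is a minimal NumPy sketch. The function name `find_blinks_by_dropout`, its signature, and the `min_gap` merging parameter are all hypothetical (not existing MNE-Python API); it just illustrates how `dropout_value` and gap concatenation could work:

```python
import numpy as np

def find_blinks_by_dropout(pupil, sfreq, dropout_value=np.nan, min_gap=0.1):
    """Treat runs of missing pupil samples as blinks (hypothetical sketch).

    ``dropout_value`` covers trackers that mark missing data as 0 rather
    than NaN; blinks separated by less than ``min_gap`` seconds are merged.
    Returns a list of ``(onset_sample, offset_sample)`` pairs.
    """
    pupil = np.asarray(pupil, dtype=float)
    missing = np.isnan(pupil) if np.isnan(dropout_value) else pupil == dropout_value
    # Edges of the missing-data runs: +1 marks a run start, -1 a run end.
    edges = np.diff(missing.astype(int))
    onsets = np.flatnonzero(edges == 1) + 1
    offsets = np.flatnonzero(edges == -1) + 1
    if missing[0]:
        onsets = np.r_[0, onsets]
    if missing[-1]:
        offsets = np.r_[offsets, missing.size]
    # Concatenate blinks whose gap is shorter than ``min_gap`` seconds.
    blinks = []
    for on, off in zip(onsets, offsets):
        if blinks and (on - blinks[-1][1]) / sfreq < min_gap:
            blinks[-1] = (blinks[-1][0], int(off))
        else:
            blinks.append((int(on), int(off)))
    return blinks
```

For example, with a 100 Hz recording where samples 10-14 and 16-19 are NaN, the two dropouts merge into a single blink spanning samples 10-20.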
Sounds good. You can have arguments for specific methods be provided as kwargs, similarly to what you do for
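A sketch of that kwargs-based dispatch pattern (all names here are hypothetical, not MNE-Python API; the `_by_dropout` body is only a placeholder):

```python
import math

def _by_dropout(pupil, dropout_value=float("nan")):
    """Placeholder detector: return indices of missing pupil samples."""
    if math.isnan(dropout_value):
        return [i for i, p in enumerate(pupil) if math.isnan(p)]
    return [i for i, p in enumerate(pupil) if p == dropout_value]

# Registry of detection methods; new ones (by slope, etc.) slot in here.
_METHODS = {
    "by_dropout": _by_dropout,
}

def find_blinks(pupil, method="by_dropout", **method_kwargs):
    """Dispatch to a detection method, forwarding its options as kwargs."""
    if method not in _METHODS:
        raise ValueError(f"unknown method {method!r}; choose from {sorted(_METHODS)}")
    return _METHODS[method](pupil, **method_kwargs)
```

This keeps the public signature stable while each method documents its own keyword arguments (e.g. `dropout_value` only applies to `'by_dropout'`).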
Just to chip in: the latest version of our blink-detection-and-reconstruction algorithm is implemented here:
If you decide to use it, then ideally it would not be duplicated but rather use datamatrix as a dependency, so that bug fixes don't need to be duplicated. The algorithm is a pretty ad hoc affair that is the result of many years of dealing with edge cases. It works well, but it's not very elegant, and every eye tracker needs its own parameters. (The defaults are for the EyeLink.)
Excellent suggestion. First get the infrastructure and API in place, then add more and more complex methods.
I agree that it would be nice not to have to duplicate bug fixes; at the same time, we need to consider the implications of adding a large dependency (of which the blink algorithm is a minor part) for a very small use case (relative to MNE-Python as a whole). I have one more point: more recent Tobii eye trackers supply "eye openness" data, which tracks the eyelid instead of the pupil.
This is similar to a pupil-size data channel, but (from my limited testing) is rarely (or never?) NaN. Theoretically, this should work even better for detecting blinks. If anyone has experience with this, I'd be curious to hear about it. The algorithms we plan to implement here should also work with this data; we'd just pass a particular channel to the method for annotating blinks.
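Since an eye-openness channel rarely drops to NaN, a dropout-based method would not apply directly; one simple alternative is thresholding with hysteresis. This is a hypothetical sketch (the function name and threshold values are illustrative, not Tobii defaults or MNE API):

```python
import numpy as np

def blinks_from_openness(openness, closed_thresh=2.0, open_thresh=4.0):
    """Sketch: detect blinks on a Tobii-style eye-openness channel.

    Hysteresis thresholding: a blink starts when openness drops below
    ``closed_thresh`` and ends once it rises back above ``open_thresh``.
    Returns ``(onset_sample, offset_sample)`` pairs.
    """
    blinks, onset, in_blink = [], None, False
    for i, val in enumerate(np.asarray(openness, dtype=float)):
        if not in_blink and val < closed_thresh:
            in_blink, onset = True, i
        elif in_blink and val > open_thresh:
            blinks.append((onset, i))
            in_blink = False
    return blinks
```

The two-threshold design avoids flickering in and out of the blink state when the signal hovers near a single cutoff.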
@AlexLepauvre @qian-chu any news on this? EDIT: I have opened a draft to further discuss the potential API: #12946 |
Describe the new feature or enhancement
The original EyeLink blink detection algorithm simply parses blinks when the pupil becomes undetectable. However, this approach ignores the temporal dynamics of the opening and closing of the eyelids. One existing solution is `mne.preprocessing.eyetracking.interpolate_blinks`, which interpolates data in a fixed-duration buffer zone before and after a blink. The algorithm is effective but not adaptive enough to account for individual and trial-to-trial differences. Instead, a new method (https://doi.org/10.3758/s13428-017-1008-1) makes use of pupillometry noise and the sharp slopes that precede and follow blinks. The Python code is publicly available at https://osf.io/jyz43/ thanks to @titoghose. The OSF repo has no indicated license, but we hope that's OK @titoghose if we adapt the code for MNE.
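The core idea of extending a missing-data gap through the closing/opening ramps can be sketched as follows. This is a simplified illustration of that intuition, not the published algorithm or the OSF code, and the function name is hypothetical:

```python
import numpy as np

def extend_blink(pupil, onset, offset, smooth=5):
    """Extend a missing-data gap to cover the eyelid closing/opening ramps.

    Simplified sketch: interpolate over NaNs, smooth the trace with a
    moving average, then walk the onset backward while the signal was
    still declining and the offset forward while it is still recovering.
    ``onset``/``offset`` are the first and last missing sample indices.
    """
    x = np.asarray(pupil, dtype=float)
    idx = np.arange(x.size)
    good = ~np.isnan(x)
    # Linearly bridge the missing samples so smoothing is NaN-safe.
    xi = np.interp(idx, idx[good], x[good])
    kernel = np.ones(smooth) / smooth
    xs = np.convolve(xi, kernel, mode="same")
    # Walk backward: the blink started where the decline started.
    while onset > 0 and xs[onset - 1] > xs[onset]:
        onset -= 1
    # Walk forward: the blink ends where the recovery levels off.
    while offset < x.size - 1 and xs[offset + 1] > xs[offset]:
        offset += 1
    return onset, offset
```

Unlike a fixed-duration buffer, the recovered window adapts to how fast the pupil signal actually falls and recovers on each trial.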
@AlexLepauvre and I can make a quick adaptation of the original code and make the algorithm available to everyone using MNE to analyze pupillometry data.
Describe your proposed implementation
A new function (per the discussion above, tentatively `find_blinks`) with the following parameters:

- `raw` is the `Raw` object storing eye-tracking data that has at least one channel named `'pupil_*'`.
- `concat` controls whether to concatenate nearby blinks; `concat_gap_interval` determines the interval between successive missing samples/blinks to concatenate.
- `description` determines the annotation description for the newly detected blinks. If set to `'BAD_blink'`, it will also replace the existing `'BAD_blink'` annotations.

The function will look for all `'pupil_*'` channels and perform the detection on each. The detected blink events will be saved as annotations with `ch_names` set to the channel they were detected from.

Describe possible alternatives
`mne.preprocessing.eyetracking.interpolate_blinks`, but it lacks the flexibility to detect individual blinks.

Additional context

No response