This repository contains the data and classification algorithms used and developed for the article Hooge, I.T.C., Niehorster, D.C., Nyström, M., Andersson, R. & Hessels, R.S. (2022). Fixation classification: how to merge and select fixation candidates. Behavior Research Methods. https://doi.org/10.3758/s13428-021-01723-1. For each of the algorithms in the classification algorithms directory, a file showing example use is included in the same directory. To test an algorithm, run the file whose name starts with run.
When using the classification algorithms or eye-tracking data in this repository in your work, please cite the Hooge et al. (2022) paper.
Each of the algorithms has been modified by us. Therefore, if you use one of the algorithms, please cite both the original paper describing the algorithm and the Hooge et al. (2022) paper for which the modified versions were developed. Specifically, for each of the algorithms, the following references apply (full references are listed below):
| Code directory | Citation |
|---|---|
| CDT | Veneri et al. (2011); Hooge et al. (2022) |
| HC2013 | Hooge & Camps (2013); Hooge et al. (2022) |
| I2MC | Hessels et al. (2016); Hooge et al. (2022) |
| I2MW | Hooge et al. (2022) |
| KF | Komogortsev et al. (2010); Hooge et al. (2022) |
| MST | Komogortsev et al. (2010); Hooge et al. (2022) |
| NH2010 | Nyström & Holmqvist (2010); Hooge et al. (2022) |
For more information or questions, e-mail: i.hooge@uu.nl / dcnieho@gmail.com. The latest version of this repository is available from https://github.com/dcnieho/HoogeetalSelectionRules
The classification algorithms in this repository are licensed under the Creative Commons Attribution 4.0 (CC BY 4.0) license. The eye-tracking data are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0) license.
Tested with MATLAB R2016b and R2021a.
N.B.: complete details of the changes made are available on GitHub.
- Initial release.
Hooge, I.T.C., Niehorster, D.C., Nyström, M., Andersson, R. & Hessels, R.S. (2022). Fixation classification: how to merge and select fixation candidates. Behavior Research Methods. https://doi.org/10.3758/s13428-021-01723-1
Hessels, R. S., Niehorster, D. C., Kemner, C., & Hooge, I. T. C. (2016). Noise-robust fixation detection in eye movement data: Identification by two-means clustering (I2MC). Behavior Research Methods, 1-22. https://doi.org/10.3758/s13428-016-0822-1
Hooge, I. T. C., & Camps, G. (2013). Scan path entropy and arrow plots: Capturing scanning behavior of multiple observers. Frontiers in Psychology, 4, 996. https://doi.org/10.3389/fpsyg.2013.00996
Komogortsev, O. V., Gobert, D. V., Jayarathna, S., Koh, D. H., & Gowda, S. M. (2010). Standardization of automated analyses of oculomotor fixation and saccadic behaviors. IEEE Transactions on Biomedical Engineering, 57(11), 2635-2645.
Nyström, M., & Holmqvist, K. (2010). An adaptive algorithm for fixation, saccade, and glissade detection in eyetracking data. Behavior Research Methods, 42(1), 188-204.
Veneri, G., Piu, P., Rosini, F., Federighi, P., Federico, A., & Rufa, A. (2011). Automatic eye fixations identification based on analysis of variance and covariance. Pattern Recognition Letters, 32, 1588-1593. https://doi.org/10.1016/j.patrec.2011.06.012
In the following, "data" refers to both the eye-tracking data and source code included in this repository. By downloading this data set, you expressly agree to the following conditions of release and acknowledge the following disclaimers issued by the authors:
Code is available by permission of the authors. If you use the code in publications or in the preparation of publications, either digital or hardcopy, you must cite it as follows: Hooge, I.T.C., Niehorster, D.C., Nyström, M., Andersson, R. & Hessels, R.S. (2022). Fixation classification: how to merge and select fixation candidates. Behavior Research Methods. https://doi.org/10.3758/s13428-021-01723-1. Furthermore, you must also cite the original paper in which the code was presented, if applicable. Refer to the table above for which references apply to particular code.
The authors shall not be held liable for any improper or incorrect use or application of the data provided, and assume no responsibility for the use or application of the data or interpretations based on the data, or information derived from interpretation of the data. In no event shall the authors be liable for any direct, indirect or incidental damage, injury, loss, harm, illness or other damage or injury arising from the release, use or application of these data. This disclaimer of liability applies to any direct, indirect, incidental, exemplary, special or consequential damages or injury, even if advised of the possibility of such damage or injury, including but not limited to those caused by any failure of performance, error, omission, defect, delay in operation or transmission, computer virus, alteration, use, application, analysis or interpretation of data.
No warranty, expressed or implied, is made regarding the accuracy, adequacy, completeness, reliability or usefulness of any data provided. These data are provided "as is." All warranties of any kind, expressed or implied, including but not limited to fitness for a particular use, freedom from computer viruses, the quality, accuracy or completeness of data or information, and that the use of such data or information will not infringe any patent, intellectual property or proprietary rights of any party, are disclaimed. The user expressly acknowledges that the data may contain some nonconformities, omissions, defects, or errors. The authors do not warrant that the data will meet the user's needs or expectations, or that all nonconformities, omissions, defects, or errors can or will be corrected. The authors are not inviting reliance on these data, and the user should always verify actual data.