
Annotating ICA components visually


Summary

When running Independent Component Analysis (ICA), one often faces the challenge of annotating which ICA components to exclude (because they are deemed part of a heart beat, eye blink, muscle artifact, line noise, or another non-brain source) versus which components to keep (because they are deemed brain signal). This annotation process is highly subjective because it depends on the individual annotator. mne-icalabel attempts to make the process more systematic by relying on a machine learning model, but the gold standard is still human annotation.
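
As a rough illustration of how the automated and manual steps fit together, here is a minimal sketch using MNE-Python and mne-icalabel. The file path, component count, and label-filtering rule are placeholders, not a recommended pipeline; consult the mne-icalabel documentation for the supported workflow.

```python
# Sketch: fit ICA, ask mne-icalabel for suggested labels, then let the
# human annotator make the final exclusion decision.
import mne
from mne.preprocessing import ICA
from mne_icalabel import label_components

raw = mne.io.read_raw_fif("sample_raw.fif", preload=True)  # placeholder path
raw.filter(1.0, 100.0)             # ICLabel expects 1-100 Hz band-passed data
raw.set_eeg_reference("average")   # and an average EEG reference

ica = ICA(n_components=15, method="infomax",
          fit_params=dict(extended=True), random_state=42)
ica.fit(raw)

# Automatic suggestions from the ICLabel model
result = label_components(raw, ica, method="iclabel")
print(result["labels"])  # e.g. ['brain', 'eye blink', 'muscle artifact', ...]

# The annotator still has the final say: after visual inspection, exclude
# every component not judged to be brain signal (or "other").
ica.exclude = [idx for idx, label in enumerate(result["labels"])
               if label not in ("brain", "other")]
raw_clean = ica.apply(raw.copy())
```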

How to annotate?

Here is an excellent tutorial from UCSD, which walks through how to annotate different ICA components based on visual features of the IC signal:

https://labeling.ucsd.edu/tutorial/labels

Here is a web app, also from UCSD, where one can practice labeling real ICA components:

https://labeling.ucsd.edu/labelfeedback
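
To inspect the same kinds of visual features in MNE-Python before deciding which components to exclude, a minimal sketch might look like the following. It assumes a `raw` recording and a fitted `ica` as in the snippet above; the component indices are placeholders.

```python
# Visual inspection of ICA components prior to manual annotation.
ica.plot_components()                    # scalp topographies of all ICs
ica.plot_sources(raw)                    # IC time courses for scrolling through
ica.plot_properties(raw, picks=[0, 1])   # detailed view of selected ICs

# After inspecting the plots, record the decision on the ICA object itself:
ica.exclude = [0]                  # e.g. IC 0 judged to be an eye blink
raw_clean = ica.apply(raw.copy())  # remove the excluded components
```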

