Is your feature request related to a problem? Please describe.
It would be helpful to expose the classifier decision in a variety of ways so that the EEGEdu prediction backend could be used as a controller for other output tasks.
Describe the solution you'd like
Add a simple interface to collect short (1 s?) measurements for user-defined states. The experimenter could add as many states as desired and, for each state, store as many EEG snippets as wanted. Then, at test time, let the user try to invoke one of those states (using a nearest-neighbour classifier) and expose the classifier's output as a MIDI message. This way the experimenter could try to train a brain-wave-based musical instrument, in the spirit of Rebecca Fiebrink's work, or a controller. A rough sketch of the nearest-neighbour part is shown below.
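A minimal sketch of how the snippet store and nearest-neighbour lookup could work, assuming each 1 s snippet has already been reduced to a flat feature vector (e.g. band powers per channel); all names and shapes here are illustrative, not part of the current EEGEdu code:

```typescript
// Labelled training snippet: a user-defined state plus its feature vector.
type LabelledSnippet = { state: string; features: number[] };

const training: LabelledSnippet[] = [];

// Store one 1 s snippet under a user-defined state label.
function addSnippet(state: string, features: number[]): void {
  training.push({ state, features });
}

// Euclidean distance between two equal-length feature vectors.
function euclidean(a: number[], b: number[]): number {
  return Math.sqrt(a.reduce((sum, v, i) => sum + (v - b[i]) ** 2, 0));
}

// Return the state of the closest stored snippet, or null if nothing is stored.
function classify(features: number[]): string | null {
  let best: LabelledSnippet | null = null;
  let bestDist = Infinity;
  for (const snippet of training) {
    const d = euclidean(snippet.features, features);
    if (d < bestDist) {
      bestDist = d;
      best = snippet;
    }
  }
  return best ? best.state : null;
}
```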
Describe alternatives you've considered
Currently there is the ability to predict brain states, but the output is limited to the visual playground. One interface-centric library that would make such MIDI interaction straightforward is MIDI.js.
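For the output side, the browser's standard Web MIDI API could also serve as an alternative to MIDI.js for emitting the classifier decision as a MIDI message. A hedged sketch, where the state-to-note mapping and the choice of output device are made up for illustration:

```typescript
// Illustrative mapping from predicted states to MIDI note numbers.
const stateToNote: Record<string, number> = { relax: 60, focus: 64, blink: 67 };

// Send the predicted state as a note-on/note-off pair to the first MIDI output.
async function sendStateAsMidi(state: string): Promise<void> {
  const access = await navigator.requestMIDIAccess();
  const output = access.outputs.values().next().value;
  if (!output || !(state in stateToNote)) return;
  const note = stateToNote[state];
  output.send([0x90, note, 0x7f]);                          // note on, velocity 127
  setTimeout(() => output.send([0x80, note, 0x00]), 500);    // note off after 500 ms
}
```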