The main goal of this project was to determine whether a left-hand orthosis could be controlled using motor imagery decoded from the right motor cortex.
This code was written for the BR41N.IO Toronto 2020 hackathon.
We used the EEG Motor Movement/Imagery Dataset to train our classifier. We used only task 2 and kept only the T0 (rest) and T1 (left-fist imagery) annotations.
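For reference, here is a minimal sketch of that loading step using MNE; MNE itself, the subject number, and the 7-30 Hz filter band are illustrative choices here, not requirements of this repo. In the dataset's numbering, task 2 corresponds to runs 4, 8 and 12.

```python
# Sketch: load task 2 runs of the EEG Motor Movement/Imagery Dataset with MNE
# and keep only the T0 (rest) and T1 (left-fist imagery) annotations.
import mne
from mne.datasets import eegbci

raw_fnames = eegbci.load_data(1, [4, 8, 12])   # subject 1; task 2 = runs 4, 8, 12
raw = mne.concatenate_raws([mne.io.read_raw_edf(f, preload=True) for f in raw_fnames])
eegbci.standardize(raw)                        # standardize channel names
raw.filter(7.0, 30.0)                          # illustrative mu/beta band-pass

# Map only the annotations we keep; T2 is dropped.
events, _ = mne.events_from_annotations(raw, event_id=dict(T0=1, T1=2))
epochs = mne.Epochs(raw, events, event_id=dict(rest=1, left_fist=2),
                    tmin=0.0, tmax=4.0, baseline=None, preload=True)
```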
The exploration.ipynb notebook performs the preprocessing and feature extraction and trains the classification models, which we then saved using the joblib library.
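Continuing the sketch above, the train-and-save step might look like this; the CSP + LDA pipeline is an assumed stand-in for the features and models actually built in exploration.ipynb, and the model filename is hypothetical.

```python
# Sketch: fit an assumed CSP + LDA pipeline on the epochs and persist it with joblib.
import joblib
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

X = epochs.get_data()       # shape: (n_epochs, n_channels, n_times)
y = epochs.events[:, -1]    # 1 = rest, 2 = left-fist imagery

clf = make_pipeline(CSP(n_components=4), LinearDiscriminantAnalysis())
clf.fit(X, y)
joblib.dump(clf, "model.joblib")  # hypothetical filename, reloaded later by app.py
```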
Then, running the app.py file starts our live prototype. It performs the same preprocessing, feature extraction and classification, and sends each prediction to an Arduino, which opens and closes the orthosis hand accordingly.
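The hand-off to the Arduino could be as simple as the sketch below; the serial port name and the one-byte open/close protocol are assumptions, not the exact protocol used in app.py.

```python
# Sketch: forward a class label to the Arduino over serial with pyserial.
import serial

ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # adjust the port for your machine

def send_prediction(label: int) -> None:
    # Assumed protocol: b"1" closes the orthosis hand, b"0" opens it.
    ser.write(b"1" if label == 2 else b"0")
```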
The OpenBCI GUI was used to stream Cyton data to our app over LSL, received with pyLSL. This playback file was used to stream the data.
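Receiving that stream in Python might look like the following; the stream type "EEG" matches the OpenBCI GUI's default LSL output, but check your networking-widget settings.

```python
# Sketch: pull samples from the OpenBCI GUI's LSL stream with pyLSL.
from pylsl import StreamInlet, resolve_stream

streams = resolve_stream("type", "EEG")  # blocks until the GUI's outlet is found
inlet = StreamInlet(streams[0])

buffer = []
while True:
    sample, timestamp = inlet.pull_sample()  # one multichannel EEG reading
    buffer.append(sample)                    # app.py would preprocess and classify this buffer
```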
The accuracy of our classifier was quite low. To get better control over the hand, the feature-extraction pipeline needs further work.