
Hand Orthosis Controlled by Motor Imagery

The main goal of this project was to determine whether we could control a left-hand orthosis using motor imagery decoded from the right motor cortex.

This code was written for the BR41N.IO Toronto 2020 hackathon.

Training dataset

We used the EEG Motor Movement/Imagery Dataset to train our classifier. For this, we used only task 2 and kept only the T0 and T1 annotations.
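
As a minimal loading sketch with MNE, assuming task 2 corresponds to runs 4, 8, and 12 of the PhysioNet dataset (with T0 marking rest and T1 marking left-fist imagery; the subject number and epoch window here are illustrative, not necessarily what we used):

```python
import mne
from mne.datasets import eegbci
from mne.io import concatenate_raws, read_raw_edf

# Task 2 ("imagine opening and closing left or right fist")
# maps to runs 4, 8, and 12 for each subject.
raw_fnames = eegbci.load_data(1, [4, 8, 12])  # subject 1
raw = concatenate_raws([read_raw_edf(f, preload=True) for f in raw_fnames])
eegbci.standardize(raw)  # normalize channel names

# Keep only the T0 (rest) and T1 (left fist) annotations.
events, _ = mne.events_from_annotations(raw, event_id=dict(T0=1, T1=2))
epochs = mne.Epochs(raw, events, event_id=dict(rest=1, left=2),
                    tmin=0.0, tmax=4.0, baseline=None, preload=True)
```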

How it works

The exploration.ipynb notebook was used to perform preprocessing and feature extraction and to create the classification models. We then saved our models using the joblib library.
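
The README does not spell out the exact pipeline, but a common approach for this kind of task, shown here as a sketch continuing from the loading example above, is to band-pass filter in the mu/beta band, extract CSP features, and train an LDA classifier before dumping it with joblib (the model filename is hypothetical):

```python
import joblib
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

epochs.filter(7.0, 30.0)        # mu/beta band, where motor imagery shows up
X = epochs.get_data()           # (n_epochs, n_channels, n_times)
y = epochs.events[:, -1]        # 1 = rest, 2 = left-fist imagery

clf = Pipeline([
    ("csp", CSP(n_components=4, log=True)),  # spatial filters -> log-variance features
    ("lda", LinearDiscriminantAnalysis()),
])
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

clf.fit(X, y)
joblib.dump(clf, "model.joblib")  # hypothetical filename
```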

Then, using the app.py file, you can run our live prototype. It performs preprocessing, feature extraction, and classification, and sends the prediction to an Arduino. The Arduino then opens and closes the orthosis hand according to the prediction.
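
As an illustration only, the core of such a live loop could look like the sketch below; the serial port, byte protocol, and window length are assumptions, not necessarily what app.py uses:

```python
import joblib
import numpy as np
import serial
from pylsl import StreamInlet, resolve_stream

clf = joblib.load("model.joblib")               # model saved from the notebook
inlet = StreamInlet(resolve_stream("type", "EEG")[0])
arduino = serial.Serial("/dev/ttyACM0", 9600)   # hypothetical port

fs, win_s = 160, 4.0                            # sampling rate and window length
buffer = []
while True:
    sample, _ = inlet.pull_sample()
    buffer.append(sample)
    if len(buffer) >= int(fs * win_s):
        X = np.array(buffer).T[np.newaxis]      # (1, n_channels, n_times)
        pred = clf.predict(X)[0]
        arduino.write(b"1" if pred == 2 else b"0")  # close on imagery, open on rest
        buffer = []
```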

Test

We then used the OpenBCI GUI to stream Cyton data to our app over pyLSL. This playback file was used as the data source.
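
To confirm that the GUI's networking widget is actually broadcasting before launching app.py, you can list the visible LSL streams (a quick sketch):

```python
from pylsl import resolve_streams

# Print every LSL stream currently visible on the network.
for info in resolve_streams(wait_time=2.0):
    print(info.name(), info.type(), info.channel_count(), info.nominal_srate())
```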

Going further

The accuracy of our classification was fairly low. To get better control over the hand, we would need to work further on our feature extraction pipeline.
