
EEG Transformers

A modified Transformer network that uses the attention mechanism for time series and other numerical data: the model takes an EEG signal as input and produces an image embedding as output. A collaborative 6.100 Electrical Engineering and Computer Science project at the MIT Media Lab.
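The sketch below is a minimal PyTorch illustration of that idea, not the repository's actual code: the usual token-embedding lookup is replaced by a linear projection of the continuous EEG channels, a Transformer encoder applies self-attention over time steps, and the pooled output is projected into an image-embedding space. All names, dimensions, and the pooling choice are illustrative assumptions.

```python
import torch
import torch.nn as nn

class EEGToImageEmbedding(nn.Module):
    """Hypothetical sketch: EEG signal in, image embedding out."""
    def __init__(self, n_channels=64, d_model=128, n_heads=8,
                 n_layers=4, image_embed_dim=512):
        super().__init__()
        # Continuous EEG samples are projected linearly instead of being
        # looked up in a vocabulary, as a text Transformer would do.
        self.input_embed = nn.Linear(n_channels, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Map the pooled sequence representation into image-embedding space.
        self.output_embed = nn.Linear(d_model, image_embed_dim)

    def forward(self, eeg):  # eeg: (batch, time_steps, n_channels)
        x = self.input_embed(eeg)    # (batch, time_steps, d_model)
        x = self.encoder(x)          # self-attention over time steps
        x = x.mean(dim=1)            # average-pool across time (one of many choices)
        return self.output_embed(x)  # (batch, image_embed_dim)
```

Positional encoding is omitted here for brevity; a real model for ordered time series would add one before the encoder.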

The original NLP paper and Transformer model

"Attention Is All You Need", Google's original publication - https://arxiv.org/pdf/1706.03762.pdf

The Annotated Transformer by Harvard NLP - http://nlp.seas.harvard.edu/2018/04/03/attention.html
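For reference, the core operation defined in that paper is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V. A minimal self-contained PyTorch sketch:

```python
import math
import torch

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (batch, seq_len, d_k); shapes are illustrative assumptions.
    d_k = Q.size(-1)
    scores = Q @ K.transpose(-2, -1) / math.sqrt(d_k)  # query-key similarity
    weights = torch.softmax(scores, dim=-1)            # attention distribution
    return weights @ V                                 # weighted sum of values
```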

Prerequisites

Partner

Yingqi Ding (@dyq0811) - co-author

Mentors

Neo Mohsenvand (@NeoVand) - idea and guidance

Mehul Smriti Raje (@mraje16) - EEG preprocessing

To learn about our project and see its performance

final_report.pdf - the complete presentation of the project

To understand the code

code_explanation.pdf - explains all the functions piece by piece

To train the model

EEG_train.ipynb - a training and prediction example for the EEG (Electroencephalogram) dataset

LDS_train.ipynb - a training and prediction example for the GLDS (Gaussian linear dynamical systems) dataset
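For orientation before opening the notebooks, a single training step might look like the following hypothetical sketch, which reuses the EEGToImageEmbedding class sketched above; the batch shapes and the MSE objective (regressing EEG onto target image embeddings) are assumptions, not the notebooks' actual setup.

```python
import torch

model = EEGToImageEmbedding()  # the illustrative model sketched above
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = torch.nn.MSELoss()

# Dummy data standing in for a real preprocessed EEG batch.
eeg_batch = torch.randn(32, 250, 64)      # (batch, time_steps, channels)
target_embeddings = torch.randn(32, 512)  # target image embeddings

optimizer.zero_grad()
pred = model(eeg_batch)                   # (batch, image_embed_dim)
loss = loss_fn(pred, target_embeddings)   # regression onto image embeddings
loss.backward()
optimizer.step()
```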
