Neu-Robin1993 / Transformer-for-EEG

A self-attention (Transformer) model modified to take an EEG signal as input, with an image embedding layer as output

EEG Transformers

A modified Transformer network that uses the attention mechanism on time series or other numerical data. A collaborative 6.100 Electrical Engineering and Computer Science project at the MIT Media Lab.
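The core change for numerical data is at the input: instead of a token-embedding lookup, a multichannel signal such as EEG can enter the model through a linear projection to the model dimension, plus a positional encoding so attention layers see temporal order. A minimal NumPy sketch of this idea (all dimensions and weights here are hypothetical stand-ins, not the repo's actual parameters):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: time steps, EEG channels, transformer model dimension.
T, n_channels, d_model = 128, 16, 64
eeg = rng.standard_normal((T, n_channels))              # one EEG segment
W_embed = 0.1 * rng.standard_normal((n_channels, d_model))  # stand-in for a learned weight

# Project each time step from channel space into d_model space:
# this replaces the token-embedding lookup of NLP transformers.
x = eeg @ W_embed                                       # (T, d_model)

# Sinusoidal positional encoding, as in the original Transformer paper.
pos = np.arange(T)[:, None]
div = np.exp(-np.log(10000.0) * np.arange(0, d_model, 2) / d_model)
pe = np.zeros((T, d_model))
pe[:, 0::2] = np.sin(pos * div)
pe[:, 1::2] = np.cos(pos * div)

x = x + pe  # sequence ready for self-attention layers
print(x.shape)  # (128, 64)
```

In the actual model the projection weights are learned jointly with the attention layers; the sketch only shows how numerical channels become a "token" sequence.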

The original NLP paper and Transformer model

Google's original publication, "Attention Is All You Need" - https://arxiv.org/pdf/1706.03762.pdf

The Annotated Transformer by Harvard NLP - http://nlp.seas.harvard.edu/2018/04/03/attention.html
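The attention mechanism defined in that paper is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A self-contained NumPy sketch (shapes chosen for illustration only):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (T_q, T_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the key axis
    return weights @ V, weights

rng = np.random.default_rng(1)
Q = rng.standard_normal((10, 8))   # 10 query positions, d_k = 8
K = rng.standard_normal((10, 8))
V = rng.standard_normal((10, 8))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (10, 8); each attention row sums to 1
```

The √d_k scaling keeps the dot products from growing with dimension and pushing the softmax into saturated regions, as the paper explains.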

Prerequisites

Partner

Yingqi Ding (@dyq0811) - co-author

Mentor

Neo Mohsenvand (@NeoVand) - idea and guidance

Mehul Smriti Raje (@mraje16) - EEG preprocessing

To learn about our project and see its performance

final_report.pdf - the complete presentation of the project

To understand the code

code_explanation.pdf - all the functions are explained piece by piece

To train the model

EEG_train.ipynb - a training and prediction example for the EEG (Electroencephalogram) dataset

LDS_train.ipynb - a training and prediction example for the GLDS (Gaussian linear dynamical systems) dataset
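A Gaussian linear dynamical system generates sequences via a latent state z_{t+1} = A z_t + w_t observed through y_t = C z_t + v_t, with Gaussian noise w_t, v_t. The sketch below simulates such a toy dataset; the parameters and the simulator itself are hypothetical illustrations, not the notebook's actual data pipeline:

```python
import numpy as np

def simulate_glds(A, C, T, q=0.1, r=0.1, seed=0):
    """Simulate a Gaussian linear dynamical system.

    Latent dynamics:  z_{t+1} = A z_t + w_t,  w_t ~ N(0, q^2 I)
    Observations:     y_t     = C z_t + v_t,  v_t ~ N(0, r^2 I)
    """
    rng = np.random.default_rng(seed)
    d_latent, d_obs = A.shape[0], C.shape[0]
    z = np.zeros(d_latent)
    ys = []
    for _ in range(T):
        z = A @ z + q * rng.standard_normal(d_latent)
        ys.append(C @ z + r * rng.standard_normal(d_obs))
    return np.array(ys)  # (T, d_obs) numerical sequence

# Example: a slowly rotating, slightly damped 2-D latent state
# observed through a random 4-dimensional projection.
theta = 0.1
A = 0.99 * np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
C = np.random.default_rng(2).standard_normal((4, 2))
Y = simulate_glds(A, C, T=200)
print(Y.shape)  # (200, 4)
```

Sequences like `Y` are the kind of generic numerical time series the modified Transformer is meant to model.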


Languages

Python: 55.2%
Jupyter Notebook: 44.8%