unupervised-bci

Unsupervised Event-Related Potential Brain-Computer Interfaces

This repository contains code implementing an unsupervised decoder for Event-Related Potential (ERP) based Brain-Computer Interfaces (BCIs). The following methods are included:

  1. Unsupervised EM [1]. A video of our setup used in the demonstration at NIPS 2012 [2] is available here. It shows a randomly initialised decoder that learns to interpret the user's brain signals without supervision.
  2. Learning from Label Proportions [4] based decoding [5]. A basic notebook doing this in a batch setting is provided (experiment_llp_basic.ipynb). It still has to be integrated into the main framework to simulate an online experiment. A sketch of the underlying mean-reconstruction idea is given after this list.
  3. A supervised baseline using shrinkage LDA.
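
The core idea behind the LLP decoder in [4, 5] is that the target and non-target ERP class means can be recovered without labels when stimuli are presented in groups whose target proportions differ by design. Below is a minimal sketch of that reconstruction step, not the repository's API: the array shapes, proportion values and function name are illustrative assumptions.

```python
import numpy as np

def reconstruct_class_means(mean_g1, mean_g2, p1, p2):
    """Recover target / non-target ERP means from two stimulus groups whose
    target proportions p1 and p2 are known by design (requires p1 != p2).

    Each group average is a known mixture of the two class means:
        mean_g1 = p1 * mu_target + (1 - p1) * mu_nontarget
        mean_g2 = p2 * mu_target + (1 - p2) * mu_nontarget
    so both class means follow from solving a 2x2 linear system per feature.
    """
    A = np.array([[p1, 1.0 - p1],
                  [p2, 1.0 - p2]])
    solution = np.linalg.solve(A, np.vstack([mean_g1, mean_g2]))
    return solution[0], solution[1]  # mu_target, mu_nontarget

if __name__ == "__main__":
    rng = np.random.RandomState(0)
    n_feat = 31 * 6                     # e.g. channels x time samples, flattened
    mu_t, mu_nt = rng.randn(n_feat), rng.randn(n_feat)  # hypothetical class means
    p1, p2 = 0.3, 0.1                   # illustrative target proportions per group

    # Simulated (noisy) averages over many unlabeled epochs from each group.
    g1 = p1 * mu_t + (1.0 - p1) * mu_nt + 0.01 * rng.randn(n_feat)
    g2 = p2 * mu_t + (1.0 - p2) * mu_nt + 0.01 * rng.randn(n_feat)

    est_t, est_nt = reconstruct_class_means(g1, g2, p1, p2)
    print("max abs reconstruction error: %.3f" % np.max(np.abs(est_t - mu_t)))
```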

Usage

  1. Download and preprocess the data by running setup.sh
  2. Run one of the included IPython notebooks

Experiments

experiment_amuse_batch.py

This experiment loads the online data from a single subject. It gives the unsupervised classifier access to all data (without labels) and performs several update iterations; in each iteration, the selection accuracy and single-trial accuracy are printed. The result is also compared to a supervised LDA classifier with analytic regularisation. The EM classifier does not always converge to a good solution, so a restart might be required; tricks to address this issue are discussed in [1].
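
To illustrate the kind of loop this experiment runs, the sketch below alternates between inferring the attended stimulus of each trial under the current decoder (E-step) and refitting the decoder from the resulting soft labels (M-step), with random restarts selected by an unsupervised confidence score. It is a simplified stand-in for the model in [1], not the repository's implementation; the toy data layout, hyperparameters and helper name are assumptions.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.RandomState(1)

# Toy unlabeled ERP data: n_trials trials, n_stim stimuli per trial, one target each.
n_trials, n_stim, n_feat = 100, 6, 20
w_true = rng.randn(n_feat)
targets = rng.randint(n_stim, size=n_trials)             # attended stimulus per trial
labels = -np.ones((n_trials, n_stim))
labels[np.arange(n_trials), targets] = 1.0                # +1 target, -1 non-target
X = labels[:, :, None] * w_true + 1.5 * rng.randn(n_trials, n_stim, n_feat)

def em_decoder(X, n_iter=20, ridge=1.0, seed=0):
    """Simplified EM-style loop: the E-step infers which stimulus of each trial
    is the attended target under the current linear decoder, the M-step refits
    the decoder by ridge regression against the expected +/-1 labels."""
    n_trials, n_stim, n_feat = X.shape
    w = np.random.RandomState(seed).randn(n_feat)          # random initialisation
    Xf = X.reshape(-1, n_feat)
    for _ in range(n_iter):
        scores = X.dot(w)                                   # (n_trials, n_stim)
        scores -= scores.max(axis=1, keepdims=True)         # numerical stability
        post = np.exp(scores)
        post /= post.sum(axis=1, keepdims=True)             # P(stimulus is the target)
        y_exp = (2.0 * post - 1.0).reshape(-1)              # expected labels in [-1, 1]
        w = np.linalg.solve(Xf.T.dot(Xf) + ridge * np.eye(n_feat), Xf.T.dot(y_exp))
    confidence = post.max(axis=1).mean()                    # proxy for solution quality
    return w, post, confidence

# Random restarts: keep the run with the most confident posteriors, a simple
# unsupervised stand-in for the restart tricks discussed in [1].
w_em, post, _ = max((em_decoder(X, seed=s) for s in range(5)), key=lambda r: r[2])
print("EM selection accuracy: %.2f" % np.mean(post.argmax(axis=1) == targets))

# Supervised baseline with analytic shrinkage regularisation (this one uses the labels).
lda = LinearDiscriminantAnalysis(solver="eigen", shrinkage="auto")
lda.fit(X.reshape(-1, n_feat), labels.reshape(-1))
lda_scores = lda.decision_function(X.reshape(-1, n_feat)).reshape(n_trials, n_stim)
print("LDA selection accuracy: %.2f" % np.mean(lda_scores.argmax(axis=1) == targets))
```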

Datasets

The repository contains code to download (and if needed pre-process) the following datasets.

  • AMUSE dataset: an auditory six-class ERP-based BCI recorded by Schreuder et al. [3]. This dataset can only be used with an EM decoder.
  • (work in progress) Learning from Label Proportions BCI: the data from our LLP-BCI experiments described by Hübner et al. [5]. It can be used with an LLP decoder and an EM decoder. Please cite the respective papers when these datasets are used.

Code was tested using:

  • python 2.7.12
  • sklearn 0.18.1
  • numpy 1.12.1
  • scipy 0.19.0

Acknowledgement

Most of the code was written while I was supported by the EU through a Marie Curie Fellowship (grant 657679).

References

  1. Kindermans et al. A Bayesian model for exploiting application constraints to enable unsupervised training of a P300-based BCI, (2012) PLoS One
  2. Kindermans et al. A P300 BCI for the masses: prior information enables instant unsupervised spelling, (2012) Neural Information Processing Systems (NIPS)
  3. Schreuder et al. Listen, you are writing! Speeding up online spelling with a dynamic auditory BCI (2011) Frontiers in Neuroscience
  4. Quadrianto et al. Estimating Labels from Label Proportions, (2009) JMLR
  5. Hübner et al. Learning from Label Proportions in Brain-Computer Interfaces: Online Unsupervised Learning with Guarantees, (2017) PLoS One
