MHersche/bci-model-superpos


Copyright (c) 2019 ETH Zurich, Michael Hersche

Compressing Subject-specific Brain--Computer Interface Models into One Model by Superposition in Hyperdimensional Space

In this repository, we share the code for compressing subject-specific BCI models.
For details, please refer to the paper below.

If this code proves useful for your research, please cite

Michael Hersche, Philipp Rupp, Luca Benini, Abbas Rahimi, "Compressing Subject-specific Brain--Computer Interface Models into One Model by Superposition in Hyperdimensional Space", in ACM/IEEE Design, Automation, and Test in Europe Conference (DATE), 2020.

Installing Dependencies

You will need a machine with a CUDA-enabled GPU and the NVIDIA SDK installed to compile the CUDA kernels. We used conda as the Python package manager and exported the environment specification to conda-env-bci-superpos.yml. You can recreate our environment by running

conda env create -f conda-env-bci-superpos.yml -n myBCIsupposEnv 

Make sure to activate the environment (conda activate myBCIsupposEnv) before running any code.

Download the BCI competition IV 2a dataset

EEGNet: Download the .mat files of the 4-class MI dataset with 9 subjects (001-2014) from here, unpack them, and put them into the folder dataset/EEGNet.

ShallowConvNet: Download the .gdf files from here by requesting access under "Download of data sets". You will receive an account and can then download the files; put them into the folder dataset/shallowconvnet. The labels need to be downloaded separately, also here, under "True Labels of Competition's Evaluation Sets".
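Before training, it can help to verify that all subject files are in place. Below is a minimal sketch for the EEGNet case, assuming the .mat files follow the usual A0xT/A0xE naming of the BCI Competition IV 2a release (the actual filenames in your download may differ):

```python
from pathlib import Path

def expected_mat_files(root="dataset/EEGNet"):
    """List the .mat paths for all 9 subjects: training (T) and evaluation (E) sessions."""
    return [Path(root) / f"A0{s}{ses}.mat"
            for s in range(1, 10) for ses in ("T", "E")]

# Report any files that are not yet in place.
missing = [p for p in expected_mat_files() if not p.exists()]
if missing:
    print("Missing files:", *missing, sep="\n  ")
```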

Step-by-step Guide

There are two networks for testing the compression -- EEGNet and Shallow ConvNet. You can test them by running main.py in either code/EEGnet/ or code/ShallowConvNet/. The original and compressed models are stored in the corresponding models/ folder. Accuracy results are available in results/.
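The per-network layout described above can be summarized as follows (a sketch; the paths are taken from this README and not otherwise verified against the repository):

```python
from pathlib import Path

# Folder layout per network, as described in the step-by-step guide.
NETWORKS = {
    "EEGNet": Path("code/EEGnet"),
    "ShallowConvNet": Path("code/ShallowConvNet"),
}

def run_paths(network):
    """Return the entry script and output folders for one network."""
    base = NETWORKS[network]
    return {
        "script": base / "main.py",   # run this script
        "models": base / "models",    # original and compressed models
        "results": base / "results",  # accuracy results
    }
```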

License and Attribution

Please refer to the LICENSE file for the licensing of our code.
