Music imagery information retrieval (MIIR) systems may one day be able to recognize a song just as we think of it. As a step towards such technology, we are presenting a public domain dataset of electroencephalography (EEG) recordings taken during music perception and imagination. We acquired this data during an ongoing study that has so far comprised 10 subjects listening to and imagining 12 short music fragments - each 7s-16s long - taken from well-known pieces. These stimuli were selected from different genres and systematically span several musical dimensions such as meter, tempo and the presence of lyrics, so that various retrieval and classification scenarios can be addressed. The dataset is primarily aimed at enabling music information retrieval researchers interested in these new MIIR challenges to easily test and adapt their existing approaches for music analysis - such as fingerprinting, beat tracking or tempo estimation - on this new kind of data. We also hope that the OpenMIIR dataset will facilitate a stronger interdisciplinary collaboration between music information retrieval researchers and neuroscientists.
Obtaining the Raw EEG Data
This git repository does not contain the raw EEG data, which is around 700 MB per subject, adding up to several gigabytes in total. There are currently two ways to obtain the data:
- download via HTTP from one of the sites listed below
- download via BitTorrent, tracked by Academic Torrents
We strongly encourage the BitTorrent approach, as it allows for distributed sharing.
In order to enable everybody to work with this data, we decided to share it in a format that does not require any commercial software for loading and processing. Specifically, the raw EEG is saved in the FIF file format used by MNE and MNE-Python.
This data format can, for instance, also be easily converted into the MAT format used by Matlab, which allows importing into EEGLAB. A description of how to do this can be found in the wiki.
A first presentation about this dataset was given at NEMISIG 2015 and can be downloaded here. The repository wiki also lists labs using this dataset as well as related publications. Please contact us if you would like to be added.
You are openly invited to contribute to this dataset. There are several ways to do so:
- add more subjects by running the experiment yourself
- host an HTTP download mirror or seed the dataset torrent to provide download bandwidth
- run your own experiments on the dataset and share your results
If you want to contribute, please contact us.
License and Citations
OpenMIIR is released under the Open Data Commons Public Domain Dedication and Licence (PDDL), which means that you can freely use it without any restrictions.
If you use the OpenMIIR dataset in published research work, we would appreciate it if you cited this article: Sebastian Stober, Avital Sternin, Adrian M. Owen and Jessica A. Grahn: "Towards Music Imagery Information Retrieval: Introducing the OpenMIIR Dataset of EEG Recordings from Music Perception and Imagination." In: Proceedings of the 16th International Society for Music Information Retrieval Conference (ISMIR'15), pages 763-769, 2015.
This dataset is a result of ongoing joint work between the Owen Lab and the Music and Neuroscience Lab at the Brain and Mind Institute of the University of Western Ontario. It has been supported by a fellowship within the Postdoc-Program of the German Academic Exchange Service (DAAD), the Canada Excellence Research Chairs (CERC) Program, a Natural Sciences and Engineering Research Council (NSERC) Discovery Grant and an Ontario Early Researcher Award.
Sebastian Stober <sstober AT uni-potsdam DOT de>
Machine Learning in Cognitive Science Lab
Research Focus Cognitive Sciences
University of Potsdam, Germany