Affect detection

Description

This repository contains the scripts used in the research paper cited below, which explores how to infer VR users' affective states from their EEG activity in real time. The study used supervised learning, conducted the experiment in Virtual Reality, and compared two feature selection methods. Feel free to use the code in academic or commercial projects, and please consider citing the paper:

Pinilla, A., Voigt-Antons, J. N., Garcia, J., Raffe, W., & Möller, S. (2023). Real-time affect detection in virtual reality: a technique based on a three-dimensional model of affect and EEG signals. Frontiers in Virtual Reality, 3, 964754. https://doi.org/10.3389/frvir.2022.964754

@article{pinilla_real-time_2023,
	title = {Real-time affect detection in virtual reality: a technique based on a three-dimensional model of affect and {EEG} signals},
	volume = {3},
	issn = {2673-4192},
	shorttitle = {Real-time affect detection in virtual reality},
	url = {https://www.frontiersin.org/articles/10.3389/frvir.2022.964754/full},
	doi = {10.3389/frvir.2022.964754},
	urldate = {2024-01-29},
	journal = {Frontiers in Virtual Reality},
	author = {Pinilla, Andres and Voigt-Antons, Jan-Niklas and Garcia, Jaime and Raffe, William and Möller, Sebastian},
	month = jan,
	year = {2023},
	pages = {964754},
}

NOTE: The preprocessing pipeline used for this experiment was originally implemented in Matlab, using EEGLAB. A more recent Python implementation is available here.
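
The paper's pipeline is implemented in main.py. Purely as an illustration of the general idea (a supervised classifier trained on EEG features, with two competing feature selection methods evaluated side by side), here is a minimal, hypothetical scikit-learn sketch on synthetic data. The selection methods, classifier, and dimensions are placeholders, not the study's actual choices:

# Hypothetical sketch: comparing two feature selection methods inside a
# supervised-learning pipeline. Synthetic data stands in for real EEG
# features; see main.py for the actual analysis.
import numpy as np
from sklearn.feature_selection import RFE, SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))    # placeholder: 200 epochs x 64 EEG features
y = rng.integers(0, 2, size=200)  # placeholder: binary affect labels

pipelines = {
    "univariate (ANOVA F-test)": Pipeline([
        ("scale", StandardScaler()),
        ("select", SelectKBest(f_classif, k=16)),
        ("clf", SVC(kernel="linear")),
    ]),
    "recursive feature elimination": Pipeline([
        ("scale", StandardScaler()),
        ("select", RFE(SVC(kernel="linear"), n_features_to_select=16)),
        ("clf", SVC(kernel="linear")),
    ]),
}

# Cross-validate each pipeline so the two selection methods are compared
# on equal footing.
for name, pipe in pipelines.items():
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.2f}")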

Instructions

  1. Install R (https://cran.r-project.org/)
  2. Clone this repository:
git clone git@github.com:aepinilla/affect_detection.git
  3. Download the 'data.zip' file from the OSF repository of the study: https://osf.io/7v9kt/
  4. Unzip data.zip and place it at the root of the folder you just cloned.
  5. Using the terminal, go to the root of the 'affect_detection' folder and run the following command (an optional pre-flight check is sketched after this list):
python main.py
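
Before launching the run, you can verify the data layout with a short script like the one below. It is purely illustrative (not part of the repository); it only checks that the unzipped 'data' folder sits where the instructions above place it:

# Illustrative pre-flight check (hypothetical, not part of the repository):
# confirm the unzipped 'data' folder is present before running `python main.py`.
import sys
from pathlib import Path

data_dir = Path.cwd() / "data"  # run this from the root of affect_detection
if not data_dir.is_dir():
    sys.exit("Missing 'data' folder: unzip data.zip at the repository root first.")

n_files = sum(1 for p in data_dir.rglob("*") if p.is_file())
print(f"Found {n_files} files in {data_dir}")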

Preprocessing

The 'data' folder contains data that has already been preprocessed. To replicate the preprocessing steps, follow these instructions:

  1. Install Matlab.
  2. Install EEGLAB following these instructions: https://eeglab.org/tutorials/01_Install/Install.html
  3. Transform XDF files to CSV for faster processing (a generic sketch of this conversion appears after this list):
python xdf_to_csv.py
  4. Open EEGLAB in Matlab and run the preprocessing script located at affect_detection/src/preprocessing.m
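
For reference, the sketch below shows a generic XDF-to-CSV conversion of the kind xdf_to_csv.py performs, assuming the third-party pyxdf and pandas packages. File names and the output layout are placeholders, not a copy of the repository's script:

# Generic XDF-to-CSV conversion sketch using pyxdf and pandas.
# File names are placeholders; the repository's own script is xdf_to_csv.py.
import pandas as pd
import pyxdf

# load_xdf returns the list of recorded streams plus the file header.
streams, header = pyxdf.load_xdf("recording.xdf")  # placeholder file name

for i, stream in enumerate(streams):
    name = stream["info"]["name"][0]
    # One row per sample, with the LSL time stamp as the first column.
    df = pd.DataFrame(stream["time_series"])
    df.insert(0, "time_stamp", stream["time_stamps"])
    df.to_csv(f"{name}_{i}.csv", index=False)
    print(f"Wrote stream '{name}' ({len(df)} samples)")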

Reports

All files generated by main.py will be stored in the 'reports' folder. On a MacBook Pro, the program took several hours to run. If you want to skip this step, download the 'reports.zip' file available in the OSF repository: https://osf.io/7v9kt/
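
If you would rather fetch reports.zip programmatically, one possibility (an assumption, not something this repository provides) is the third-party osfclient package:

# Hypothetical helper: download reports.zip from the study's OSF project
# using the third-party osfclient package (not part of this repository).
from osfclient import OSF

project = OSF().project("7v9kt")
storage = project.storage("osfstorage")

# Scan the project's default storage for the archive and save it locally.
for f in storage.files:
    if f.name == "reports.zip":
        with open("reports.zip", "wb") as fp:
            f.write_to(fp)
        print("Downloaded reports.zip")
        break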
