Welcome to the official repository for our paper "A Progressive Multi-Domain Adaptation Network with Reinforced Self-Constructed Graphs for Cross-Subject EEG-Based Emotion and Consciousness Recognition", published in IEEE Transactions on Neural Systems and Rehabilitation Engineering (TNSRE), 2025!
This repository contains the complete implementation of our framework for cross-subject EEG-based emotion and consciousness recognition.
Our project introduces a Progressive Multi-Domain Adaptation Network that leverages reinforced self-constructed graphs to address domain shift and subject variability in EEG data. This approach improves the robustness and accuracy of emotion and consciousness recognition across subjects, paving the way for advanced neural systems applications.
To get started, ensure you have the following:
- Python 3.8 or higher
- Required Python packages:
  - NumPy
  - PyTorch
  - SciPy
  - Scikit-learn
  - Pandas
  - ...
- Datasets: SEED and SEED-IV
- A computing environment (GPU recommended for faster training)
Clone this repository to your local machine:

```bash
git clone https://github.com/your-username/your-repo-name.git
cd your-repo-name
```

Install the required dependencies:

```bash
pip install -r requirements.txt
```
This project uses the SEED and SEED-IV datasets for EEG-based emotion recognition. Follow these steps to prepare the data:
- Download the SEED and/or SEED-IV datasets from their official sources.
- Place the extracted dataset files (DE features) in the appropriate directory.
- Ensure the dataset files are correctly formatted and accessible for preprocessing.
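SEED's precomputed DE features ship as MATLAB `.mat` files, so a quick way to confirm they are readable is to round-trip one with SciPy. The sketch below first writes a small synthetic file to stand in for a real one; the key name `de_LDS1` and the (channels × windows × bands) shape are illustrative assumptions about the format, not a specification of it:

```python
import numpy as np
from scipy.io import loadmat, savemat

# Synthetic stand-in for a SEED DE-feature file: 62 channels,
# 10 time windows, 5 frequency bands (illustrative dimensions).
rng = np.random.default_rng(0)
savemat("demo_de_features.mat", {"de_LDS1": rng.standard_normal((62, 10, 5))})

# Load it back and flatten each window into one feature vector,
# the usual shape for a domain-adaptation pipeline.
data = loadmat("demo_de_features.mat")
de = data["de_LDS1"]                                        # (62, 10, 5)
features = de.transpose(1, 0, 2).reshape(de.shape[1], -1)   # (10, 310)
print(features.shape)  # → (10, 310)
```

Real SEED files contain one array per trial, so a full loader would iterate over the trial keys rather than read a single fixed name.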
Follow these steps to run the experiments and reproduce our results:

- Create a Result Directory. In the project root directory, create an empty folder named `result` to store outputs:

  ```bash
  mkdir result
  ```

- Preprocess the Data. Run the preprocessing script to partition the dataset into source, target, and mixed domains:

  ```bash
  python datapipe.py
  ```

  Note: update the dataset paths in `datapipe.py` to match your SEED/SEED-IV data locations.
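The source/target partition that a cross-subject pipeline performs can be sketched as a leave-one-subject-out split. The subject count, array shapes, and the `split_subjects` helper below are illustrative assumptions for demonstration, not the repository's actual `datapipe.py` code:

```python
import numpy as np

# Toy data: 15 subjects, 20 samples each, 310-dim DE feature vectors,
# 3 emotion classes (all values illustrative).
n_subjects, n_samples, n_features = 15, 20, 310
rng = np.random.default_rng(0)
X = rng.standard_normal((n_subjects, n_samples, n_features))
y = rng.integers(0, 3, size=(n_subjects, n_samples))

def split_subjects(X, y, target_subject):
    """Hold one subject out as the target domain; pool the rest as source."""
    source_ids = [s for s in range(len(X)) if s != target_subject]
    X_src = X[source_ids].reshape(-1, X.shape[-1])
    y_src = y[source_ids].reshape(-1)
    return X_src, y_src, X[target_subject], y[target_subject]

X_src, y_src, X_tgt, y_tgt = split_subjects(X, y, target_subject=0)
print(X_src.shape, X_tgt.shape)  # → (280, 310) (20, 310)
```

In a full cross-subject evaluation, each subject takes a turn as the target domain and the results are averaged over all folds.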
- Run the Main Script. Execute the main script to train and evaluate the model:

  ```bash
  python main.py
  ```

  The script will handle model training and evaluation, and save results to the `result` directory.

- Training logs and model checkpoints will be saved in the `./result` directory.
- Evaluation metrics (e.g., accuracy, F1-score) for emotion and consciousness recognition will be logged and saved as summary files.
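The summary metrics can be recomputed from saved predictions with scikit-learn. The labels below are toy values for illustration, not results from the paper:

```python
from sklearn.metrics import accuracy_score, f1_score

# Toy ground-truth and predicted labels for a 3-class problem.
y_true = [0, 1, 2, 2, 1, 0, 1, 2]
y_pred = [0, 1, 2, 1, 1, 0, 0, 2]

acc = accuracy_score(y_true, y_pred)
# average="macro" weights the classes equally, which is the usual
# choice when class frequencies differ across subjects.
macro_f1 = f1_score(y_true, y_pred, average="macro")
print(f"accuracy={acc:.3f}, macro-F1={macro_f1:.3f}")
```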
If you find our work inspiring or use this code, please cite our paper:
```bibtex
@ARTICLE{11142795,
  author={Chen, Rongtao and Xie, Chuwen and Zhang, Jiahui and You, Qi and Pan, Jiahui},
  journal={IEEE Transactions on Neural Systems and Rehabilitation Engineering},
  title={A Progressive Multi-Domain Adaptation Network With Reinforced Self-Constructed Graphs for Cross-Subject EEG-Based Emotion and Consciousness Recognition},
  year={2025},
  volume={33},
  pages={3498-3510},
  keywords={Brain modeling;Emotion recognition;Electroencephalography;Accuracy;Feature extraction;Adaptation models;Probability distribution;Noise measurement;Emotional responses;Computational modeling;Electroencephalogram (EEG);emotion recognition;consciousness recognition;domain adaptation;reinforcement learning},
  doi={10.1109/TNSRE.2025.3603190}
}
```

For questions, feedback, or issues, please:
- Reach out to chenrongtao@m.scnu.edu.cn.

Thank you for exploring our work! We hope this repository sparks innovation and advances your research journey!