
A novel dual-stream time-frequency contrastive pretext tasks framework for sleep stage classification

This repository contains the implementation of the paper "A novel dual-stream time-frequency contrastive pretext tasks framework for sleep stage classification". The project introduces a dual-stream pretext task architecture that operates in both the time and frequency domains. In particular, we examine the incorporation of the novel Frequency Similarity (FS) pretext task into two existing pretext tasks, Relative Positioning (RP) and Temporal Shuffling (TS). The original paper can be found at: https://arxiv.org/pdf/2312.09623.pdf.
The pre-training architecture used in the paper is shown below.

The dual-stream time-frequency framework with RP/TS and FS for the downstream task is shown below.

The framework is evaluated on the downstream task of sleep staging. For both RP and TS, adding the proposed FS pretext task led to a significant improvement in downstream accuracy.
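To illustrate the dual-stream idea, the sketch below concatenates a time-domain embedding (from RP or TS) with a frequency-domain embedding (from FS) and trains a simple downstream classifier on the combined features. The array shapes, the concatenation step, and the linear classifier are illustrative assumptions; the actual combination used in the paper is implemented in combining_emb.py.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical embeddings for n EEG windows: one stream from the time-domain
# pretext task (RP or TS) and one from the frequency-domain task (FS).
n_windows, d_time, d_freq = 1000, 100, 100
time_emb = np.random.randn(n_windows, d_time)      # placeholder RP/TS features
freq_emb = np.random.randn(n_windows, d_freq)      # placeholder FS features
labels = np.random.randint(0, 5, size=n_windows)   # 5 sleep stages (W, N1, N2, N3, REM)

# Dual-stream combination: concatenate the two embeddings per window
# (an assumption; see combining_emb.py for the combination used in the repository).
combined = np.concatenate([time_emb, freq_emb], axis=1)

# Simple linear classifier on the frozen, combined features.
clf = LogisticRegression(max_iter=1000).fit(combined, labels)
print("training accuracy:", clf.score(combined, labels))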

Data

The dataset used in this paper is a subset of the Physionet Challenge 2018 EEG dataset, specifically the 994 participants from the training set with available sleep annotations. You can access the dataset at the following link: Physionet Challenge 2018 EEG dataset. Please download the complete training dataset and make sure that the path to the downloaded data is specified in the utils/datasets.py file.
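For example, the data location might be set as a constant near the top of utils/datasets.py; the variable name and path below are hypothetical and should be adapted to the actual file:

# utils/datasets.py (hypothetical variable name; adapt to the actual code)
DATA_PATH = "/path/to/physionet-challenge-2018/training/"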

Dependencies

Python 3.8.6 was used for this project. Install the required Python packages using the following command:

$ pip install -r requirements.txt

Training and testing

For end-to-end training and testing, you can use the main.py script, which combines the embeddings of RP and FS. The RP and FS pretext tasks are located in the pretext_tasks directory. In the combining_emb.py file, these features can be combined and provided as input to the downstream task, as well as used to generate UMAP visualizations. Instead of training from scratch, the stored models can also be used; this is configured in pretext_tasks/RP.py and pretext_tasks/similari_pick.py, respectively. A sketch of the assumed workflow is shown below.
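A minimal sketch of the assumed workflow, based on the script names above (command-line flags and per-script entry points are not documented here, so check the scripts themselves before running):

$ pip install -r requirements.txt   # install dependencies
$ python main.py                    # end-to-end: pre-train RP and FS, combine embeddings, evaluate sleep staging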

If you have any questions or need further assistance, please don't hesitate to reach out via email:

Citation

If you use our data and code, please cite the paper using the following BibTeX reference:

@misc{kazatzidis2023novel,
      title={A novel dual-stream time-frequency contrastive pretext tasks framework for sleep stage classification}, 
      author={Sergio Kazatzidis and Siamak Mehrkanoon},
      year={2023},
      eprint={2312.09623},
      archivePrefix={arXiv},
      primaryClass={eess.SP}
}
