By Seongju Lee, Yeonguk Yu, Seunghyeok Back, Hogeon Seo, and Kyoobin Lee
This repo is the official implementation of "SleePyCo: Automatic Sleep Scoring with Feature Pyramid and Contrastive Learning", accepted to Expert Systems With Applications (I.F. 8.5).
- (2023.03.03) Official repository of SleePyCo is released
  - Script for preprocessing Sleep-EDF
  - Config files for training from scratch
- (2023.11.09) Config files for ablation studies
  - Added TinySleepNet baseline and fixed minor errors
  - Checkpoints can now be downloaded more conveniently!
- (2023.11.18) 🎉Online publication is available!🎉
  - Scripts for preprocessing MASS, Physio2018, SHHS
Trained and evaluated on an NVIDIA GeForce RTX 3090 with Python 3.8.5.
- Set up a Python environment:
  - `conda create -n sleepyco python=3.8.5`
  - `conda activate sleepyco`
- Install a PyTorch version compatible with your environment from the official PyTorch website.
- Install the remaining libraries using the following command:
  - `pip install -r requirements.txt`
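As an optional sanity check (not part of the repository's scripts), the following sketch confirms that PyTorch imports correctly and that CUDA is visible before you start training:

```python
# Optional sanity check (not part of this repository's scripts):
# confirm the PyTorch install and CUDA visibility before training.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available :", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU 0          :", torch.cuda.get_device_name(0))
```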
- Download the Sleep-EDF-201X dataset using the following commands, where `X` is `3` or `8`:
  - `cd ./dset/Sleep-EDF-201X`
  - `python download_sleep-edf-201X.py`
- Check that the directory structure is as follows:

      ./dset/
      └── Sleep-EDF-201X/
          └── edf/
              ├── SC4001E0-PSG.edf
              ├── SC4001EC-Hypnogram.edf
              ├── SC4002E0-PSG.edf
              ├── SC4002EC-Hypnogram.edf
              ├── ...
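If you prefer to verify the download programmatically, a small hypothetical check (not part of this repository) can count the PSG/Hypnogram recordings; the path below assumes the 2013 version and should be adjusted for 2018:

```python
# Hypothetical check (not part of this repository): count the downloaded
# PSG and Hypnogram recordings under the expected directory.
from pathlib import Path

edf_dir = Path("./dset/Sleep-EDF-2013/edf")   # or ./dset/Sleep-EDF-2018/edf
psg = sorted(edf_dir.glob("*-PSG.edf"))
hyp = sorted(edf_dir.glob("*-Hypnogram.edf"))
print(f"{len(psg)} PSG files, {len(hyp)} Hypnogram files")
```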
- Preprocess the `.edf` files into `.npz` files:
  - `python prepare_sleep-edf-201X.py`
- Check that the directory structure is as follows:

      ./dset/
      └── Sleep-EDF-201X/
          ├── edf/
          │   ├── SC4001E0-PSG.edf
          │   ├── SC4001EC-Hypnogram.edf
          │   ├── SC4002E0-PSG.edf
          │   ├── SC4002EC-Hypnogram.edf
          │   ├── ...
          │
          └── npz/
              ├── SC4001E0-PSG.npz
              ├── SC4002E0-PSG.npz
              ├── ...
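To get a feel for what preprocessing produced, the following hypothetical snippet (not part of this repository) opens one of the `.npz` files from the tree above and lists the arrays it contains; the stored key names depend on the preprocessing script, so only the keys are enumerated here:

```python
# Hypothetical inspection snippet (not part of this repository): open one
# preprocessed .npz file and print the arrays it contains.
import numpy as np

npz_path = "./dset/Sleep-EDF-2013/npz/SC4001E0-PSG.npz"  # example file from the tree above
with np.load(npz_path) as data:
    for key in data.files:
        print(key, data[key].shape, data[key].dtype)
```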
- Pretrain with contrastive representation learning using the `pretrain` config:
  - `python train_crl.py --config configs/SleePyCo-Transformer_SL-01_numScales-1_{DATASET_NAME}_pretrain.json --gpu $GPU_IDs`
  - With a single GeForce RTX 3090 GPU, this may require about 22.3 GB of GPU memory.
- Fine-tune from the pretrained weights using the `freezefinetune` config:
  - `python train_mtcl.py --config configs/SleePyCo-Transformer_SL-10_numScales-3_{DATASET_NAME}_freezefinetune.json --gpu $GPU_IDs`
  - With two GeForce RTX 3090 GPUs, this may require about 16.7 GB of GPU memory per GPU.
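Given the memory figures above, you may want to confirm that the target GPUs have enough free memory before launching; the snippet below is only a convenience check and not part of the repository:

```python
# Optional convenience check (not part of this repository): report free and
# total memory on each visible GPU before starting a training run.
import torch

for i in range(torch.cuda.device_count()):
    free_b, total_b = torch.cuda.mem_get_info(i)
    print(f"GPU {i}: {free_b / 1e9:.1f} GB free of {total_b / 1e9:.1f} GB")
```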
- To train from scratch without the contrastive pretraining stage, use the `scratch` config:
  - `python train_mtcl.py --config configs/SleePyCo-Transformer_SL-10_numScales-3_{DATASET_NAME}_scratch.json --gpu $GPU_IDs`
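If you want to queue several of these runs back to back (for example, fine-tuning on more than one dataset), a hypothetical driver script like the one below can loop over config files via `subprocess`; the dataset names are taken from the results table, and the exact filenames under `configs/` should be checked first:

```python
# Hypothetical driver (not part of this repository): sequentially launch the
# fine-tuning command for several datasets. Verify that the config files
# exist under ./configs before running.
import subprocess

datasets = ["Sleep-EDF-2013", "Sleep-EDF-2018"]  # extend as needed
gpu_ids = "0"

for name in datasets:
    config = f"configs/SleePyCo-Transformer_SL-10_numScales-3_{name}_freezefinetune.json"
    subprocess.run(
        ["python", "train_mtcl.py", "--config", config, "--gpu", gpu_ids],
        check=True,
    )
```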
Dataset | Subset | Channel | ACC | MF1 | Kappa | W | N1 | N2 | N3 | REM | Checkpoints |
---|---|---|---|---|---|---|---|---|---|---|---|
Sleep-EDF-2013 | SC | Fpz-Cz | 86.8 | 81.2 | 0.820 | 91.5 | 50.0 | 89.4 | 89.0 | 86.3 | Link |
Sleep-EDF-2018 | SC | Fpz-Cz | 84.6 | 79.0 | 0.787 | 93.5 | 50.4 | 86.5 | 80.5 | 84.2 | Link |
MASS | SS1-SS5 | C4-A1 | 86.8 | 82.5 | 0.811 | 89.2 | 60.1 | 90.4 | 83.8 | 89.1 | Link |
Physio2018 | - | C3-A2 | 80.9 | 78.9 | 0.737 | 84.2 | 59.3 | 85.3 | 79.4 | 86.3 | Link |
SHHS | shhs-1 | C4-A1 | 87.9 | 80.7 | 0.830 | 92.6 | 49.2 | 88.5 | 84.5 | 88.6 | Link |
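In the table, ACC is overall accuracy, MF1 is macro-averaged F1, and Kappa is Cohen's kappa; the W, N1, N2, N3, and REM columns report per-stage scores for the five sleep stages. As an illustrative sketch only (not the repository's own evaluation code), such metrics can be computed from integer stage labels with scikit-learn:

```python
# Illustrative only (not the repository's evaluation code): compute the
# table's summary metrics from integer stage labels 0..4 (W, N1, N2, N3, REM).
from sklearn.metrics import accuracy_score, cohen_kappa_score, f1_score

y_true = [0, 1, 2, 2, 3, 4, 2, 0]   # hypothetical ground-truth stages
y_pred = [0, 2, 2, 2, 3, 4, 1, 0]   # hypothetical predictions

acc = accuracy_score(y_true, y_pred) * 100
mf1 = f1_score(y_true, y_pred, average="macro") * 100
kappa = cohen_kappa_score(y_true, y_pred)
per_class_f1 = f1_score(y_true, y_pred, average=None) * 100

print(f"ACC {acc:.1f}  MF1 {mf1:.1f}  Kappa {kappa:.3f}")
print("Per-class F1 (W, N1, N2, N3, REM):", per_class_f1.round(1))
```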
- Download and extract the checkpoints using `python download_checkpoints.py`.
- To download all checkpoints, run:
  - `cd checkpoints`
  - `python download_checkpoints.py`
- To download checkpoints for selected datasets only, run:
  - `cd checkpoints`
  - `python download_checkpoints.py --datasets 'Sleep-EDF-2013' 'Sleep-EDF-2018'`
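Once downloaded, a checkpoint can be inspected with a short, hypothetical snippet (not part of this repository); the exact file names and layout under `./checkpoints` depend on `download_checkpoints.py`, so adjust the glob pattern as needed:

```python
# Hypothetical inspection snippet (not part of this repository): open a
# downloaded checkpoint and list its top-level keys.
from pathlib import Path
import torch

ckpt_files = sorted(Path("checkpoints").rglob("*.pth"))  # adjust extension if needed
if ckpt_files:
    state = torch.load(ckpt_files[0], map_location="cpu")
    keys = list(state.keys()) if isinstance(state, dict) else [type(state).__name__]
    print(ckpt_files[0], "->", keys[:10])
else:
    print("No .pth files found under ./checkpoints")
```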
- Evaluate on a dataset using the following command:
  - `python test.py --config configs/SleePyCo-Transformer_SL-10_numScales-3_{DATASET_NAME}_freezefinetune.json --gpu $GPU_IDs`
- Seongju Lee [GoogleScholar] [GitHub]
- Yeonguk Yu [GoogleScholar] [GitHub]
- Seunghyeok Back [GoogleScholar] [GitHub]
- Hogeon Seo [GoogleScholar] [GitHub]
- Kyoobin Lee (Corresponding Author) [GoogleScholar]
The source code of this repository is released only for academic use. See the license file for details.
@article{lee2024sleepyco,
title = {SleePyCo: Automatic sleep scoring with feature pyramid and contrastive learning},
journal = {Expert Systems with Applications},
volume = {240},
pages = {122551},
year = {2024},
issn = {0957-4174},
doi = {10.1016/j.eswa.2023.122551},
url = {https://www.sciencedirect.com/science/article/pii/S0957417423030531},
author = {Seongju Lee and Yeonguk Yu and Seunghyeok Back and Hogeon Seo and Kyoobin Lee}
}
This research was supported by a grant from the Institute of Information and Communications Technology Planning and Evaluation (IITP) funded by the Korean government (MSIT) (No. 2020-0-00857, Development of cloud robot intelligence augmentation, sharing and framework technology to integrate and enhance the intelligence of multiple robots). Furthermore, this research was partially supported by the Korea Institute of Energy Technology Evaluation and Planning (KETEP) grant funded by the Korean government (MOTIE) (No. 20202910100030).