
Towards More Accurate Automatic Sleep Staging via Deep Transfer Learning


This repository contains the source code, pretrained models, and experimental setup for the manuscript:

Sleep Transfer Learning

Data Preparation with Matlab:


SeqSleepNet

  • Change path to seqsleepnet/
  • Run preprare_data_sleepedf_sc.m to prepare the SleepEDF-SC data (the path to the data must be provided; refer to the comments in the script). The generated .mat files are stored in the mat/ directory.
  • Run genlist_sleepedf_sc.m to generate the list of SleepEDF-SC files for network training, based on the data split in data_split_sleepedf_sc.mat. The generated files are stored in the tf_data/ directory.
  • Run preprare_data_sleepedf_st.m to prepare the SleepEDF-ST data (the path to the data must be provided; refer to the comments in the script). The generated .mat files are stored in the mat/ directory.
  • Run genlist_sleepedf_st.m to generate the list of SleepEDF-ST files for network training, based on the data split in data_split_sleepedf_st.mat. The generated files are stored in the tf_data/ directory.
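The generated .mat files use the Matlab v7.3 (HDF5) format, so they can also be inspected from Python with h5py (listed under Environment below). A minimal sketch, with the caveat that the actual variable names stored in the files depend on the preparation scripts:

```python
import h5py
import numpy as np

def load_mat_v73(path, keys):
    """Load selected variables from a Matlab v7.3 (HDF5) .mat file.

    Note: Matlab is column-major, so arrays may appear transposed
    relative to their Matlab shape and can be fixed with .T if needed.
    """
    out = {}
    with h5py.File(path, "r") as f:
        for k in keys:
            out[k] = np.array(f[k])  # read the dataset into memory
    return out
```

This is only a convenience for inspection; the training pipeline itself reads the files via the lists generated in tf_data/.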

DeepSleepNet (likewise)

Network training and evaluation with Tensorflow:


SeqSleepNet

  • Change path to seqsleepnet/tensorflow/seqsleepnet/

  • Run the example bash scripts:

    • finetune_all.sh: finetune an entire pretrained network
    • finetune_softmax_SPB.sh: finetune softmax + sequence processing block (SPB)
    • finetune_softmax_EPB.sh: finetune softmax + epoch processing block (EPB)
    • finetune_softmax.sh: finetune softmax
    • train_scratch.sh: train a network from scratch

Note: when the --pretrained_model parameter is empty, the network is trained from scratch. Otherwise, the specified pretrained model is loaded and finetuned with the strategy given by the --finetune_mode parameter.
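The dispatch described in the note can be sketched as follows. This is an illustration only: the flag names --pretrained_model and --finetune_mode come from the scripts above, but the mode names and return values here are hypothetical, not the repository's actual code.

```python
import argparse

def resolve_training_setup(argv):
    """Illustrate how --pretrained_model / --finetune_mode pick the regime."""
    p = argparse.ArgumentParser()
    p.add_argument("--pretrained_model", default="")
    # hypothetical mode names, mirroring the finetune_*.sh scripts
    p.add_argument("--finetune_mode", default="all",
                   choices=["all", "softmax", "softmax_spb", "softmax_epb"])
    args = p.parse_args(argv)
    if args.pretrained_model == "":
        return "scratch"                      # no checkpoint: train from scratch
    return "finetune_" + args.finetune_mode  # load checkpoint, finetune subset
```

For example, `resolve_training_setup([])` yields the from-scratch regime, while passing a checkpoint path plus `--finetune_mode softmax` selects softmax-only finetuning.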

DeepSleepNet (likewise)

Note: the DeepSleepNet pretrained models are large. They were uploaded separately and can be downloaded from https://zenodo.org/record/3375235

Evaluation

After training/finetuning and testing the network on test data:

  • Change path to seqsleepnet/ or deepsleepnet/
  • Refer to examples_evaluation.m for examples that calculate the performance metrics.
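The provided evaluation is in Matlab, but equivalent summary metrics can be computed in Python with sklearn (listed under Environment). A sketch of commonly reported sleep-staging metrics, assuming integer-coded stage labels; the metrics reported in the manuscript come from examples_evaluation.m:

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score, f1_score

def sleep_staging_metrics(y_true, y_pred):
    """Overall accuracy, Cohen's kappa, and macro F1 across sleep stages."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "kappa": cohen_kappa_score(y_true, y_pred),
        "macro_f1": f1_score(y_true, y_pred, average="macro"),
    }
```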

Some results:


  • Finetuning results with SeqSleepNet: see the seqsleepnet_results figure in the repository.

  • Finetuning results with DeepSleepNet: see the deepsleepnet_results figure in the repository.

Environment:


  • Matlab v7.3 (for data preparation)
  • Python3
  • Tensorflow GPU versions 1.4 - 1.14 (for network training and evaluation)
  • numpy
  • scipy
  • sklearn
  • h5py

Note on the SleepEDF Expanded Database:

The SleepEDF Expanded database can be downloaded from https://physionet.org/content/sleep-edfx/1.0.0/. The latest version of this database contains 153 subjects in the SC subset. This experiment was intentionally conducted with the previous version of the SC subset, which contains 20 subjects, to simulate a small-cohort situation. If you download the new version, make sure to use the 20 subjects SC400-SC419.

On the ST subset of the database, the experiments were conducted with 22 placebo recordings. Make sure that you refer to https://physionet.org/content/sleep-edfx/1.0.0/ST-subjects.xls to obtain the right recordings and subjects.

The experiments used only the in-bed parts (from lights-off time to lights-on time) of the recordings, to avoid dominance of the Wake stage, as suggested in:

  • S. A. Imtiaz and E. Rodriguez-Villegas, An open-source toolbox for standardized use of PhysioNet Sleep EDF Expanded Database. Proc. EMBC, pp. 6014-6017, 2015.

Meta information (e.g. the lights-off and lights-on times needed to extract the in-bed parts from the whole day-night recordings) is provided in sleepedfx_meta.
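A minimal sketch of the in-bed extraction, assuming the meta information gives lights-off/lights-on as offsets in seconds from the start of the recording (the actual field layout of sleepedfx_meta may differ):

```python
import numpy as np

def in_bed_slice(signal, fs, lights_off_sec, lights_on_sec):
    """Keep only the in-bed part of a whole day-night recording.

    signal: 1-D array sampled at fs Hz; lights_off_sec / lights_on_sec
    are offsets in seconds from the start of the recording.
    """
    start = int(lights_off_sec * fs)
    stop = int(lights_on_sec * fs)
    return signal[start:stop]
```

The corresponding 30-second hypnogram epochs would be sliced the same way, dividing the second offsets by the epoch length.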

Contact:

Huy Phan
School of Electronic Engineering and Computer Science
Queen Mary University of London
Email: h.phan{at}qmul.ac.uk

License

MIT © Huy Phan
