Making Sense of Sleep: Multimodal Sleep Stage Classification in a Large, Diverse Population Using Movement and Cardiac Sensing
Bing Zhai, Ignacio Perez-Pozuelo, Emma A.D. Clifton, Joao Palotti, Yu Guan

UbiComp ’20: The ACM International Joint Conference on Pervasive and Ubiquitous Computing 2020
Session: Health and Well-being I

Abstract
Traditionally, sleep monitoring has been performed in hospital or clinic environments, requiring complex and expensive equipment set-up and expert scoring. Wearable devices increasingly provide a viable alternative for sleep monitoring, able to collect movement and heart rate (HR) data. In this work, we present a set of algorithms for sleep-wake and sleep-stage classification based upon actigraphy and cardiac sensing amongst 1,743 participants. We devise movement and cardiac features that can be extracted from research-grade wearable sensors, derive models, and evaluate their performance on the largest open-access dataset for human sleep science. Our results demonstrated that neural network models outperform traditional machine learning methods and heuristic models for both sleep-wake and sleep-stage classification. Convolutional neural networks (CNNs) and long short-term memory (LSTM) networks were the best performers for sleep-wake and sleep-stage classification, respectively. Using SHAP (SHapley Additive exPlanation) with Random Forest, we identified that frequency features from cardiac sensors are critical to sleep-stage classification. Finally, we introduced an ensemble-based approach to sleep-stage classification, which outperformed all other baselines, achieving an accuracy of 78.2% and an $F_1$ score of 69.8% on the three-stage classification task. Together, this work represents the first systematic multimodal evaluation of sleep-wake and sleep-stage classification in a large, diverse population. Alongside the presentation of an accurate sleep-stage classification approach, the results highlight multimodal wearable sensing as a scalable method for accurate sleep classification, providing guidance on optimal algorithm deployment for automated sleep assessment. The code used in this study can be found online at: https://github.com/bzhai/multimodal_sleep_stage_benchmark.git
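
The abstract mentions using SHAP with a Random Forest to identify which features matter for sleep-stage classification. The snippet below is a minimal sketch of that kind of analysis, not the authors' pipeline from the repository above: it trains a Random Forest on synthetic stand-in data (the feature matrix, feature names, and three-class label encoding are assumptions for illustration) and ranks features by mean absolute SHAP value.

```python
# Minimal sketch of SHAP-based feature ranking with a Random Forest.
# The data here are synthetic stand-ins, not the study's movement/HR features.
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 12))        # stand-in for per-epoch movement + cardiac features
y = rng.integers(0, 3, size=2000)      # assumed encoding: 0 = wake, 1 = NREM, 2 = REM
feature_names = [f"feat_{i}" for i in range(X.shape[1])]

rf = RandomForestClassifier(n_estimators=500, n_jobs=-1, random_state=0)
rf.fit(X, y)

# TreeExplainer computes SHAP values efficiently for tree ensembles
explainer = shap.TreeExplainer(rf)
shap_values = explainer.shap_values(X)

# Depending on the shap version, multi-class output is either a list of
# per-class arrays or a single 3-D array; normalize to [classes, samples, features]
if isinstance(shap_values, list):
    shap_values = np.stack(shap_values)
else:
    shap_values = np.moveaxis(shap_values, -1, 0)

# Rank features by mean |SHAP| across classes and samples
importance = np.abs(shap_values).mean(axis=(0, 1))
for idx in np.argsort(importance)[::-1][:10]:
    print(f"{feature_names[idx]}: {importance[idx]:.4f}")
```

In the paper's setting, the feature matrix would hold hand-crafted actigraphy and HR features per scored epoch, and a ranking like this is one way the importance of cardiac frequency features could be surfaced.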

DOI: https://doi.org/10.1145/3397325
Web: https://ubicomp.org/ubicomp2020/

Remote Presentations for ACM International Joint Conference on Pervasive and Ubiquitous Computing 2020 (UbiComp ’20)
