Packets-to-Prediction: An Unobtrusive Mechanism for Identifying Coarse-Grained Sleep Patterns with WiFi MAC Layer Traffic

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 23, No. 14, p. 6631
Main Authors: Jaisinghani, Dheryta; Phutela, Nishtha
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 01.07.2023
ISSN: 1424-8220
DOI: 10.3390/s23146631

Summary: A good night’s sleep is of the utmost importance for the seamless execution of our cognitive capabilities. Unfortunately, research shows that one-third of the US adult population is severely sleep deprived. With college students as our focus group, we devised a contactless, unobtrusive mechanism to detect sleep patterns which, contrary to existing sensor-based solutions, does not require the subject to wear any sensors on the body or buy expensive sleep-sensing equipment. We named this mechanism Packets-to-Predictions (P2P) because we leverage WiFi MAC layer traffic collected in home and university environments to predict “sleep” and “awake” periods. We first manually established that extracting such patterns is feasible, and then we trained various machine learning models to identify these patterns automatically. We trained six machine learning models: K-nearest neighbors, logistic regression, random forest classifier, support vector classifier, gradient boosting classifier, and multilayer perceptron. K-nearest neighbors gave the best performance, with 87% train accuracy and 83% test accuracy.
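The model comparison described in the summary can be sketched as follows. This is a minimal illustration only: the synthetic per-window features (packet count and mean inter-arrival gap) are hypothetical stand-ins, since the paper's actual WiFi MAC layer features and dataset are not reproduced here, and the six scikit-learn classifiers are used with default settings rather than the authors' tuned configurations.

```python
# Hedged sketch: compare the six classifiers named in the abstract on
# synthetic stand-in features. The feature choices (packets per window,
# mean inter-arrival gap) are assumptions for illustration, not the
# paper's actual feature set.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
half = 300
# Hypothetical "awake" windows: many packets, short inter-arrival gaps.
awake = np.column_stack([rng.normal(300, 60, half), rng.normal(0.05, 0.01, half)])
# Hypothetical "sleep" windows: few packets, long inter-arrival gaps.
sleep = np.column_stack([rng.normal(40, 15, half), rng.normal(0.8, 0.2, half)])
X = np.vstack([awake, sleep])
y = np.array([1] * half + [0] * half)  # 1 = awake, 0 = sleep

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

models = {
    "knn": KNeighborsClassifier(n_neighbors=5),
    "logreg": LogisticRegression(max_iter=1000),
    "rf": RandomForestClassifier(random_state=0),
    "svc": SVC(),
    "gbc": GradientBoostingClassifier(random_state=0),
    "mlp": MLPClassifier(max_iter=1000, random_state=0),
}
# Fit each model and record its held-out test accuracy.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {acc:.2f}")
```

On real traffic the classes overlap far more than in this toy data, which is why the reported accuracies (87% train, 83% test) are well below the near-perfect separation a synthetic example produces.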