SHARP: Environment and Person Independent Activity Recognition with Commodity IEEE 802.11 Access Points

Bibliographic Details
Published in: arXiv.org
Main Authors: Meneghello, Francesca; Garlisi, Domenico; Dal Fabbro, Nicolò; Tinnirello, Ilenia; Rossi, Michele
Format: Paper; Journal Article
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 06.11.2022
ISSN: 2331-8422
DOI: 10.48550/arxiv.2103.09924

More Information
Summary: In this article we present SHARP, an original approach for performing human activity recognition (HAR) with commercial IEEE 802.11 (Wi-Fi) devices. SHARP makes it possible to distinguish the activities of different persons, across different time spans and environments. To achieve this, we devise a new technique to clean and process the channel frequency response (CFR) phase of the Wi-Fi channel, obtaining an estimate of the Doppler shift at a radio monitor device. The Doppler shift reveals the presence of moving scatterers in the environment, while not being affected by (environment-specific) static objects. SHARP is trained on data collected as a person performs seven different activities in a single environment. It is then tested on different setups, to assess its performance as the person, the day and/or the environment change with respect to those considered at training time. In the worst-case scenario, it reaches an average accuracy higher than 95%, validating the effectiveness of the extracted Doppler information, used in conjunction with a neural-network-based learning algorithm, in recognizing human activities in a subject- and environment-independent way. The collected CFR dataset and the code are publicly available for replicability and benchmarking purposes.
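To make the pipeline described in the summary more concrete, the following is a minimal, illustrative sketch (in Python/NumPy) of the generic idea of sanitizing the CFR phase of consecutive Wi-Fi packets and then extracting a Doppler power profile along the time axis. It is not the authors' released code: the array shapes, the 400 packets/s rate, the linear-phase detrending step and the synthetic input are assumptions made only for this example, and the Doppler features would then feed a neural-network classifier as the summary describes.

import numpy as np


def sanitize_cfr_phase(cfr):
    """Remove the per-packet linear phase trend across subcarriers.

    cfr: complex array of shape (n_packets, n_subcarriers). Commodity Wi-Fi
    NICs add random timing/frequency offsets to every packet, which show up
    as a constant plus a linear phase slope across subcarriers; that trend
    is fitted and subtracted packet by packet, keeping the amplitudes.
    """
    n_packets, n_sub = cfr.shape
    k = np.arange(n_sub)
    phase = np.unwrap(np.angle(cfr), axis=1)
    sanitized = np.empty_like(cfr)
    for p in range(n_packets):
        slope, intercept = np.polyfit(k, phase[p], 1)  # linear fit of the phase
        clean = phase[p] - (slope * k + intercept)
        sanitized[p] = np.abs(cfr[p]) * np.exp(1j * clean)
    return sanitized


def doppler_spectrum(cfr, packet_rate_hz, win=256, hop=64):
    """Short-time FFT along the packet (time) axis, averaged over subcarriers.

    The static (zero-Doppler) component is removed per window, so only moving
    scatterers contribute; the result is a coarse Doppler power profile.
    """
    n_packets, _ = cfr.shape
    window = np.hanning(win)[:, None]
    frames = []
    for start in range(0, n_packets - win + 1, hop):
        seg = cfr[start:start + win]
        seg = seg - seg.mean(axis=0, keepdims=True)        # drop static paths
        spec = np.fft.fftshift(np.fft.fft(seg * window, axis=0), axes=0)
        frames.append(np.mean(np.abs(spec) ** 2, axis=1))  # average subcarriers
    freqs = np.fft.fftshift(np.fft.fftfreq(win, d=1.0 / packet_rate_hz))
    return freqs, np.array(frames)


if __name__ == "__main__":
    # Synthetic stand-in for real CFR captures: 2000 packets x 64 subcarriers,
    # one moving scatterer inducing a ~20 Hz Doppler component, plus a random
    # phase offset on every packet (the nuisance the sanitization removes).
    rng = np.random.default_rng(0)
    n_packets, n_sub, rate = 2000, 64, 400.0  # 400 packets/s is an assumption
    t = np.arange(n_packets) / rate
    gain = 1.0 + 0.3 * np.exp(2j * np.pi * 20.0 * t)             # static + moving path
    offsets = np.exp(2j * np.pi * rng.uniform(size=n_packets))   # per-packet phase noise
    cfr = (gain * offsets)[:, None] * np.ones((1, n_sub))
    cfr += 0.05 * (rng.standard_normal(cfr.shape) + 1j * rng.standard_normal(cfr.shape))

    freqs, dop = doppler_spectrum(sanitize_cfr_phase(cfr), rate)
    peak = freqs[np.argmax(dop.mean(axis=0))]
    print(f"Dominant Doppler component: {abs(peak):.1f} Hz")  # expect ~20 Hz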