SHARP: Environment and Person Independent Activity Recognition with Commodity IEEE 802.11 Access Points
| Published in | arXiv.org |
|---|---|
| Main Authors | Meneghello, Francesca; Garlisi, Domenico; Dal Fabbro, Nicolò; Tinnirello, Ilenia; Rossi, Michele |
| Format | Paper; Journal Article |
| Language | English |
| Published | Ithaca: Cornell University Library, arXiv.org, 06.11.2022 |
| Subjects | Algorithms; Computer Science - Learning; Computer Science - Networking and Internet Architecture; Doppler effect; Frequency response; Human activity recognition; Machine learning; Moving object recognition; Neural networks; Static objects |
| Online Access | https://arxiv.org/abs/2103.09924 |
| ISSN | 2331-8422 |
| DOI | 10.48550/arxiv.2103.09924 |
| Abstract | In this article we present SHARP, an original approach for obtaining human activity recognition (HAR) through the use of commercial IEEE 802.11 (Wi-Fi) devices. SHARP grants the possibility to discern the activities of different persons, across different time-spans and environments. To achieve this, we devise a new technique to clean and process the channel frequency response (CFR) phase of the Wi-Fi channel, obtaining an estimate of the Doppler shift at a radio monitor device. The Doppler shift reveals the presence of moving scatterers in the environment, while not being affected by (environment-specific) static objects. SHARP is trained on data collected as a person performs seven different activities in a single environment. It is then tested on different setups, to assess its performance as the person, the day and/or the environment change with respect to those considered at training time. In the worst-case scenario, it reaches an average accuracy higher than 95%, validating the effectiveness of the extracted Doppler information, used in conjunction with a learning algorithm based on a neural network, in recognizing human activities in a subject and environment independent way. The collected CFR dataset and the code are publicly available for replicability and benchmarking purposes. |
|---|---|
| Journal Reference | IEEE Transactions on Mobile Computing (2022) |
| Author | Meneghello, Francesca; Garlisi, Domenico; Dal Fabbro, Nicolò; Tinnirello, Ilenia; Rossi, Michele |
| BackLink | View paper in arXiv: https://doi.org/10.48550/arXiv.2103.09924; View published paper (access to full text may be restricted): https://doi.org/10.1109/TMC.2022.3185681 |
| ContentType | Paper; Journal Article |
| Copyright | 2022. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. http://creativecommons.org/licenses/by/4.0 |
| DOI | 10.48550/arxiv.2103.09924 |
| Discipline | Physics |
| EISSN | 2331-8422 |
| Genre | Working Paper/Pre-Print |
| IsDoiOpenAccess | true |
| IsOpenAccess | true |
| IsPeerReviewed | false |
| IsScholarly | false |
| Language | English |
| OpenAccessLink | https://arxiv.org/abs/2103.09924 |
| PublicationDate | 2022-11-06 |
| PublicationPlace | Ithaca |
| PublicationTitle | arXiv.org |
| PublicationYear | 2022 |
| Publisher | Cornell University Library, arXiv.org |
| SecondaryResourceType | preprint |
| SourceID | arxiv proquest |
| SourceType | Open Access Repository Aggregation Database |
| SubjectTerms | Algorithms; Computer Science - Learning; Computer Science - Networking and Internet Architecture; Doppler effect; Frequency response; Human activity recognition; Machine learning; Moving object recognition; Neural networks; Static objects |
| Title | SHARP: Environment and Person Independent Activity Recognition with Commodity IEEE 802.11 Access Points |
| URI | https://www.proquest.com/docview/2503044112 https://arxiv.org/abs/2103.09924 |
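The abstract describes cleaning the CFR phase of received Wi-Fi packets, estimating the Doppler shift caused by moving scatterers, and feeding that Doppler information to a neural-network classifier. As a rough illustration of the general CFR-to-Doppler idea only (a minimal sketch, not SHARP's actual pipeline: the function name, parameters, reference-subcarrier phase normalization, and FFT-based spectrogram below are assumptions for the example), a NumPy version might look like:

```python
import numpy as np


def doppler_spectrogram(cfr, packet_rate_hz, win_len=256, hop=64):
    """Rough Doppler-spectrogram estimate from a Wi-Fi CFR time series.

    cfr            : complex array, shape (num_packets, num_subcarriers),
                     one CFR estimate per received packet.
    packet_rate_hz : CFR sampling (packet) rate in Hz.

    Simplifications vs. the paper: per-packet CFO/SFO phase offsets are
    removed by normalizing against a single reference subcarrier, and
    static (zero-Doppler) reflections are suppressed by subtracting the
    mean over time. SHARP's phase-sanitization procedure is more elaborate;
    see the paper and the authors' released code.
    """
    ref = cfr[:, cfr.shape[1] // 2]                      # reference subcarrier
    cfr = cfr * np.exp(-1j * np.angle(ref))[:, None]     # crude phase-offset removal
    cfr = cfr - cfr.mean(axis=0, keepdims=True)          # drop the static component

    frames = []
    window = np.hanning(win_len)[:, None]
    for start in range(0, cfr.shape[0] - win_len + 1, hop):
        seg = cfr[start:start + win_len] * window
        dop = np.fft.fftshift(np.fft.fft(seg, axis=0), axes=0)  # FFT over time -> Doppler
        frames.append((np.abs(dop) ** 2).sum(axis=1))           # aggregate subcarriers
    doppler_hz = np.fft.fftshift(np.fft.fftfreq(win_len, d=1.0 / packet_rate_hz))
    return doppler_hz, np.asarray(frames)                # Doppler bins (Hz), power per frame


# Example on synthetic data: 2000 packets at 100 Hz, 64 subcarriers.
rng = np.random.default_rng(0)
cfr = rng.standard_normal((2000, 64)) + 1j * rng.standard_normal((2000, 64))
bins, spec = doppler_spectrogram(cfr, packet_rate_hz=100.0)
print(bins.shape, spec.shape)  # (256,) and (num_frames, 256)
```

A classifier would then be trained on such Doppler spectrograms computed while a person performs each of the seven activities; for the actual processing and network architecture, refer to the publicly released dataset and code mentioned in the abstract.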