Evaluating Social Touch Gesture Recognition with a Skin-Like Soft Sensor


Bibliographic Details
Published in: 2025 20th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 1700-1704
Main Authors: Umesh, Tejas; Shetty, Yatiraj; Seifi, Hasti
Format: Conference Proceeding
Language: English
Published: IEEE, 04.03.2025
DOI: 10.1109/HRI61500.2025.10974131


Summary: Socially assistive robots (SARs) can act as caregivers and use touch gestures to interact with people such as children with autism. However, designing a touch perception system that can reliably detect these gestures remains a challenge. To address this gap, we replicated a Do-It-Yourself (DIY) skin-like sensor and evaluated its potential to identify eight social touch gestures: Fistbump, Hitting, Holding, Poking, Squeezing, Stroking, Tapping, and Tickling. Our sensor design builds on a recent silicone-based sensor to collect spatiotemporal gesture data and a load cell to capture force information. We collected touch gestures from 20 adults in a user study. Then, we built a touch perception algorithm with a Convolutional Neural Network-Long Short-Term Memory (CNN-LSTM) model, achieving 94% gesture classification accuracy with ten-fold cross-validation and 69% accuracy with subject-dependent splitting on the combined touch and force data.
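The pipeline summarized above pairs spatiotemporal data from the skin-like sensor with a scalar force reading from the load cell before classification. A minimal sketch of how such gesture samples might be fused into a single model-ready array for a CNN-LSTM-style classifier (the grid size, sequence length, function names, and preprocessing are illustrative assumptions, not the authors' actual implementation):

```python
import numpy as np

# The eight social touch gestures evaluated in the paper.
GESTURES = ["Fistbump", "Hitting", "Holding", "Poking",
            "Squeezing", "Stroking", "Tapping", "Tickling"]

def assemble_sample(pressure_frames, force_readings):
    """Fuse spatiotemporal pressure frames of shape (T, H, W) with a
    per-timestep force signal of shape (T,) into one (T, H*W + 1)
    array, i.e. each time step carries a flattened sensor frame plus
    the load-cell force value. (Hypothetical preprocessing step.)"""
    T, H, W = pressure_frames.shape
    spatial = pressure_frames.reshape(T, H * W)      # flatten each frame
    force = force_readings.reshape(T, 1)             # align force per frame
    return np.concatenate([spatial, force], axis=1)  # (T, H*W + 1)

# Hypothetical sample: 50 time steps from a 4x4 taxel grid plus force.
frames = np.random.rand(50, 4, 4)
force = np.random.rand(50)
x = assemble_sample(frames, force)
print(x.shape)  # (50, 17)
```

In a CNN-LSTM setup, convolutional layers would typically extract spatial features from each frame before the LSTM models the temporal sequence; the flattened fusion above simply illustrates how the two sensing modalities could be aligned per time step.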