Multi-photon neuron embedded bionic skin for high-precision complex texture and object reconstruction perception research

Bibliographic Details
Published in: Opto-Electronic Advances, Vol. 8, No. 2, p. 240152
Main Authors: Zhou, Hongyu; Zhang, Chao; Nong, Hengchang; Weng, Junjie; Wang, Dongying; Yu, Yang; Zhang, Jianfa; Zhang, Chaofan; Yu, Jinran; Zhang, Zhaojian; Chen, Huan; Zhang, Zhenrong; Yang, Junbo
Format: Journal Article
Language: English
Published: Institute of Optics and Electronics, Chinese Academy of Sciences, 2025
ISSN: 2096-4579
DOI: 10.29026/oea.2025.240152

Summary: Owing to the complex distribution of tactile vesicles under the skin and the brain's ability to process specific tactile parameters (shape, hardness, and surface texture), human skin has the capacity for tactile spatial reconstruction and visualization of complex object geometry and surface texture. However, current haptic sensor technologies are predominantly point sensors, which lack an interlaced distribution structure similar to that of tactile vesicles, limiting their potential in human-computer interaction applications. Here, we report an optical microfiber array skin (OMAS) that imitates the interlaced structure of tactile vesicles for tactile visualization and object-reconstruction sensing. The device is characterized by high sensitivity (−0.83 N/V) and a fast response time (38 ms). We demonstrate that combining the signals collected by the OMAS with appropriate artificial intelligence algorithms enables the recognition of objects of different hardness and shape with 100% accuracy, the classification of fabrics with different surface textures with 98.5% accuracy, and the classification of Braille patterns with 99% accuracy. As a proof of concept, we integrated the OMAS into a robot arm to select a mahjong tile from among six common objects and successfully recognize its suit by touch, providing a new solution for tactile sensory processing in human-computer interaction.
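The record gives no implementation details for the "appropriate artificial intelligence algorithms" mentioned in the summary. Purely as an illustrative sketch (not from the paper), the snippet below shows how multi-channel tactile readouts of the kind an optical microfiber array might produce could be fed to a generic machine-learning classifier; the feature layout, channel count, and data are all hypothetical.

```python
# Hypothetical sketch: classifying multi-channel tactile signals with a
# generic ML classifier. Not the authors' method; all data are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Assume each touch sample is summarized as a fixed-length feature vector
# built from the array's channel readouts (e.g. peak response per channel).
n_samples, n_channels, n_classes = 600, 16, 6
X = rng.normal(size=(n_samples, n_channels))
y = rng.integers(0, n_classes, size=n_samples)
# Inject a class-dependent offset so the toy data set is learnable.
X += y[:, None] * 0.5

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("toy accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```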