Improving robustness of robotic grasping by fusing multi-sensor

Bibliographic Details
Published in: 2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 126-131
Main Authors: Jun Zhang, Caixia Song, Ying Hu, Bin Yu
Format: Conference Proceeding
Language: English
Published: IEEE, 01.09.2012
ISBN: 1467325104, 9781467325103
DOI: 10.1109/MFI.2012.6343002

More Information
Summary: Because the visual system is susceptible to changes in lighting conditions and surroundings, object localization in a visual-servo-based robotic grasping system is often inaccurate, leading to a low grasping success rate and poor robustness of the whole system. To address this problem, in this paper we propose a method that fuses a binocular camera with monocular vision, IR sensors, tactile sensors, and encoders to build a reliable and robust grasping system with real-time feedback. To avoid the robot grasping nothing, we use binocular vision supplemented by the monocular camera and IR sensors to locate the object accurately. By analyzing the contact model and the pressure between the gripper and the object, a durable, non-slip rubber coating is designed to increase the fingertips' friction. Furthermore, a Fuzzy Neural Network (FNN) is applied to fuse the information from the multiple sensors in our robot system. By continuously monitoring force and position information during grasping, the system reduces slippage and crushing of the object and greatly improves grasping stability. Experimental results show the effectiveness of our system.
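The abstract does not give the fusion or slip-detection details, so the following is only a minimal illustrative sketch of the two ideas it mentions: weighting a vision-based distance estimate against an IR range reading with simple fuzzy memberships, and flagging slip from a drop in tactile force. All function names, sensor values, and thresholds are hypothetical and are not taken from the paper's FNN implementation.

```python
# Illustrative sketch only; not the paper's actual fusion method.

def triangular(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuse_distance(stereo_d, ir_d, image_brightness, ir_range_max=0.8):
    """Fuzzy-weighted average of a stereo-vision distance and an IR range (meters)."""
    # Vision is trusted more under moderate lighting (brightness normalized to [0, 1]).
    w_vision = triangular(image_brightness, 0.2, 0.5, 0.9)
    # IR is trusted more at short range, inside its assumed working distance.
    w_ir = triangular(ir_d, 0.0, 0.1, ir_range_max)
    if w_vision + w_ir == 0.0:
        return (stereo_d + ir_d) / 2.0  # fall back to a plain average
    return (w_vision * stereo_d + w_ir * ir_d) / (w_vision + w_ir)

def slip_detected(force_history, drop_ratio=0.2):
    """Flag slip when the gripping force drops sharply between two samples."""
    if len(force_history) < 2:
        return False
    prev, curr = force_history[-2], force_history[-1]
    return prev > 0.0 and (prev - curr) / prev > drop_ratio

if __name__ == "__main__":
    d = fuse_distance(stereo_d=0.32, ir_d=0.30, image_brightness=0.4)
    print(f"fused object distance: {d:.3f} m")
    print("slip detected:", slip_detected([5.0, 3.6]))
```

In the paper itself, the fusion of vision, IR, tactile, and encoder data is performed by a Fuzzy Neural Network rather than by the fixed hand-written memberships shown above; the sketch is only meant to show how fuzzy confidence weights and a force-monitoring check can be combined in a grasping loop.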