Multifeature video modularized arm movement algorithm evaluation and simulation

Bibliographic Details
Published in: Neural Computing & Applications, Vol. 35, No. 12, pp. 8637–8646
Main Author: Zhao, Xiaofang
Format: Journal Article
Language: English
Published: London: Springer London, 01.04.2023 (Springer Nature B.V.)
ISSN: 0941-0643, 1433-3058
DOI: 10.1007/s00521-022-08060-0

Summary: With the rapid development of artificial intelligence applications, the practical value of robotic arms is becoming increasingly important. Traditional robotic arms can only grasp objects along a preplanned route and have difficulty obtaining information about their surroundings; if the environment is unknown or changes, the robotic arm must be redesigned, and grasping otherwise becomes difficult. To ensure the coordination ability of a robotic arm's automatic control system and to enable the robot to recognize its surroundings independently, robotic arm control systems based on multifeature video have gradually become popular; they also help address the problem of independent grasping under unknown conditions. In this study, a multifeature video-based modular robotic arm motion device was built, and its performance was verified by experiments. The experimental results show that the relative error between the multifeature video vision system and a laser rangefinder ranges from 1.16% at minimum to 3.12% at maximum. The grasping success rate reached 88.9%, and the robotic arm motion device met the expected requirements.
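
As context for the error figures quoted in the summary, relative error against a ground-truth reference is conventionally computed as |d_measured − d_reference| / d_reference. The abstract does not state the exact formula or measurements used in the paper, so the Python sketch below only illustrates that conventional definition with hypothetical distance values.

```python
# Minimal sketch of the conventional relative-error definition.
# The distance values below are hypothetical, not taken from the paper.

def relative_error(measured: float, reference: float) -> float:
    """Relative error of a measurement against a ground-truth reference."""
    return abs(measured - reference) / reference

# Hypothetical vision-system estimates vs. laser-rangefinder readings (metres).
vision_estimates = [0.512, 0.987, 1.503]
laser_readings = [0.506, 1.012, 1.549]

for v, r in zip(vision_estimates, laser_readings):
    print(f"vision={v:.3f} m  laser={r:.3f} m  rel. error={relative_error(v, r):.2%}")
```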