Enhancing Robot Guided Navigation Through Visual Cues

Bibliographic Details
Published in: International Conference on Computing, Communication, and Networking Technologies (Online), pp. 1 - 5
Main Authors: Sharma, Mukesh Kumar; Sharma, Neeraj; Reddy, M. Sudhakara
Format: Conference Proceeding
Language: English
Published: IEEE, 24.06.2024
ISSN: 2473-7674
DOI: 10.1109/ICCCNT61001.2024.10724976

Summary: Robot-guided navigation has been successful in well-mapped environments, but these systems struggle in dynamic, obstacle-cluttered settings. In addition, autonomous robot navigation faces challenges in acquiring richer contextual and semantic information about the environment. This problem can be addressed by integrating visual cues into robot navigation algorithms, giving the robot more information about its immediate surroundings. Such cues can be gathered from a variety of sources, including LiDAR, RGB-D sensors, and camera-based algorithms. Moreover, this information can be further enriched and contextualized by deep learning algorithms. Once applied, these algorithms can improve the robot's understanding of its environment by providing more information about the objects and obstacles it encounters. This additional contextual information also enables more sophisticated navigation strategies, such as building a high-level map of the robot's environment or determining the semantic meaning of detected objects. These richer strategies allow the robot to plan its routes more accurately and avoid unexpected obstacles, increasing its autonomy. Integrating visual cues into robot navigation algorithms therefore gives the robot more information about its surroundings, allowing it to plan its actions better and avoid obstacles.
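
The abstract only sketches the pipeline at a high level. The minimal Python sketch below is not the authors' implementation; it merely illustrates the general idea under simplifying assumptions: hypothetical labelled detections (standing in for an RGB-D/LiDAR perception stack) are fused into a semantic occupancy grid, and a simple A* planner routes around the occupied cells. The names integrate_detections, BLOCKING, and astar, and the grid parameters, are illustrative assumptions.

# A minimal sketch (not the paper's method) of fusing semantic visual cues
# into an occupancy grid that a planner can use to avoid obstacles.
import heapq
import math

# Semantic classes treated as obstacles; purely illustrative.
BLOCKING = {"person", "chair", "box"}

def integrate_detections(grid, detections, cell_size=0.25):
    """Mark grid cells occupied from (label, x, y) detections in metres.

    `detections` stands in for the output of a camera/RGB-D pipeline;
    a real system would also handle uncertainty and decay over time.
    """
    for label, x, y in detections:
        if label in BLOCKING:
            col = int(x / cell_size)
            row = int(y / cell_size)
            if 0 <= row < len(grid) and 0 <= col < len(grid[0]):
                grid[row][col] = 1  # occupied

def astar(grid, start, goal):
    """4-connected A* over the occupancy grid; returns a list of cells."""
    rows, cols = len(grid), len(grid[0])
    open_set = [(0, start)]
    came_from, g = {}, {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                cost = g[cur] + 1
                if cost < g.get((nr, nc), math.inf):
                    g[(nr, nc)] = cost
                    came_from[(nr, nc)] = cur
                    h = abs(nr - goal[0]) + abs(nc - goal[1])
                    heapq.heappush(open_set, (cost + h, (nr, nc)))
    return []  # no path found

if __name__ == "__main__":
    grid = [[0] * 8 for _ in range(8)]
    # Pretend the perception stack reported a person blocking the corridor.
    integrate_detections(grid, [("person", 1.0, 0.5), ("person", 1.0, 0.75)])
    print(astar(grid, (0, 0), (7, 7)))

In a full system, the deep-learning stage described in the abstract would supply the labelled detections, and the semantic labels could further weight the planner's costs rather than simply blocking cells.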