Encompass obstacle image detection method based on U-V disparity map and RANSAC algorithm
| Published in | Scientific Reports Vol. 15; no. 1; Article 6164 (18 pp.) |
|---|---|
| Main Author | |
| Format | Journal Article |
| Language | English |
| Published | London: Nature Publishing Group UK, 20.02.2025 |
| Subjects | |
| ISSN | 2045-2322 |
| DOI | 10.1038/s41598-025-89785-5 |
| Summary: | With the rapid development of autonomous driving technology, obstacle image detection has become a problem that autonomous vehicles must solve: detection accuracy directly affects their safety and reliability. Current methods are often sensitive to lighting and weather conditions. To address these problems, this study combines U-V disparity maps for obstacle detection. The maps are used to coarsely filter out non-road disparities and, from the projection information, to recover the disparity coordinates of each line segment in the disparity map. A random sample consensus (RANSAC) algorithm is then applied to fit the road line and remove noise, and on this basis a new obstacle image detection method is designed. In experiments, the classification loss was 0.013, the generalized intersection over union (GIoU) loss was 0.0072, the target loss converged to 0.0026, and the algorithm's accuracy exceeded 95%. These findings offer novel insights for advancing obstacle image detection technology, with potential applications in autonomous driving and image recognition. |
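The summary's core step, fitting the road line in the V-disparity map with RANSAC while rejecting obstacle pixels as outliers, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the point set, tolerance, and function names are illustrative, and a V-disparity "road line" is modeled simply as a 2-D line `y = m*x + b` with obstacle points as off-line outliers.

```python
import random

def fit_line(p1, p2):
    # Slope-intercept line through two sample points
    # (vertical pairs are skipped by the caller).
    (x1, y1), (x2, y2) = p1, p2
    m = (y2 - y1) / (x2 - x1)
    return m, y1 - m * x1

def ransac_line(points, n_iters=200, tol=1.0, seed=0):
    """Fit y = m*x + b to points, ignoring outliers
    (e.g. obstacle pixels off the road line in a V-disparity map)."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(n_iters):
        p1, p2 = rng.sample(points, 2)
        if p1[0] == p2[0]:
            continue  # degenerate (vertical) sample pair
        m, b = fit_line(p1, p2)
        inliers = [(x, y) for x, y in points
                   if abs(y - (m * x + b)) <= tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    # Refit on the consensus set with least squares for a stable estimate.
    n = len(best_inliers)
    sx = sum(x for x, _ in best_inliers)
    sy = sum(y for _, y in best_inliers)
    sxx = sum(x * x for x, _ in best_inliers)
    sxy = sum(x * y for x, y in best_inliers)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b, best_inliers

if __name__ == "__main__":
    # Synthetic "road line" y = 2x + 1 plus a few obstacle outliers.
    road = [(x, 2 * x + 1) for x in range(20)]
    obstacles = [(5, 40), (6, 42), (7, 44)]
    m, b, inliers = ransac_line(road + obstacles)
    print(round(m, 2), round(b, 2), len(inliers))
```

Here the three obstacle points lie far above the road line, so the consensus set contains only the 20 road points and the refit recovers the road-line parameters; in a real V-disparity map the inlier threshold `tol` would be tuned to the disparity noise.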
|---|---|
| Bibliography: | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 |