Object Detection Using Stacked YOLOv3

Bibliographic Details
Published in: Ingénierie des systèmes d'Information, Vol. 25, No. 5, pp. 691-697
Main Authors: Padmanabula, Sai Shilpa; Puvvada, Ramya Chowdary; Sistla, Venkatramaphanikumar; Kolli, Venkata Krishna Kishore
Format: Journal Article
Language: English
Published: 10.11.2020
ISSN: 1633-1311
2116-7125
DOI: 10.18280/isi.250517

Summary: Object detection is a stimulating task in computer vision applications. It is gaining attention in many real-time applications such as detecting the number plates of suspect cars, identifying trespassers in surveillance areas, and detecting unmasked faces at security gates during the COVID-19 period. Region-based Convolutional Neural Networks (R-CNN) and You Only Look Once (YOLO) based CNNs are deep learning approaches to this task. In this work, an improved stacked YOLOv3 model is designed to detect objects with bounding boxes, and hyperparameters are tuned for optimum performance. The proposed model is evaluated on the COCO dataset and outperforms other existing object detection models. Anchor boxes are used to handle overlapping objects. After all predicted bounding boxes with low detection probability are removed, the bounding box with the highest detection probability is selected, and all remaining boxes whose Intersection over Union (IoU) with it exceeds 0.4 are eliminated. This Non-Maximal Suppression (NMS) keeps only the best bounding box per object. In experimentation, a range of threshold values was tried, with the best result obtained at a threshold of 0.5.
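
The suppression procedure the summary describes (discard low-probability boxes, select the highest-probability box, eliminate overlaps above IoU 0.4) is standard greedy Non-Maximal Suppression. Below is a minimal NumPy sketch of that step, not the authors' implementation; the function names and the corner-coordinate box format are illustrative assumptions, while the 0.5 score threshold and 0.4 IoU threshold are taken from the summary.

import numpy as np

def iou(box, boxes):
    # IoU between one box and an array of boxes, all as (x1, y1, x2, y2) corners.
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_a = (box[2] - box[0]) * (box[3] - box[1])
    areas_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    return inter / (area_a + areas_b - inter)

def nms(boxes, scores, score_thresh=0.5, iou_thresh=0.4):
    # Drop predictions below the detection-probability threshold.
    keep = scores >= score_thresh
    boxes, scores = boxes[keep], scores[keep]
    order = scores.argsort()[::-1]  # highest confidence first
    selected = []
    while order.size > 0:
        best = order[0]              # box with the highest detection probability
        selected.append(best)
        rest = order[1:]
        # Eliminate boxes overlapping the selected box above the IoU threshold.
        order = rest[iou(boxes[best], boxes[rest]) <= iou_thresh]
    return boxes[selected], scores[selected]

# Example: the second box overlaps the first (IoU ≈ 0.85 > 0.4) and is suppressed.
boxes = np.array([[10, 10, 60, 60], [12, 12, 62, 62], [100, 100, 150, 150]], dtype=float)
scores = np.array([0.9, 0.75, 0.6])
kept_boxes, kept_scores = nms(boxes, scores)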