Fast Traffic Sign Recognition Using Color Segmentation and Deep Convolutional Networks


Bibliographic Details
Published in Advanced Concepts for Intelligent Vision Systems, Vol. 10016, pp. 205-216
Main Authors Youssef, Ali; Albani, Dario; Nardi, Daniele; Bloisi, Domenico Daniele
Format Book Chapter
Language English
Published Switzerland: Springer International Publishing AG, 2016
Series Lecture Notes in Computer Science
ISBN 9783319486796
3319486799
ISSN 0302-9743
1611-3349
DOI 10.1007/978-3-319-48680-2_19


More Information
Summary: The use of Computer Vision techniques for the automatic recognition of road signs is fundamental for the development of intelligent vehicles and advanced driver assistance systems. In this paper, we describe a procedure based on color segmentation, Histogram of Oriented Gradients (HOG), and Convolutional Neural Networks (CNN) for detecting and classifying road signs. Detection is sped up by a preprocessing step that reduces the search space, while classification is carried out using a Deep Learning technique. A quantitative evaluation of the proposed approach has been conducted on the well-known German Traffic Sign data set and on the novel Data set of Italian Traffic Signs (DITS), which is publicly available and contains challenging sequences captured in adverse weather conditions and in an urban scenario at night-time. Experimental results demonstrate the effectiveness of the proposed approach in terms of both classification accuracy and computational speed.
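
For readers who want a concrete picture of the pipeline outlined in the abstract, the sketch below shows one way the three stages (color segmentation to prune the search space, HOG description of candidates, CNN classification) could be wired together in Python with OpenCV and PyTorch. It is a rough illustration under stated assumptions, not the authors' implementation: the HSV color thresholds, the HOG window parameters, the contour-area filter, and the SignCNN architecture are all hypothetical placeholders.

```python
# Minimal, illustrative sketch of the pipeline described in the abstract
# (color segmentation -> HOG candidate description -> CNN classification).
# NOT the authors' code: thresholds, HOG parameters, area filter, and the
# CNN architecture below are placeholder assumptions.
import cv2
import numpy as np
import torch
import torch.nn as nn


def segment_sign_colors(bgr):
    """Reduce the detection search space by keeping red and blue regions,
    the dominant colors of most traffic signs (thresholds are illustrative)."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    red_lo = cv2.inRange(hsv, (0, 70, 50), (10, 255, 255))
    red_hi = cv2.inRange(hsv, (170, 70, 50), (180, 255, 255))
    blue = cv2.inRange(hsv, (100, 70, 50), (130, 255, 255))
    mask = red_lo | red_hi | blue
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep only reasonably large connected regions as sign candidates.
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 100]


def hog_descriptor(bgr_patch):
    """HOG feature vector for one candidate patch (parameters are illustrative)."""
    hog = cv2.HOGDescriptor((32, 32), (16, 16), (8, 8), (8, 8), 9)
    gray = cv2.cvtColor(cv2.resize(bgr_patch, (32, 32)), cv2.COLOR_BGR2GRAY)
    return hog.compute(gray).ravel()


class SignCNN(nn.Module):
    """Small placeholder CNN for 32x32 sign crops; 43 classes as in the German
    Traffic Sign benchmark, but the architecture is not taken from the paper."""

    def __init__(self, num_classes=43):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))


def classify_candidates(bgr, model):
    """Segment the frame, describe each candidate, and classify its crop."""
    labels = []
    for x, y, w, h in segment_sign_colors(bgr):
        crop = cv2.resize(bgr[y:y + h, x:x + w], (32, 32))
        _ = hog_descriptor(crop)  # in the paper, HOG supports the detection stage
        tensor = torch.from_numpy(crop).permute(2, 0, 1).float().unsqueeze(0) / 255.0
        labels.append(int(model(tensor).argmax(dim=1)))
    return labels
```

In the paper's formulation, HOG features support detection while the CNN performs the final classification; the sketch keeps the HOG call only to mark where that step would sit in such a pipeline.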
Bibliography: A. Youssef and D. Albani contributed equally to this work.