Evaluation of Intrinsic Image Algorithms to Detect the Shadows Cast by Static Objects Outdoors

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 12, No. 10, pp. 13333-13348
Main Authors: Isaza, Cesar; Salas, Joaquín; Raducanu, Bogdan
Format: Journal Article
Language: English
Published: Switzerland, MDPI AG, 01.10.2012
ISSN: 1424-8220
DOI: 10.3390/s121013333

More Information
Summary: In some automatic scene analysis applications, the presence of shadows is a nuisance that must be dealt with, so a preliminary stage in many computer vision algorithms is to attenuate their effect. In this paper, we focus our attention on the detection of shadows cast by static objects outdoors, where the scene is viewed for extended periods of time (days, weeks) from a fixed camera during daylight intervals in which the main source of light is the sun. In this context, we report two contributions. First, we introduce the use of synthetic images for which ground truth can be generated automatically, avoiding the tedious effort of manual annotation. Second, we report a novel application of the intrinsic image concept to the automatic detection of shadows cast by static objects outdoors. We carry out both a quantitative and a qualitative evaluation of several algorithms based on this image representation: for the quantitative evaluation we use the synthetic data set, while for the qualitative evaluation we use both data sets. Our experimental results show that the evaluated methods can partially solve the problem of shadow detection.
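
The abstract gives no implementation details, but the general idea can be sketched. The snippet below is a minimal illustration, not the authors' pipeline: it assumes an intrinsic image algorithm separates each frame into reflectance and shading, that shadow pixels are those whose shading falls well below the frame's median illumination, and that detection quality is scored per pixel against a binary ground-truth mask such as the one the synthetic data set provides. The crude decomposition stand-in, threshold value, and function names are illustrative assumptions.

```python
# Minimal sketch of intrinsic-image-based shadow detection and its evaluation.
# Not the authors' method; the decomposition, threshold and names are assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def naive_intrinsic_decomposition(image):
    """Very rough stand-in for an intrinsic image algorithm: treat a heavily
    blurred copy of the frame as shading (slowly varying illumination) and the
    multiplicative residual as reflectance. Real methods are far more elaborate."""
    gray = image.mean(axis=2) + 1e-6           # grayscale, avoid division by zero
    shading = gaussian_filter(gray, sigma=15)  # low-frequency illumination estimate
    reflectance = gray / shading               # what remains is attributed to albedo
    return reflectance, shading

def detect_shadows(shading, rel_threshold=0.7):
    """Label as shadow the pixels whose shading is markedly darker than the
    frame's median illumination (assumed heuristic, threshold is illustrative)."""
    return shading < rel_threshold * np.median(shading)

def evaluate(predicted, ground_truth):
    """Per-pixel precision, recall and F-measure against a binary shadow mask,
    as one would compute on a synthetic data set with automatic ground truth."""
    tp = np.logical_and(predicted, ground_truth).sum()
    fp = np.logical_and(predicted, ~ground_truth).sum()
    fn = np.logical_and(~predicted, ground_truth).sum()
    precision = tp / (tp + fp + 1e-9)
    recall = tp / (tp + fn + 1e-9)
    f_measure = 2 * precision * recall / (precision + recall + 1e-9)
    return precision, recall, f_measure

# Example usage on a random frame and mask (placeholders for real data):
frame = np.random.rand(240, 320, 3)
gt_mask = np.zeros((240, 320), dtype=bool)
_, shading = naive_intrinsic_decomposition(frame)
print(evaluate(detect_shadows(shading), gt_mask))
```

In the paper's setting, the same comparison would be repeated over long image sequences from a fixed outdoor camera, which is why automatically generated ground truth is valuable.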