Video watercolorization using bidirectional texture advection

Bibliographic Details
Published in: ACM Transactions on Graphics, Vol. 26, No. 3, pp. 104 - 104:7
Main Authors: Bousseau, Adrien; Neyret, Fabrice; Thollot, Joelle; Salesin, David
Format: Conference Proceeding; Journal Article
Language: English
Published: Association for Computing Machinery, 2007
ISSN: 0730-0301; 1557-7368
DOI: 10.1145/1275808.1276507

Summary: In this paper, we present a method for creating watercolor-like animation, starting from video as input. The method involves two main steps: applying textures that simulate a watercolor appearance, and creating a simplified, abstracted version of the video to which the texturing operations are applied. Both of these steps are subject to highly visible temporal artifacts, so the primary technical contributions of the paper are extensions of previous methods for texturing and abstraction to provide temporal coherence when applied to video sequences. To maintain coherence for textures, we employ texture advection along lines of optical flow. We furthermore extend previous approaches by incorporating advection in both forward and reverse directions through the video, which allows for minimal texture distortion, particularly in areas of disocclusion that are otherwise highly problematic. To maintain coherence for abstraction, we employ mathematical morphology extended to the temporal domain, using filters whose temporal extents are locally controlled by the degree of distortion in the optical flow. Together, these techniques provide the first practical and robust approach for producing watercolor animations from video, which we demonstrate with a number of examples.
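
The summary above outlines two coherence mechanisms; the sketch below illustrates only the first, bidirectional texture advection along optical flow. It is a minimal illustration, not the authors' implementation: the function names, the use of scipy.ndimage.map_coordinates for warping, and the simple linear time ramp used as a blend weight are all assumptions made here for clarity (the paper instead weights the forward and backward contributions by a local measure of texture distortion).

import numpy as np
from scipy.ndimage import map_coordinates

def advect(texture, flows):
    """Advect a static texture through a sequence of optical flow fields.

    texture : (H, W) float array, the texture applied at the starting frame.
    flows   : iterable of (H, W, 2) arrays; flows[t][..., 0] and [..., 1]
              give the (dy, dx) motion of each pixel from frame t to t + 1.
    Returns a list of advected textures, one per frame.
    """
    h, w = texture.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    frames = [texture]
    for flow in flows:
        # Approximate backward warp: sample the previous frame's texture at
        # the position each pixel came from, so texture features follow motion.
        warped = map_coordinates(frames[-1],
                                 [ys - flow[..., 0], xs - flow[..., 1]],
                                 order=1, mode='nearest')
        frames.append(warped)
    return frames

def bidirectional_advection(texture, flows_fwd, flows_bwd):
    """Blend a forward pass (from the first frame) with a backward pass
    (from the last frame) to limit accumulated texture distortion.

    flows_fwd : flows from frame t to t + 1, for t = 0 .. N-2.
    flows_bwd : flows from frame t to t - 1, for t = N-1 .. 1 (reverse order).
    The blend weight is a plain linear ramp in time; this is a stand-in for
    the distortion-based weighting described in the paper.
    """
    fwd = advect(texture, flows_fwd)
    bwd = advect(texture, flows_bwd)[::-1]  # re-index to forward frame order
    n = len(fwd)
    blended = []
    for t, (f, b) in enumerate(zip(fwd, bwd)):
        w = t / max(n - 1, 1)               # 0 at the first frame, 1 at the last
        blended.append((1.0 - w) * f + w * b)
    return blended

With real flow fields from any optical flow estimator, the forward pass keeps the texture undistorted near the start of the sequence and the backward pass near the end, which is the intuition behind running the advection in both directions.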