2D-plus-depth based resolution and frame-rate up-conversion technique for depth video

Bibliographic Details
Published in: IEEE Transactions on Consumer Electronics, Vol. 56, No. 4, pp. 2489-2497
Main Authors: Choi, Jinwook; Min, Dongbo; Sohn, Kwanghoon
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.11.2010
ISSN: 0098-3063
1558-4127
DOI: 10.1109/TCE.2010.5681132

More Information
Summary: We propose a novel framework for up-conversion of depth video resolution in both the spatial and temporal domains, considering spatial and temporal coherence. Although the Time-of-Flight (TOF) sensor widely used in computer vision provides depth video in real time, the video it produces has low resolution and a low frame rate. We propose an inexpensive solution that enhances depth video obtained from a TOF sensor by combining it with a Charge-Coupled Device (CCD) camera for 3D content consisting of 2D-plus-depth. Temporal fluctuation is also addressed to achieve temporally consistent frame-rate up-conversion. Maintaining temporal coherence in depth video is important because temporal fluctuation can cause eye fatigue and increase the bit rate in video coding. We propose Motion Compensated Frame Interpolation (MCFI) that exploits reliable and rich motion information from the CCD camera, together with 3-dimensional Joint Bilateral Up-sampling (3D JBU) extended into the temporal domain of the depth video. Experimental results show that depth video obtained by the proposed method provides satisfactory quality.
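The spatial core of the 3D JBU step can be illustrated with a minimal 2-D sketch of standard Joint Bilateral Up-sampling: each high-resolution depth value is a weighted average of low-resolution depth samples, where spatial weights come from low-resolution distances and range weights come from the high-resolution CCD guide image. This is an assumed simplification for illustration only; the paper's actual 3D JBU additionally extends the kernel into the temporal domain, which is omitted here, and all function and parameter names are hypothetical.

```python
import numpy as np

def joint_bilateral_upsample(depth_lr, guide_hr, radius=2, sigma_s=1.0, sigma_r=0.1):
    """Upsample a low-res depth map guided by a high-res grayscale image.

    depth_lr: (h, w) low-resolution depth map (e.g. from a TOF sensor).
    guide_hr: (H, W) high-resolution intensity image in [0, 1] (e.g. from a CCD camera).
    Spatial weights use low-res coordinates; range weights use the guide image.
    """
    h, w = depth_lr.shape
    H, W = guide_hr.shape
    scale = H / h  # assume an integer up-sampling factor and matching aspect ratio
    out = np.zeros((H, W))
    for y in range(H):
        for x in range(W):
            yl, xl = y / scale, x / scale  # position in low-res coordinates
            num, den = 0.0, 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    qy, qx = int(round(yl)) + dy, int(round(xl)) + dx
                    if not (0 <= qy < h and 0 <= qx < w):
                        continue
                    # spatial Gaussian weight, measured in low-res pixel units
                    ws = np.exp(-((qy - yl) ** 2 + (qx - xl) ** 2) / (2 * sigma_s ** 2))
                    # range weight from intensity similarity in the high-res guide
                    gy = min(int(qy * scale), H - 1)
                    gx = min(int(qx * scale), W - 1)
                    wr = np.exp(-(guide_hr[y, x] - guide_hr[gy, gx]) ** 2 / (2 * sigma_r ** 2))
                    num += ws * wr * depth_lr[qy, qx]
                    den += ws * wr
            out[y, x] = num / den if den > 0 else depth_lr[min(int(yl), h - 1), min(int(xl), w - 1)]
    return out
```

Because the range term is computed on the CCD guide rather than the depth map itself, depth edges are sharpened along intensity edges instead of being blurred by plain bicubic interpolation; the paper's 3D variant adds a third, temporal dimension to the same kernel.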