Video Super-Resolution Using Simultaneous Motion and Intensity Calculations


Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 20, no. 7, pp. 1870-1884
Main Authors: Keller, S. H.; Lauze, F.; Nielsen, M.
Format: Journal Article
Language: English
Published: New York, NY: IEEE, 01.07.2011
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2011.2106793


Summary: In this paper, we propose an energy-based algorithm for motion-compensated video super-resolution (VSR) targeted at upscaling standard-definition (SD) video to high-definition (HD) video. Since the motion (flow field) of the image sequence is generally unknown, we introduce a formulation for the joint estimation of a super-resolution (SR) sequence and its flow field. Via the calculus of variations, this leads to a coupled system of partial differential equations for image sequence and motion estimation. We solve a simplified form of this system and, as a by-product, we indeed provide a motion field for super-resolved sequences. To the best of our knowledge, computing super-resolved flows has not been done before. Most advanced SR methods found in the literature cannot be applied to general video with arbitrary scene content and/or arbitrary optical flows, as is possible with our simultaneous VSR method. A series of experiments shows that our method outperforms other VSR methods when dealing with general video input and that it continues to provide good results even for large scaling factors up to 8 × 8.
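The summary describes minimizing an energy with a data term (the super-resolved frame, when downsampled, should match the observed SD frame) plus a regularizer, solved by gradient descent on the resulting PDE. The sketch below is not the authors' algorithm: it shows only the single-frame intensity half of such a scheme, with a quadratic smoothness prior standing in for the paper's regularizer and with the motion-coupling term omitted; all function names and parameter values are illustrative assumptions.

```python
import numpy as np

def upsample(frame, s):
    # Nearest-neighbor upsampling: a crude interpolation operator
    # (also serves as the adjoint of block-average downsampling, up to 1/s^2).
    return np.kron(frame, np.ones((s, s)))

def downsample(frame, s):
    # Block-average downsampling: a simple model of SD acquisition.
    h, w = frame.shape
    return frame.reshape(h // s, s, w // s, s).mean(axis=(1, 3))

def sr_step(u, lowres, s, lam=0.1, tau=0.2):
    # One explicit gradient-descent step on
    #   E(u) = ||D u - f||^2 + lam * ||grad u||^2,
    # where D is the downsampling operator and f the observed SD frame.
    residual = downsample(u, s) - lowres
    data_grad = upsample(residual, s) / s**2     # adjoint of D applied to residual
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0)  # 5-point Laplacian
           + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
    return u - tau * (data_grad - lam * lap)

# Toy usage: super-resolve a smooth synthetic 8x8 frame by factor 4.
s = 4
x = np.linspace(0, 1, 32)
hi_true = np.outer(np.sin(2 * np.pi * x), np.cos(2 * np.pi * x))
lo = downsample(hi_true, s)
u = upsample(lo, s)            # initial HD guess
for _ in range(50):
    u = sr_step(u, lo, s)
err = np.abs(downsample(u, s) - lo).mean()  # data-term residual stays small
```

In the paper's joint formulation the flow field enters the energy as well, so each sweep would alternate a step like `sr_step` with a motion-estimation step on the current super-resolved sequence; that coupling is exactly what the simplified scheme in this sketch leaves out.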