Adaptive Multipattern Fast Block-Matching Algorithm Based on Motion Classification Techniques

Bibliographic Details
Published in: IEEE Transactions on Circuits and Systems for Video Technology, Vol. 18, No. 10, pp. 1369-1382
Main Authors: Gonzalez-Diaz, I.; Diaz-de-Maria, F.
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.10.2008
ISSN: 1051-8215
eISSN: 1558-2205
DOI: 10.1109/TCSVT.2008.2004917

Summary: In most video coding standards, motion estimation is the most time-consuming subsystem. Consequently, in the last few years a great deal of effort has been devoted to novel algorithms capable of saving computation with minimal effect on coding quality. Adaptive algorithms, and particularly multipattern solutions, have emerged as the most robust general-purpose approaches for two main reasons: 1) real video sequences usually exhibit a wide range of motion content, from uniform to random, and 2) a vast number of coding applications have appeared, demanding different degrees of coding quality. In this study, we propose an adaptive algorithm, called motion classification-based search (MCS), which relies on a specially tailored classifier that detects motion cues and chooses the search pattern that best fits them. MCS has been experimentally assessed on a comprehensive set of video sequences and qualities. Our experimental results show that MCS reduces the computational cost by up to 55% and 84% in search points with respect to two state-of-the-art methods, while maintaining the coding quality.
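The record contains only the abstract, so the sketch below is not the authors' MCS algorithm. It is a minimal illustration of the general adaptive multipattern idea the abstract describes: a simple "classifier" inspects a motion cue (here, the magnitude of neighboring blocks' motion vectors) and picks a search pattern accordingly, then an iterative block-matching search minimizes the sum of absolute differences (SAD). The pattern shapes, threshold, and cue are hypothetical placeholders.

```python
import numpy as np

# Hypothetical search patterns (offsets tried around the current best position).
# The actual MCS patterns and classifier are defined in the paper, not here.
SMALL_DIAMOND = [(0, 0), (0, 1), (0, -1), (1, 0), (-1, 0)]
LARGE_DIAMOND = SMALL_DIAMOND + [(2, 0), (-2, 0), (0, 2), (0, -2),
                                 (1, 1), (1, -1), (-1, 1), (-1, -1)]

def sad(block, ref, y, x):
    """Sum of absolute differences between `block` and the co-sized
    reference region whose top-left corner is (y, x)."""
    h, w = block.shape
    if y < 0 or x < 0 or y + h > ref.shape[0] or x + w > ref.shape[1]:
        return np.inf  # candidate falls outside the reference frame
    return np.abs(block.astype(int) - ref[y:y+h, x:x+w].astype(int)).sum()

def choose_pattern(neighbor_mvs, threshold=1.5):
    """Toy motion classifier: low activity in neighboring motion vectors
    selects the small pattern, high activity the large one."""
    if not neighbor_mvs:
        return SMALL_DIAMOND
    activity = np.median([abs(dy) + abs(dx) for dy, dx in neighbor_mvs])
    return SMALL_DIAMOND if activity < threshold else LARGE_DIAMOND

def block_match(block, ref, y0, x0, neighbor_mvs, max_iters=16):
    """Iterative pattern search starting at the co-located position (y0, x0).
    Returns the motion vector (dy, dx) of the best match found."""
    pattern = choose_pattern(neighbor_mvs)
    best_mv = (0, 0)
    best_cost = sad(block, ref, y0, x0)
    for _ in range(max_iters):
        center = best_mv
        for dy, dx in pattern:
            cand = (center[0] + dy, center[1] + dx)
            cost = sad(block, ref, y0 + cand[0], x0 + cand[1])
            if cost < best_cost:
                best_mv, best_cost = cand, cost
        if best_mv == center:  # pattern centre is already the minimum: stop
            break
    return best_mv
```

In an encoder-like loop, such a routine would be called once per macroblock of the current frame, passing the motion vectors already found for the left, top, and top-right neighbors as the motion cue; the early termination when the pattern centre remains the best candidate is what yields the savings in search points that adaptive schemes aim for.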