A Fast HEVC Block Partitioning Algorithm Based on Image Spatial Correlation and Texture (基于图像空间相关性与纹理的HEVC块划分快速算法)


Bibliographic Details
Published in: 电信科学 (Telecommunications Science), Vol. 31, No. 1, pp. 32-40
Main Authors: Yao Yingbiao (姚英彪), Li Xiaojuan (李晓娟)
Format: Journal Article
Language: Chinese
Published: 中国通信学会 (China Institute of Communications); 人民邮电出版社有限公司 (Posts & Telecom Press Co., Ltd.), 2015
Author Affiliation: School of Communication Engineering, Hangzhou Dianzi University, Hangzhou 310018, China
ISSN: 1000-0801
DOI: 10.11959/j.issn.1000-0801.2015006

Summary: To reduce the complexity of HEVC intra coding, a fast block partitioning algorithm based on the spatial correlation of video content and texture complexity is proposed. First, by analyzing the correlation between the partition depth ranges of adjacent coding tree units (CTUs) in video test sequences, the concept of a CTU's most possible depth range (MPDR) and its calculation method are proposed. Second, the texture difference between neighboring CTUs is analyzed by detecting their edge directions, which determines whether the depth range of the current CTU uses the MPDR or the standard range 0-3. Finally, a texture-complexity threshold formula is derived by statistical analysis; before each coding unit (CU) is encoded, its texture complexity is evaluated by comparing the pixel variance with the threshold, to decide whether the coding computation at the current depth can be skipped and the CU split directly. Experimental results show that, compared with the original HM13.0 algorithm, the proposed algorithm reduces encoding time by about 20% at the cost of a 0.84% increase in bitrate and a 0.04 dB drop in PSNR.
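The MPDR construction described in the summary can be illustrated with a minimal sketch. The paper defines the exact MPDR formula; the version below, which simply takes the min/max union of the depth ranges already chosen for available neighboring CTUs and falls back to the standard range 0-3 at frame borders, is only an assumption for illustration (`mpdr` and `neighbor_ranges` are hypothetical names, not from the paper).

```python
def mpdr(neighbor_ranges, full_range=(0, 3)):
    """Sketch of a most-possible-depth-range (MPDR) estimate for the
    current CTU.

    neighbor_ranges: list of (min_depth, max_depth) tuples observed in
    already-coded neighboring CTUs (e.g. left and above).
    Returns the candidate depth range to search for the current CTU.
    NOTE: this min/max union is an assumed stand-in for the paper's formula.
    """
    if not neighbor_ranges:
        # No coded neighbors (e.g. first CTU of a frame):
        # fall back to the full standard depth range 0-3.
        return full_range
    lo = min(r[0] for r in neighbor_ranges)
    hi = max(r[1] for r in neighbor_ranges)
    return (lo, hi)
```

For example, if the left neighbor used depths 1-2 and the above neighbor used depths 0-2, this sketch would restrict the current CTU's search to depths 0-2 instead of the full 0-3, which is where the encoding-time saving comes from.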
Keywords: high efficiency video coding, fast block partitioning, spatial correlation, most possible depth range, texture difference, texture complexity
Yao Yingbiao, Li Xiaojuan (School of Communication Engineering, Hangzhou Dianzi University, Hangzhou 310018, China)
In order to reduce the intra coding complexity of HEVC, a fast block partitioning algorithm based on spatial correlation and image texture was proposed. Firstly, the concept of the most possible depth range (MPDR) was proposed based on a correlation analysis of the depth ranges of adjacent CTUs. Secondly, the difference between the texture of the current CTU and that of its neighboring CTUs was analyzed by detecting the dominant edge direction; according to this texture difference, the depth range of the current CTU was chosen from either the MPDR or the original range 0-3. Finally, a texture-complexity threshold formula was derived by statistical analysis. Before coding each CU, the texture complexity of the current CU is evaluated by comparing its pixel variance with the threshold, to determine whether the coding computation at the current depth can be skipped and the CU split directly. Experimental results show that, compared with the original HM13.0 algorithm, the proposed algorithm reduces encoding time by about 20% at the cost of a 0.84% increase in bitrate and a 0.04 dB decrease in PSNR.
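The final step of the abstract, the variance-vs-threshold early-split decision, can be sketched as follows. The paper derives a size-dependent threshold formula statistically; the sketch below assumes the threshold is supplied externally, and the names `cu_variance` and `skip_and_split` are hypothetical, not from the reference implementation.

```python
from statistics import pvariance


def cu_variance(cu_pixels):
    """Population variance of a CU's luma samples (2-D list, flattened)."""
    flat = [p for row in cu_pixels for p in row]
    return pvariance(flat)


def skip_and_split(cu_pixels, threshold):
    """Early-termination test sketched from the abstract: if the CU's pixel
    variance exceeds the texture-complexity threshold, the CU is judged
    texture-rich, so the encoder skips rate-distortion coding at the
    current depth and splits the CU directly."""
    return cu_variance(cu_pixels) > threshold
```

A flat (low-variance) CU fails the test and is coded normally at the current depth, while a highly textured CU passes it and is split without evaluating the current depth, saving the corresponding rate-distortion computation.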
ISSN:1000-0801
DOI:10.11959/j.issn.1000-0801.2015006