RGB-D Object Recognition with Multi-Scale Convolutional-Recursive Neural Networks

Bibliographic Details
Published in: 计算机应用研究 (Application Research of Computers), Vol. 34, No. 9, pp. 2834-2837
Main Authors: 骆健 (Luo Jian), 蒋旻 (Jiang Min), 刘星 (Liu Xing), 周龙 (Zhou Long)
Format: Journal Article
Language: Chinese
Published: Hubei Province Key Laboratory of Intelligent Information Processing and Real-time Industrial System, School of Computer Science and Technology, Wuhan University of Science and Technology, Wuhan 430065, China, 2017
ISSN: 1001-3695
DOI: 10.3969/j.issn.1001-3695.2017.09.060

Summary: To fully exploit the latent feature information provided by RGB-D images, this paper proposes a multi-scale convolutional-recursive neural network algorithm (Ms-CRNN). The algorithm partitions the RGB image, grayscale image, depth image, and 3D surface normal map of an RGB-D image into blocks at different scales to form multiple channels; each channel is convolved with a filter of the corresponding size. The extracted feature maps, after local contrast normalization and subsampling, serve as input to a recursive neural network (RNN) layer that composes more abstract high-level features; the fused multi-scale features are then classified by an SVM classifier. Simulation results on an RGB-D dataset show that, by comprehensively exploiting the multi-scale features of RGB-D images, the proposed Ms-CRNN algorithm reaches an object recognition rate of 88.2%, a considerable improvement over previous methods.
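One of the four input channels above is a 3D surface normal map derived from the depth image. The paper does not specify its normal-estimation method; the sketch below shows one common approximation using finite differences of depth (all function names here are illustrative, not from the paper):

```python
import numpy as np

def surface_normals(depth):
    """Estimate per-pixel 3D surface normals from a depth map.

    A cheap finite-difference approximation: the (unnormalized) normal at
    each pixel is taken as (-dz/dx, -dz/dy, 1), then scaled to unit length.
    """
    depth = depth.astype(np.float64)
    # np.gradient returns derivatives along axis 0 (rows, y) then axis 1 (cols, x)
    dz_dy, dz_dx = np.gradient(depth)
    normals = np.dstack((-dz_dx, -dz_dy, np.ones_like(depth)))
    length = np.linalg.norm(normals, axis=2, keepdims=True)
    return normals / length

# toy depth map; a real pipeline would load the RGB-D sensor's depth image
depth = np.random.rand(8, 8)
n = surface_normals(depth)  # shape (8, 8, 3), unit-length normals
```

Real depth maps need hole filling and smoothing before differentiation, which this sketch omits.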
Bibliography: 51-1196/TP
In order to fully utilize the potential feature information of RGB-D images, this paper proposed a new algorithm called Ms-CRNN. It applied a multi-scale block operation to the RGB image, gray image, depth image, and 3D surface normal map derived from the input RGB-D image to form several channels, and convolved each channel with a filter of the corresponding size. It then performed local contrast normalization and subsampling on the extracted feature maps to obtain low-level invariant features, which were given as inputs to recursive neural networks in order to compose higher-order features. Vectors combining the multi-scale features from all channels were sent to an SVM classifier for classification. The method was evaluated on an RGB-D dataset. Experimental results show that the recognition accuracy of the proposed method on RGB-D objects reaches 88.2%, a clear improvement over previous methods.
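The per-channel pipeline described in the abstract (convolution, local contrast normalization, subsampling, then a recursive merging layer) can be sketched as below. This is a minimal illustration in the spirit of convolutional-recursive networks with random weights; the paper's actual filter learning, LCN neighborhood, block scales, and merge order are not specified here, so those details are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def convolve_valid(img, filt):
    # naive "valid" 2D correlation of one channel with one filter
    fh, fw = filt.shape
    out = np.empty((img.shape[0] - fh + 1, img.shape[1] - fw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + fh, j:j + fw] * filt)
    return out

def lcn(x, eps=1e-5):
    # contrast normalization, approximated globally per feature map
    return (x - x.mean()) / (x.std() + eps)

def pool2x2(x):
    # 2x2 max-pool subsampling
    h, w = (x.shape[0] // 2) * 2, (x.shape[1] // 2) * 2
    return x[:h, :w].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def rnn_layer(feats, w_merge):
    # recursively merge 2x2 neighborhoods of the (K, H, W) feature maps
    # with one shared weight matrix until a single K-dim vector remains
    k, h, w = feats.shape
    while h > 1:
        merged = np.empty((k, h // 2, w // 2))
        for i in range(h // 2):
            for j in range(w // 2):
                block = feats[:, 2 * i:2 * i + 2, 2 * j:2 * j + 2].reshape(4 * k)
                merged[:, i, j] = np.tanh(w_merge @ block)
        feats, h, w = merged, h // 2, w // 2
    return feats.reshape(k)

K = 8                                        # number of filters (illustrative)
filters = rng.standard_normal((K, 5, 5))     # random filters, one scale
image = rng.standard_normal((36, 36))        # one channel (e.g. the gray image)
maps = np.stack([pool2x2(lcn(convolve_valid(image, f))) for f in filters])
w_merge = rng.standard_normal((K, 4 * K)) / np.sqrt(4 * K)
feature_vec = rnn_layer(maps, w_merge)       # K-dim descriptor for this channel
```

In the full algorithm, such descriptors from all channels and scales would be concatenated and passed to the SVM classifier.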
multi-scale; 3D surface normal; recursive neural networks; RGB-D object recognition
Luo Jian, Jiang Min, Liu Xing, Zhou Long