Image recognition of farmland insect pests based on a sparse coding pyramid model (基于稀疏编码金字塔模型的农田害虫图像识别)

Compared with images of generic objects, images of crop insect pests are more difficult to classify and recognize because of their complex farmland backgrounds. To improve the accuracy of pest image recognition, this paper proposes a pest image representation and recognition method that combines image sparse coding with a spatial pyramid model. The method constructs an over-complete learned dictionary from a large number of unlabeled natural image patches and uses this dictionary to build a multi-space sparse representation of pest images. On this basis, a pest image recognition algorithm is designed by incorporating multiple kernel learning. In recognition experiments on 35 pest species, the proposed feature extraction method raised average recognition accuracy by 9.5 percentage points under otherwise identical conditions; in further experiments on 221 insect species and 20 butterfly species, the proposed method raised average recognition accuracy by 14.1 percentage points over traditional methods.
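The pipeline the abstract describes, unsupervised dictionary learning on unlabeled patches, sparse coding of pest-image patches over that dictionary, and spatial-pyramid pooling of the codes, can be sketched roughly as follows. This is a minimal illustration under assumed settings, not the paper's implementation: the patch size, dictionary size, sparsity level, pyramid levels, and the use of max pooling are all assumptions, here expressed with scikit-learn.

```python
# A minimal sketch of the abstract's pipeline, using scikit-learn.
# Patch size, dictionary size, sparsity level, pyramid levels, and
# max pooling are illustrative assumptions, not values from the paper.
import numpy as np
from sklearn.decomposition import DictionaryLearning, sparse_encode
from sklearn.feature_extraction.image import extract_patches_2d

PATCH = 8        # assumed patch side length (pixels)
N_ATOMS = 256    # assumed dictionary size; 256 atoms > 64 dims = over-complete
N_NONZERO = 5    # assumed number of non-zero coefficients per sparse code

def learn_dictionary(unlabeled_images):
    """Fit an over-complete dictionary on patches from unlabeled images."""
    patches = np.vstack([
        extract_patches_2d(img, (PATCH, PATCH), max_patches=200, random_state=0)
        .reshape(-1, PATCH * PATCH)
        for img in unlabeled_images
    ]).astype(float)
    patches -= patches.mean(axis=1, keepdims=True)   # remove per-patch DC level
    dl = DictionaryLearning(n_components=N_ATOMS, transform_algorithm='omp',
                            transform_n_nonzero_coefs=N_NONZERO, random_state=0)
    return dl.fit(patches).components_               # (N_ATOMS, PATCH*PATCH)

def pyramid_feature(img, dictionary, levels=(1, 2, 4)):
    """Sparse-code all patches of one image, max-pool codes per pyramid cell."""
    h, w = img.shape
    patches = extract_patches_2d(img, (PATCH, PATCH)).reshape(-1, PATCH * PATCH)
    patches = patches.astype(float)
    patches -= patches.mean(axis=1, keepdims=True)
    codes = sparse_encode(patches, dictionary, algorithm='omp',
                          n_nonzero_coefs=N_NONZERO)
    n_rows, n_cols = h - PATCH + 1, w - PATCH + 1    # dense patch grid, row-major
    rows, cols = np.divmod(np.arange(len(codes)), n_cols)
    feats = []
    for g in levels:                                 # g x g cells at this level
        cell_r, cell_c = rows * g // n_rows, cols * g // n_cols
        for i in range(g):
            for j in range(g):
                m = (cell_r == i) & (cell_c == j)
                feats.append(np.abs(codes[m]).max(axis=0) if m.any()
                             else np.zeros(N_ATOMS))
    return np.concatenate(feats)                     # (1+4+16) * N_ATOMS dims
```

Pooling the codes over progressively finer grids is what makes the representation "multi-space": each pyramid level adds coarse spatial layout on top of the bag of sparse atoms.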


Bibliographic Details
Published in: 农业工程学报 (Transactions of the Chinese Society of Agricultural Engineering), Vol. 32, No. 17, pp. 144-151
Main Authors: 谢成军 (Xie Chengjun), 李瑞 (Li Rui), 董伟 (Dong Wei), 宋良图 (Song Liangtu), 张洁 (Zhang Jie), 陈红波 (Chen Hongbo), 陈天娇 (Chen Tianjiao)
Format: Journal Article
Language: Chinese
Published: Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei 230031, China; Institute of Agricultural Economy and Information, Anhui Academy of Agricultural Sciences, Hefei 230031, China, 2016
Subjects: image recognition; algorithms; pest control; dictionary learning; sparse coding; spatial pyramid model
ISSN: 1002-6819
DOI: 10.11975/j.issn.1002-6819.2016.17.020


More Information
Bibliography: CN 11-2047/S
Automatic classification of insect species in field crops such as corn, soybean, wheat, and canola is more difficult than generic object classification because of the complex field background and the high appearance similarity among insect species. In this paper, we propose an insect recognition system based on advanced sparse coding and a spatial pyramid model. We first learn features from a large number of unlabeled insect image patches to construct an over-complete dictionary. The sparse coding of insect image patches is obtained by encoding over this dictionary. To enhance the discriminative ability of the sparse coding, we then apply multiple scales of filters coupled with different spaces. Finally, the multiple-space features of the sparse coding are seamlessly embedded into a multi-kernel framework for robust classification. Traditionally, insect recognition has mainly relied on manual identification …
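As a rough illustration of the multi-kernel classification step mentioned above, the sketch below combines one RBF kernel per pyramid-level feature block into a single precomputed kernel for an SVM. The uniform weights, the RBF kernel choice, gamma, and all variable names are assumptions for illustration; the paper's multiple kernel learning would instead learn the per-kernel weights.

```python
# A hedged sketch of the multi-kernel classification step: one RBF kernel per
# pyramid level's feature block, combined into a single precomputed kernel for
# an SVM. Uniform weights, the RBF choice, and gamma are assumptions; the
# paper's multiple kernel learning would learn the per-kernel weights instead.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

def level_blocks(X, sizes):
    """Split concatenated pyramid features into one block per pyramid level."""
    idx = np.cumsum([0] + list(sizes))
    return [X[:, a:b] for a, b in zip(idx[:-1], idx[1:])]

def combined_kernel(Xa, Xb, sizes, weights, gamma=1e-3):
    """Weighted sum of one RBF kernel per per-level feature block."""
    K = np.zeros((Xa.shape[0], Xb.shape[0]))
    for A, B, w in zip(level_blocks(Xa, sizes), level_blocks(Xb, sizes), weights):
        K += w * rbf_kernel(A, B, gamma=gamma)
    return K

# Usage sketch (all names hypothetical): with N_ATOMS-dimensional codes pooled
# over 1x1, 2x2, and 4x4 grids, sizes = [N_ATOMS * g * g for g in (1, 2, 4)].
# K_train = combined_kernel(X_train, X_train, sizes, weights=[1/3] * 3)
# clf = SVC(kernel='precomputed').fit(K_train, y_train)
# y_pred = clf.predict(combined_kernel(X_test, X_train, sizes, [1/3] * 3))
```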