Tree Species Classification in the Mountainous Area of Pu'er Based on Fusion of Airborne LiDAR and Hyperspectral Data


Bibliographic Details
Published in: 林业科学研究 (Forest Research), Vol. 29, No. 3, pp. 407-412
Main Authors: LIU Yi-jun (刘怡君), PANG Yong (庞勇), LIAO Sheng-xi (廖声熙), JIA Wen (荚文), CHEN Bo-wei (陈博伟), LIU Lu-xia (刘鲁霞)
Format Journal Article
Language: Chinese
Published: Research Institute of Resource Insects, Chinese Academy of Forestry, Kunming 650224, Yunnan; Research Institute of Forest Resource Information Techniques, Chinese Academy of Forestry, Beijing 100091, 2016
ISSN: 1001-1498


Summary: [Objective] To classify vegetation in the mountainous area of Pu'er using airborne remote sensing imagery, and to provide an efficient mapping approach for forest management planning and sustainable management schemes in mountainous regions. [Method] Airborne AISA Eagle II hyperspectral data and simultaneously acquired LiDAR data from an April 2014 flight were fused. A canopy height model (CHM) extracted from the LiDAR point cloud provided vertical-structure information for the tree species; this was combined with the hyperspectral image after dimensionality reduction by principal component analysis (PCA), and a support vector machine (SVM) classifier was used for classification. [Result] The main tree species in the Wanzhangshan experimental area of Pu'er City were Simao pine, Betula alnoides, Castanopsis hystrix, Schima superba, etc. The overall accuracy and Kappa coefficient of classification with the fused imagery were 80.54% and 0.78, respectively, 6.55 percentage points and 0.08 higher than classification with the hyperspectral image alone, and the producer's (mapping) accuracy of Simao pine, the main managed species, reached 90.24%. [Conclusion] The method is effective for identifying the main tree species in mountainous areas, and fusing airborne LiDAR with hyperspectral imagery can effectively improve classification accuracy.
Bibliography: LIU Yi-jun, PANG Yong, LIAO Sheng-xi, JIA Wen, CHEN Bo-wei, LIU Lu-xia (1. Research Institute of Resource Insects, Chinese Academy of Forestry, Kunming 650224, Yunnan, China; 2. Research Institute of Forest Resource Information Techniques, Chinese Academy of Forestry, Beijing 100091, China)
CN: 11-1221/S
Keywords: LiDAR; hyperspectral image; data fusion; classification of tree species
[Objective] To classify the tree species in Pu'er's mountainous area by remote sensing imagery, and to provide an efficient approach to forest management planning. [Method] The AISA Eagle II hyperspectral data and airborne LiDAR data acquired simultaneously in April 2014 were merged. Based on the Canopy Height Model (CHM) derived from the airborne LiDAR point cloud, the vertical-structure data of the target species were obtained. A Principal Component Analysis (PCA) transformation was then used to reduce the noise and dimensionality of the hyperspectral image. Finally, the Support Vector Machine (SVM) approach was used to classify the main tree species of Pu'er City. [Result] The main tree species in the Wanzhangshan experimental area were Simao pine, Betula alnoides, Castanopsis hystrix and Schima superba. The overall accuracy and Kappa coefficient of the fused-data classification were 80.54% and 0.78, respectively, 6.55 percentage points and 0.08 higher than with the hyperspectral image alone, and the producer's accuracy of Simao pine, the main managed species, reached 90.24%. [Conclusion] The method is effective for identifying the main tree species in mountainous areas, and fusing airborne LiDAR with hyperspectral imagery can effectively improve classification accuracy.