Automatic Sleep Staging Method Based on Multiple EEG and EMG Features

Bibliographic Details
Published in: 计算机工程 (Computer Engineering), Vol. 43, No. 10, pp. 283-288
Main Authors: LU Tiantian (吕甜甜), WANG Xinzui (王心醉), YU Qian (俞乾), YU Yong (于涌), JIANG Zhen (蒋蓁)
Format: Journal Article
Language: Chinese
Published: 2017
Affiliations: College of Electromechanical Engineering and Automation, Shanghai University, Shanghai 200072, China; Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, Jiangsu 215163, China
ISSN: 1000-3428
DOI: 10.3969/j.issn.1000-3428.2017.10.047

More Information
Summary: To achieve accurate automatic sleep staging while meeting the need for generalization ability, an automatic sleep staging method based on multiple electroencephalogram (EEG) and electromyogram (EMG) features is proposed. EEG and EMG recordings from the MIT-BIH Polysomnographic Database are taken as the analysis objects. The discrete wavelet transform is used to filter and preprocess the raw data, the energy ratios of the α, β, θ, and δ rhythm waves and of the high-frequency components are extracted, and the sample entropy algorithm is used to extract nonlinear EEG features. The feature parameters are fed into a support vector machine classifier for training and classification. Experimental results show that the staging accuracy of this method reaches 92.94%, an average improvement of 3.96% over the EEG-only sleep staging method, and the average cross-validation accuracy reaches 82.68%, indicating good generalization ability.
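The two feature families named in the summary, DWT band-energy ratios and sample entropy, can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes PyWavelets (pywt) with a db4 mother wavelet, a 5-level decomposition, and 250 Hz EEG epochs from the MIT-BIH Polysomnographic Database, so the sub-bands only roughly align with the δ, θ, α, β rhythms; the SampEn parameters m = 2 and r = 0.2·SD are common conventions, not values from the paper.

```python
import numpy as np
import pywt  # assumption: PyWavelets for the DWT; the paper does not name a library


def dwt_energy_ratios(epoch, wavelet="db4", level=5):
    """Energy ratio of each DWT sub-band for one EEG epoch.

    At 250 Hz, a 5-level db4 decomposition gives roughly
    A5 ~ delta (<3.9 Hz), D5 ~ theta, D4 ~ alpha, D3 ~ beta,
    and D2/D1 ~ high-frequency components.
    """
    coeffs = pywt.wavedec(epoch, wavelet, level=level)  # [A5, D5, D4, D3, D2, D1]
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()


def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): -ln of the probability that templates matching for
    m points also match for m + 1 points (self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)  # common convention, not necessarily the paper's choice

    def count_pairs(length):
        # Use n - m templates for both lengths so the two counts are comparable.
        t = np.array([x[i:i + length] for i in range(n - m)])
        pairs = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)  # Chebyshev distance
            pairs += np.sum(d <= r)
        return pairs

    b = count_pairs(m)
    a = count_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")


# Example: one 30 s epoch at 250 Hz yields 6 energy ratios + 1 entropy value.
epoch = np.random.randn(30 * 250)
features = np.append(dwt_energy_ratios(epoch), sample_entropy(epoch))
```

Concatenating the band-energy ratios with the entropy value gives one feature vector per epoch, which is the form the classifier below consumes.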
Bibliography: 31-1289/TP
LU Tiantian 1,2, WANG Xinzui 2, YU Qian 2, YU Yong 2, JIANG Zhen 1 (1. College of Electromechanical Engineering and Automation, Shanghai University, Shanghai 200072, China; 2. Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou, Jiangsu 215163, China)
sleep staging; Electroencephalogram (EEG); Electromyogram (EMG); Discrete Wavelet Transform (DWT); energy feature; sample entropy; Support Vector Machine (SVM)
In order to achieve accurate automatic sleep staging and meet the need for generalization ability, an automatic sleep staging method based on multi-features of Electroencephalogram (EEG) and Electromyogram (EMG) is proposed. EEG and EMG recordings from the MIT-BIH Polysomnographic Database are chosen as the analysis objects. The Discrete Wavelet Transform (DWT) is used to filter and preprocess the raw data, and the energy ratios of the α, β, θ, δ rhythm waves and of the high-frequency component from EMG are extracted. The nonlinear characteristics of EEG are also extracted with the sample entropy algorithm. The feature parameters are input into a Support Vector Machine (SVM) classifier for sample training and classification. Experimental results show that the staging accuracy of this method reaches 92.94%, on average 3.96% higher than that of the EEG-only sleep staging method, and the average cross-validation accuracy reaches 82.68%, demonstrating good generalization ability.
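For the classification stage, a minimal sketch of an SVM with hold-out and cross-validated accuracy follows. It assumes scikit-learn; the feature matrix is random placeholder data, the RBF kernel and parameter values are illustrative rather than the paper's, and nothing here reproduces the 92.94% or 82.68% figures above.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, train_test_split

# Placeholder data standing in for the real per-epoch features:
# 600 epochs x 7 features (6 band-energy ratios + sample entropy),
# labels 0..4 for the sleep stages (hypothetical encoding).
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 7))
y = rng.integers(0, 5, size=600)

# RBF-kernel SVM with feature standardization; C and gamma are illustrative.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))

# Hold-out evaluation, analogous to the staging accuracy reported above.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf.fit(X_tr, y_tr)
print("hold-out accuracy: %.2f%%" % (100 * clf.score(X_te, y_te)))

# k-fold cross-validation, analogous to the cross-validated accuracy above.
scores = cross_val_score(clf, X, y, cv=5)
print("5-fold CV accuracy: %.2f%%" % (100 * scores.mean()))
```

Standardizing the features before the SVM matters here because the band-energy ratios and the entropy value live on different scales, and RBF-kernel distances are scale-sensitive.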