A multiple classifiers time-serial ensemble pruning algorithm based on the mechanism of forward supplement

Bibliographic Details
Published in: Applied Intelligence (Dordrecht, Netherlands), Vol. 53, No. 5, pp. 5620-5634
Main Authors: Shen, Yan; Jing, Luyi; Gao, Tian; Song, Zizhao; Ma, Ji
Format: Journal Article
Language: English
Published: New York: Springer US, 01.03.2023; Springer Nature B.V.
ISSN: 0924-669X; 1573-7497
DOI: 10.1007/s10489-022-03855-z

Summary: Because of its many practical applications and urgent needs, efficient classification learning from accumulated big data in nonstationary environments has recently become a hot topic in data mining. The LearnNSE algorithm is an important research result in this field. For long-term accumulated big data, a pruned version of LearnNSE, called LearnNSE-Pruned-Age, was proposed and has received widespread attention. However, the pruning mechanism of LearnNSE-Pruned-Age turns out to be imperfect: it loses the core ability of LearnNSE to reuse previously learned classification knowledge. This paper therefore adjusts the ensemble mechanism of LearnNSE and designs a novel one. The new mechanism uses an ensemble of the latest base classifiers to track changes in the data-generating environment, and then selects the old base classifiers that contribute to the current classification task for forward supplementary integration. On this basis, a new pruned algorithm named FLearnNSE-Pruned-Age is proposed. Experimental results show that FLearnNSE-Pruned-Age retains the ability to reuse learned classification knowledge and achieves classification accuracy very close to that of LearnNSE, and even better in some scenarios. In addition, it improves the efficiency of ensemble learning and is suitable for fast classification learning over accumulated big data.
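The forward-supplement idea in the abstract (always keep the newest base classifiers to track the current environment, then re-admit older classifiers that still help on the current data chunk) could be sketched roughly as follows. This is a hypothetical illustration only, not the authors' implementation; the window size `n_latest`, the chance-level threshold `min_acc`, and the toy threshold classifiers are all assumptions:

```python
# Hypothetical sketch of a "forward supplement" pruning step for a
# time-serial ensemble (illustration only; thresholds and names are
# assumptions, not the paper's exact procedure).

def accuracy(clf, X, y):
    """Fraction of samples in the current chunk that clf labels correctly."""
    return sum(clf(x) == t for x, t in zip(X, y)) / len(y)

def forward_supplement(classifiers, X_cur, y_cur, n_latest=3, min_acc=0.5):
    """Keep the newest n_latest base classifiers to track the current
    environment, then supplement them with any older classifier that
    still contributes (accuracy above chance on the current chunk)."""
    latest = classifiers[-n_latest:]          # newest models, always kept
    older = classifiers[:-n_latest]
    supplements = [c for c in older if accuracy(c, X_cur, y_cur) > min_acc]
    return supplements + latest               # pruned ensemble

def ensemble_predict(ensemble, x):
    """Simple majority vote over the pruned ensemble."""
    votes = [clf(x) for clf in ensemble]
    return max(set(votes), key=votes.count)

# Toy demo: "base classifiers" are plain threshold functions over 1-D inputs,
# ordered oldest to newest.
clfs = [
    lambda x: int(x <= 5),   # old model fitted to an outdated concept
    lambda x: int(x > 5),    # old model that still matches the current concept
    lambda x: int(x > 4),    # recent models tracking the current environment
    lambda x: int(x > 5),
    lambda x: int(x > 6),
]
X_cur, y_cur = [1, 3, 5, 7, 9], [0, 0, 0, 1, 1]
pruned = forward_supplement(clfs, X_cur, y_cur)
print(len(pruned), ensemble_predict(pruned, 8))  # the outdated model is dropped
```

In this sketch only the first (outdated) classifier fails the chance-level test on the current chunk and is pruned; the second old classifier is re-admitted as a supplement, which is the knowledge-reuse behavior the abstract attributes to FLearnNSE-Pruned-Age.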