Concept Neural Network Based on Time-Delay Regret for Dynamic Stream Learning

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 47, No. 5, pp. 3796-3814
Main Author: Mi, Yun-Long
Format: Journal Article
Language: English
Published: IEEE, United States, 01.05.2025
ISSN: 0162-8828, 1939-3539, 2160-9292
DOI: 10.1109/TPAMI.2025.3535636

Summary: Dynamic stream learning, which emphasizes high-velocity, single-pass, real-time responses to arriving data, poses new challenges to the standard machine learning paradigm. In particular, existing (deep) neural networks perform poorly when learning on data streams, as they often require access to a large amount of training data. Therefore, to address the limitations of existing neural networks on high-speed data streams in a stationary environment, we propose a novel dynamic neural network, called Concept Neural Network (ConceptNN), which combines concepts with two different online updating strategies. First, we construct a new concept space, in which each concept consists of two components, a feature vector (regarded as the concept's intent) and its weight information (derived from the concept's extent), for training an initial neural network. During training, the sample weight information acts directly on the loss function of ConceptNN. Second, we propose a time-delay regret theory (namely, real-time prediction followed by a delayed update) based on online optimization theory for data stream learning. Finally, based on the time-delay regret theory, we employ two online updating paradigms (i.e., a one-by-one updating strategy and a chunk-by-chunk updating strategy) to update our model as new data continuously arrive in a stream, and subsequently present their upper and lower bounds. Experimental results on various datasets demonstrate that the proposed ConceptNN makes it possible to learn from fast-evolving data streams with better learning performance (considering time cost and accuracy simultaneously) than state-of-the-art dynamic learning algorithms.
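
The two updating paradigms described in the summary can be pictured with a short sketch. The Python code below is a minimal, hypothetical illustration rather than the paper's implementation: the model, the concept-derived sample weights, and the chunk size are assumed names, and the loop only mirrors the "predict in real time, then update with a delay" pattern, with chunk_size=1 giving one-by-one updates and larger values giving chunk-by-chunk updates.

# Hypothetical sketch of the "predict first, update later" stream loop; the
# model, sample weights, and chunk size are illustrative assumptions, not the
# ConceptNN implementation from the paper.
import torch
import torch.nn as nn


def weighted_loss(logits, labels, weights):
    # Per-sample cross-entropy scaled by concept-derived sample weights,
    # echoing "the sample weight information acts directly on the loss".
    per_sample = nn.functional.cross_entropy(logits, labels, reduction="none")
    return (weights * per_sample).mean()


def run_stream(model, optimizer, stream, chunk_size=1):
    """chunk_size=1 -> one-by-one updating; chunk_size>1 -> chunk-by-chunk.

    Each arriving sample is predicted on immediately (real-time response);
    the gradient update is deferred until a full chunk has been buffered
    (the delayed-update part of the time-delay regret idea).
    """
    buffer = []
    for x, y, w in stream:                  # w: concept-derived sample weight
        with torch.no_grad():
            _ = model(x.unsqueeze(0))       # real-time, single-pass prediction
        buffer.append((x, y, w))
        if len(buffer) >= chunk_size:       # delayed update on the buffered chunk
            xs = torch.stack([b[0] for b in buffer])
            ys = torch.stack([b[1] for b in buffer])
            ws = torch.stack([b[2] for b in buffer])
            optimizer.zero_grad()
            loss = weighted_loss(model(xs), ys, ws)
            loss.backward()
            optimizer.step()
            buffer.clear()


# Usage example: a small MLP standing in for the concept-based network.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 3))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
stream = ((torch.randn(10), torch.randint(0, 3, ()), torch.rand(())) for _ in range(100))
run_stream(model, optimizer, stream, chunk_size=10)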