SS-RNN: A Strengthened Skip Algorithm for Data Classification Based on Recurrent Neural Networks

Bibliographic Details
Published in: Frontiers in Genetics, Vol. 12, p. 746181
Main Authors: Cao, Wenjie; Shi, Ya-Zhou; Qiu, Huahai; Zhang, Bengong
Format: Journal Article
Language: English
Published: Frontiers Media S.A., 13.10.2021
ISSN: 1664-8021
DOI: 10.3389/fgene.2021.746181

More Information
Summary: Recurrent neural networks (RNNs) are widely used in time-series prediction and classification. However, they suffer from insufficient memory capacity and difficulty in back-propagating gradients. To address these problems, this paper proposes a new algorithm, SS-RNN, which directly uses multiple pieces of historical information to predict the information at the current time step. This enhances long-term memory and, along the time direction, strengthens the correlation between states at different moments. To incorporate the historical information, we design two processing methods for the SS-RNN, one continuous and one discontinuous. For each method, historical information can be added in two ways: 1) direct addition and 2) weighted addition with function mapping through the activation function. This provides six pathways for fully exploring the effect and influence of historical information on RNNs. Comparing the average accuracy on real datasets against long short-term memory (LSTM), Bi-LSTM, gated recurrent units (GRUs), and MCNN, and computing the main indexes (accuracy, precision, recall, and F1-score), shows that our method improves average accuracy, optimizes the structure of the recurrent neural network, and effectively mitigates the problems of exploding and vanishing gradients.
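To make the mechanism in the summary concrete, the following is a minimal sketch of the skip idea in PyTorch. The Elman-style base update, the fixed skip offsets, and all class and parameter names (SkipRNNCellSketch, skip_offsets, weighted) are illustrative assumptions, not the authors' exact SS-RNN formulation; the sketch only contrasts the two addition styles the summary names: direct addition versus weighted addition mapped through the activation function.

# Minimal sketch: the hidden state at time t is computed from the input
# x_t, the previous hidden state h_{t-1}, and several earlier hidden
# states ("historical information"). The fixed offsets and names below
# are illustrative assumptions, not the paper's exact SS-RNN design.
import torch
import torch.nn as nn


class SkipRNNCellSketch(nn.Module):
    def __init__(self, input_size, hidden_size, skip_offsets=(2, 4, 8),
                 weighted=True):
        super().__init__()
        self.hidden_size = hidden_size
        self.skip_offsets = skip_offsets   # which past states to reuse
        self.weighted = weighted           # False: direct addition
        self.w_x = nn.Linear(input_size, hidden_size)
        self.w_h = nn.Linear(hidden_size, hidden_size)
        if weighted:
            # one learned weight matrix per skip connection
            self.w_skip = nn.ModuleList(
                nn.Linear(hidden_size, hidden_size, bias=False)
                for _ in skip_offsets)

    def forward(self, x):                  # x: (seq_len, batch, input_size)
        batch = x.size(1)
        h = x.new_zeros(batch, self.hidden_size)
        history, outputs = [], []
        for t in range(x.size(0)):
            pre = self.w_x(x[t]) + self.w_h(h)
            for i, d in enumerate(self.skip_offsets):
                if t - d >= 0:
                    past = history[t - d]
                    if self.weighted:
                        # weighted historical state, mapped through
                        # the activation together with the base update
                        pre = pre + self.w_skip[i](past)
                    else:
                        # direct addition of the historical state
                        pre = pre + past
            h = torch.tanh(pre)
            history.append(h)
            outputs.append(h)
        return torch.stack(outputs)        # (seq_len, batch, hidden_size)

For example, SkipRNNCellSketch(input_size=8, hidden_size=32, weighted=False) gives the direct-addition variant, while weighted=True applies a learned weight to each reused historical state; the paper's continuous versus discontinuous treatments of the history are not modeled here.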
Edited by: Robert Friedman, Retired, Columbia, SC, United States
This article was submitted to Computational Genomics, a section of the journal Frontiers in Genetics
Reviewed by: Huang Yu-an, Shenzhen University, China; Hong Peng, South China University of Technology, China
ISSN: 1664-8021
DOI: 10.3389/fgene.2021.746181