A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks

Bibliographic Details
Published in: Circuits, Systems, and Signal Processing, Vol. 37, No. 2, pp. 593–612
Main Authors: Khan, Shujaat; Ahmad, Jawwad; Naseem, Imran; Moinuddin, Muhammad
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.02.2018
ISSN: 0278-081X, 1531-5878
DOI: 10.1007/s00034-017-0572-z

Summary: In this research, we propose a novel algorithm for training recurrent neural networks, called fractional back-propagation through time (FBPTT). Motivated by the potential of fractional calculus, we use a fractional calculus-based gradient descent method to derive the FBPTT algorithm. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems, namely nonlinear system identification, pattern classification, and Mackey–Glass chaotic time series prediction.
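As a rough illustration of the underlying idea (not the paper's exact derivation), fractional gradient descent replaces the ordinary first derivative of the loss in the weight update with a fractional derivative of order alpha. The sketch below assumes a common Caputo-type first-term approximation, which scales the ordinary gradient by |w − c|^(1−α)/Γ(2−α); the function name, the lower terminal `c`, and the `eps` guard are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np
from scipy.special import gamma

def fractional_gradient_step(w, grad, alpha=0.9, lr=0.01, c=0.0, eps=1e-12):
    """One fractional-order gradient descent update (illustrative sketch).

    Replaces the ordinary gradient dJ/dw with a Caputo-type first-term
    approximation of the order-alpha fractional derivative:
        D^alpha J ~ (dJ/dw) * |w - c|**(1 - alpha) / Gamma(2 - alpha)
    For alpha = 1 this reduces to plain gradient descent. `c` (lower
    terminal) and `eps` (numerical guard near w = c) are assumptions.
    """
    frac_grad = grad * np.abs(w - c + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)
    return w - lr * frac_grad

# Toy usage: minimise J(w) = 0.5 * ||w||^2, whose gradient is w itself.
w = np.array([2.0, -1.5])
for _ in range(100):
    w = fractional_gradient_step(w, grad=w, alpha=0.8, lr=0.1)
print(w)  # approaches the minimiser at the origin
```

In the BPTT setting, the same fractional update would be applied to the gradients accumulated by unrolling the recurrent network through time; the fractional order alpha acts as an extra hyperparameter controlling the effective step scaling.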