A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks
| Published in | Circuits, Systems, and Signal Processing Vol. 37; no. 2; pp. 593–612 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | New York: Springer US, 01.02.2018 (Springer Nature B.V.) |
| ISSN | 0278-081X; 1531-5878 |
| DOI | 10.1007/s00034-017-0572-z |
| Summary: | In this research, we propose a novel algorithm for training recurrent neural networks, called fractional back-propagation through time (FBPTT). Considering the potential of fractional calculus, we use a fractional-calculus-based gradient descent method to derive the FBPTT algorithm. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three major estimation problems, namely nonlinear system identification, pattern classification, and Mackey–Glass chaotic time series prediction. |
|---|---|
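
The summary states that the learning rule is derived from fractional-calculus-based gradient descent. As a rough illustration of that idea only, the sketch below shows a combined integer- and fractional-order gradient step on a scalar least-squares problem, assuming the Caputo-derivative chain-rule approximation ∂^α E/∂w^α ≈ (∂E/∂w) · |w|^(1−α) / Γ(2−α) that is common in the fractional-gradient literature. The function name, learning rates, and toy problem are illustrative assumptions, not the paper's exact FBPTT formulation.

```python
import numpy as np
from scipy.special import gamma  # Euler gamma function for the Caputo factor

# Toy problem (assumed for illustration): fit a scalar weight w to
# minimize E(w) = 0.5 * (d - w * x)^2. The fractional term follows the
# common Caputo-derivative chain-rule approximation, not necessarily
# the exact derivation used in the FBPTT paper.

def fractional_gd_step(w, x, d, eta=0.01, eta_f=0.01, alpha=0.9):
    """One combined integer- and fractional-order gradient descent step."""
    e = d - w * x                    # prediction error
    grad = -e * x                    # ordinary first-order gradient dE/dw
    # Fractional gradient approximation:
    # d^alpha E / dw^alpha ≈ (dE/dw) * |w|^(1 - alpha) / Gamma(2 - alpha)
    frac_grad = grad * np.abs(w) ** (1.0 - alpha) / gamma(2.0 - alpha)
    return w - eta * grad - eta_f * frac_grad

# Usage: recover the true weight w* = 2.0 from noisy scalar observations.
rng = np.random.default_rng(0)
w = 0.1
for _ in range(500):
    x = rng.standard_normal()
    d = 2.0 * x + 0.01 * rng.standard_normal()
    w = fractional_gd_step(w, x, d)
print(f"estimated weight: {w:.3f}")  # converges toward 2.0
```

Note the |w|^(1−α) factor makes the fractional term's effective step size depend on the current weight magnitude, which is the mechanism such methods exploit to modulate convergence relative to plain gradient descent.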