Convolutional neural networks based on fractional-order momentum for parameter training
| Published in | Neurocomputing (Amsterdam) Vol. 449; pp. 85 - 99 |
|---|---|
| Main Authors | , , , |
| Format | Journal Article |
| Language | English |
| Published | Elsevier B.V., 18.08.2021 |
| ISSN | 0925-2312 1872-8286 |
| DOI | 10.1016/j.neucom.2021.03.075 |
| Summary: | This paper proposes a parameter training method for convolutional neural networks (CNNs) based on fractional-order momentum. To update CNN parameters more smoothly, the method is built on the Grünwald-Letnikov (G-L) difference operation: the stochastic classical momentum (SCM) algorithm and the adaptive moment estimation (Adam) algorithm are improved by replacing the integer-order difference with a fractional-order difference. Linear and nonlinear schemes for adjusting the fractional order are also discussed, improving the flexibility and adaptive ability of CNN parameter training. The methods are validated on the MNIST and CIFAR-10 datasets, and the experimental results show that they improve recognition accuracy and learning convergence speed compared with the traditional SCM and Adam methods. |
|---|---|
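The abstract's core idea — replacing the integer-order difference in a momentum update with a Grünwald-Letnikov fractional-order difference — can be sketched roughly as follows. This is a minimal illustration only: the class name `FractionalMomentumSGD`, the truncated memory length, and the exact update form are assumptions for exposition, not the paper's published algorithm. It relies on the standard G-L coefficient recursion c_0 = 1, c_k = c_{k-1}·(1 − (α + 1)/k).

```python
import numpy as np

def gl_coefficients(alpha, n_terms):
    """Grünwald-Letnikov coefficients c_k = (-1)^k * binom(alpha, k),
    computed via the recursion c_0 = 1, c_k = c_{k-1} * (1 - (alpha+1)/k)."""
    c = np.empty(n_terms)
    c[0] = 1.0
    for k in range(1, n_terms):
        c[k] = c[k - 1] * (1.0 - (alpha + 1.0) / k)
    return c

class FractionalMomentumSGD:
    """Hypothetical sketch of SGD whose momentum term uses a truncated
    G-L fractional-order difference over past velocities, instead of the
    usual single previous velocity (integer-order difference)."""

    def __init__(self, lr=0.01, momentum=0.9, alpha=0.7, memory=10):
        self.lr = lr
        self.momentum = momentum
        self.coeffs = gl_coefficients(alpha, memory)
        self.history = []  # most-recent-first list of past velocities

    def step(self, w, grad):
        # Fractional-order accumulation: weight the stored velocity
        # history by the G-L coefficients (classical momentum would use
        # only history[0] with weight 1).
        if self.history:
            frac = sum(c * v for c, v in zip(self.coeffs, self.history))
        else:
            frac = np.zeros_like(w)
        v = self.momentum * frac + grad
        self.history.insert(0, v)
        self.history = self.history[: len(self.coeffs)]
        return w - self.lr * v

# Toy usage: minimize f(w) = w^2, whose gradient is 2w.
opt = FractionalMomentumSGD(lr=0.01, momentum=0.9, alpha=0.7, memory=10)
w = np.array([1.0])
for _ in range(300):
    w = opt.step(w, 2.0 * w)
```

Because the G-L coefficients alternate in sign and decay, the fractional term damps the accumulated velocity more gently than a hard integer-order cutoff, which is consistent with the abstract's claim of smoother parameter updates.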