A general backpropagation algorithm for feedforward neural networks learning
| Published in | IEEE Transactions on Neural Networks, Vol. 13, No. 1, pp. 251-254 |
|---|---|
| Main Authors | , , |
| Format | Journal Article |
| Language | English |
| Published | United States: IEEE, 01.01.2002 |
| ISSN | 1045-9227 |
| DOI | 10.1109/72.977323 |
| Summary: | A general backpropagation algorithm is proposed for feedforward neural network learning with time-varying inputs. The Lyapunov function approach is used to rigorously analyze the convergence of the weights toward minima of the error function under the proposed algorithm. Sufficient conditions that guarantee the convergence of the weights for time-varying inputs are derived. It is shown that most commonly used backpropagation learning algorithms are special cases of the developed general algorithm. |
|---|---|
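The summary describes the algorithm only at a high level, and the paper's general scheme and its Lyapunov-based convergence conditions are not reproduced in this record. As a point of reference, below is a minimal sketch of ordinary incremental backpropagation applied to a stream of time-varying inputs, the kind of "commonly used" algorithm the summary says is a special case of the general one. The one-hidden-layer architecture, tanh activation, fixed learning rate `eta`, and toy input stream are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of standard incremental backpropagation on a stream of
# time-varying inputs. Network sizes, activation, and learning rate are
# assumptions for illustration; this is not the paper's general algorithm.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 3, 8, 1                    # assumed sizes
W1 = rng.standard_normal((n_hidden, n_in)) * 0.1   # input -> hidden weights
W2 = rng.standard_normal((n_out, n_hidden)) * 0.1  # hidden -> output weights
eta = 0.05                                         # assumed fixed learning rate


def step(x, d):
    """One incremental weight update on a single sample (x, d)."""
    global W1, W2
    # Forward pass.
    h = np.tanh(W1 @ x)          # hidden activations
    y = W2 @ h                   # linear output
    e = d - y                    # output error

    # Backward pass: gradients of the instantaneous error 0.5 * e^T e.
    grad_W2 = -np.outer(e, h)
    delta_h = (W2.T @ e) * (1.0 - h ** 2)   # error backpropagated through tanh
    grad_W1 = -np.outer(delta_h, x)

    # Plain gradient-descent update (the commonly used special case).
    W2 -= eta * grad_W2
    W1 -= eta * grad_W1
    return float(0.5 * e @ e)


# Toy usage: the input and target vary with time step t.
for t in range(1000):
    x_t = np.array([np.sin(0.01 * t), np.cos(0.02 * t), 1.0])
    d_t = np.array([np.sin(0.01 * t) * np.cos(0.02 * t)])  # assumed target
    loss = step(x_t, d_t)
```

In the Lyapunov-style analyses the summary alludes to, convergence of such updates is typically established by showing that a scalar function of the instantaneous error is non-increasing along the weight trajectory, which yields conditions on the learning rate; the specific sufficient conditions derived in the paper are given in the full text.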