A general backpropagation algorithm for feedforward neural network learning


Bibliographic Details
Published in: IEEE Transactions on Neural Networks, Vol. 13, no. 1, pp. 251-254
Main Authors: Xinghuo Yu; Efe, M.O.; Kaynak, O.
Format: Journal Article
Language: English
Published: United States, IEEE, 01.01.2002
ISSN: 1045-9227
DOI: 10.1109/72.977323


More Information
Summary:A general backpropagation algorithm is proposed for feedforward neural network learning with time varying inputs. The Lyapunov function approach is used to rigorously analyze the convergence of weights, with the use of the algorithm, toward minima of the error function. Sufficient conditions to guarantee the convergence of weights for time varying inputs are derived. It is shown that most commonly used backpropagation learning algorithms are special cases of the developed general algorithm.
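The summary notes that commonly used backpropagation learning rules are special cases of the paper's general algorithm. As context, a minimal sketch of the classical special case, gradient descent on squared error for a one-hidden-layer feedforward network, might look as follows; this is an illustrative reconstruction of standard backpropagation, not the authors' general time-varying formulation, and all names (`train_step`, `lr`) are hypothetical.

```python
# Sketch of classical backpropagation for a one-hidden-layer network.
# Assumption: fixed learning rate lr and squared-error loss; the paper's
# general algorithm subsumes updates of this form as a special case.
import numpy as np

def train_step(W1, W2, x, y, lr=0.1):
    # forward pass: tanh hidden layer, linear output
    h = np.tanh(W1 @ x)
    y_hat = W2 @ h
    e = y_hat - y                       # output error
    # backward pass: gradients via the chain rule
    gW2 = np.outer(e, h)                # dE/dW2
    gh = W2.T @ e                       # error propagated to hidden layer
    gW1 = np.outer(gh * (1 - h**2), x)  # dE/dW1 (tanh' = 1 - tanh^2)
    # gradient-descent weight update
    return W1 - lr * gW1, W2 - lr * gW2, 0.5 * float(e @ e)

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(4, 2))
W2 = rng.normal(scale=0.5, size=(1, 4))
x, y = np.array([1.0, -0.5]), np.array([0.3])
losses = []
for _ in range(50):
    W1, W2, loss = train_step(W1, W2, x, y)
    losses.append(loss)
```

For fixed inputs and a small learning rate the squared error decreases monotonically; the paper's contribution is a Lyapunov-based analysis giving sufficient conditions for such convergence when the inputs vary with time.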