From Zhang Neural Network to Newton Iteration for Matrix Inversion

Bibliographic Details
Published in: IEEE Transactions on Circuits and Systems I: Regular Papers, Vol. 56, No. 7, pp. 1405-1415
Main Authors: Yunong Zhang, Weimu Ma, Binghuang Cai
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2009
ISSN: 1549-8328, 1558-0806
DOI: 10.1109/TCSI.2008.2007065


Summary: Unlike gradient-based neural networks, a special kind of recurrent neural network (RNN) has recently been proposed by Zhang for online matrix inversion. This RNN is designed on the basis of a matrix-valued error function rather than a scalar-valued one, and it is described by implicit dynamics rather than explicit dynamics. In this paper, we develop and investigate a discrete-time model of the Zhang neural network (abbreviated as ZNN for presentation convenience), which is described by a system of difference equations. Comparing it with Newton iteration for matrix inversion, we find that the discrete-time ZNN model incorporates Newton iteration as a special case. Noting this relation, we perform numerical comparisons of ZNN and Newton iteration for matrix inversion in different situations. Different kinds of activation functions and different step-size values are examined for superior convergence and better stability of ZNN. Numerical examples demonstrate the efficacy of both ZNN and Newton iteration for online matrix inversion.
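To illustrate the relation stated in the summary, the sketch below implements classical Newton (Newton-Schulz) iteration for matrix inversion alongside an assumed discrete-time ZNN-style update of the form X_{k+1} = X_k - h * X_k * phi(A X_k - I), where phi is an elementwise activation function and h a step size. With the identity activation and h = 1 this update reduces exactly to Newton iteration, X_{k+1} = X_k (2I - A X_k). The function names, the initialization X_0 = A^T / (||A||_1 ||A||_inf), and the iteration counts are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

def newton_inverse(A, iters=50):
    """Newton (Newton-Schulz) iteration for matrix inversion:
    X_{k+1} = X_k (2I - A X_k)."""
    n = A.shape[0]
    # Standard initialization guaranteeing convergence for invertible A.
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(n)
    for _ in range(iters):
        X = X @ (2 * I - A @ X)
    return X

def znn_inverse(A, h=1.0, activation=np.tanh, iters=200):
    """Assumed discrete-time ZNN-style update:
    X_{k+1} = X_k - h * X_k * phi(A X_k - I),
    with elementwise activation phi and step size h. For phi = identity
    and h = 1 this coincides with Newton iteration above, since
    X - X(AX - I) = X(2I - AX)."""
    n = A.shape[0]
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(n)
    for _ in range(iters):
        X = X - h * X @ activation(A @ X - I)
    return X
```

For example, running both on a well-conditioned matrix with the identity activation and h = 1 produces identical iterates, which is the "special case" relationship the paper highlights; nonlinear activations such as tanh change the transient behavior while keeping the same fixed point X = A^{-1}.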