A fully complex-valued radial basis function network and its learning algorithm

Bibliographic Details
Published in: International journal of neural systems, Vol. 19, No. 4, p. 253
Main Authors: Savitha, R.; Suresh, S.; Sundararajan, N.
Format: Journal Article
Language: English
Published: Singapore, 01.08.2009
ISSN: 0129-0657
DOI: 10.1142/S0129065709002026

Summary: In this paper, a fully complex-valued radial basis function (FC-RBF) network with a fully complex-valued activation function is proposed, and its complex-valued gradient descent learning algorithm is developed. The fully complex activation function of the proposed network, sech(.), satisfies all the properties required of a complex-valued activation function and has Gaussian-like characteristics. It maps C^n → C, unlike the activation functions of existing complex-valued RBF networks, which map C^n → R. Since the performance of a complex RBF network depends on the number of neurons and the initialization of the network parameters, we also propose a K-means-clustering-based neuron selection and center initialization scheme. First, we present a convergence study using the complex XOR problem. Next, we present a synthetic function approximation problem and the two-spiral classification problem. Finally, we present results for two practical applications, viz., a non-minimum-phase channel equalization problem and an adaptive beam-forming problem. The performance of the network is compared with that of other well-known complex-valued RBF networks available in the literature, viz., the split-complex CRBF, CMRAN, and the CELM. The results indicate that the proposed fully complex-valued network has better convergence, approximation, and classification ability.
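To make the C^n → C mapping concrete, the following is a minimal NumPy sketch of a single forward pass through an FC-RBF-style hidden layer with a sech(.) activation. The parameterization (a per-neuron complex scaling vector applied to the center offset before sech) and all names (`csech`, `fc_rbf_forward`, `centers`, `scales`, `weights`) are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def csech(z):
    # Fully complex hyperbolic secant, sech(z) = 1 / cosh(z).
    # Maps C -> C and is Gaussian-like along the real axis near the origin.
    return 1.0 / np.cosh(z)

def fc_rbf_forward(x, centers, scales, weights):
    """One forward pass of a sketch fully complex-valued RBF network.

    x:       (n,)   complex input vector
    centers: (K, n) complex hidden-unit centers
    scales:  (K, n) complex scaling vectors (assumed parameterization)
    weights: (K,)   complex output weights

    Each hidden unit computes phi_k = sech(scales_k . (x - centers_k)),
    a C^n -> C map, unlike split-complex RBF units whose Gaussian
    response is real-valued (C^n -> R).
    """
    z = np.sum(scales * (x - centers), axis=1)  # (K,) complex arguments
    phi = csech(z)                              # (K,) complex responses
    return weights @ phi                        # complex network output
```

Because every intermediate quantity stays complex, a complex-valued gradient descent rule can update `centers`, `scales`, and `weights` directly, without splitting the signal into real and imaginary channels.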