Gradient descent learning of radial basis neural networks


Bibliographic Details
Published in: 1997 IEEE International Conference on Neural Networks, Vol. 3, pp. 1815-1820
Main Author: Karayiannis, N.B.
Format: Conference Proceeding
Language: English
Published: IEEE, 1997
ISBN: 0780341228, 9780780341227
DOI: 10.1109/ICNN.1997.614174


More Information
Summary: This paper presents an axiomatic approach for building RBF neural networks and also proposes a supervised learning algorithm based on gradient descent for their training. This approach results in a broad variety of admissible RBF models, including those employing Gaussian radial basis functions. The form of the radial basis functions is determined by a generator function. A sensitivity analysis explains the failure of gradient descent learning on RBF networks with Gaussian radial basis functions, which are generated by an exponential generator function. The same analysis verifies that RBF networks generated by a linear generator function are much more suitable for gradient descent learning. Experiments involving such RBF networks indicate that the proposed gradient descent algorithm guarantees fast learning and very satisfactory generalization ability.