Wavelet neural networks are asymptotically optimal approximators for functions of one variable

Bibliographic Details
Published in IEEE International Conference on Neural Networks, 1994, Vol. 1, pp. 299-304
Main Authors Kreinovich, V., Sirisaengtaksin, O., Cabrera, S.
Format Conference Proceeding
Language English
Published IEEE, 1994
ISBN 078031901X, 9780780319011
DOI 10.1109/ICNN.1994.374179

Summary: Neural networks are universal approximators. For example, it has been proved (K. Hornik et al., 1989) that for every ε > 0, an arbitrary continuous function on a compact set can be ε-approximated by a 3-layer neural network. This and other results prove that, in principle, any function (e.g., any control) can be implemented by an appropriate neural network. But why neural networks? In addition to neural networks, an arbitrary continuous function can also be approximated by polynomials, etc. What is so special about neural networks that makes them preferable approximators? To compare different approximators, one can compare the number of bits that must be stored in order to reconstruct a function with a given precision ε. For neural networks, we must store weights and thresholds; for polynomials, coefficients; etc. We consider functions of one variable, and show that for some special neurons (corresponding to wavelets), neural networks are optimal approximators in the sense that they require (asymptotically) the smallest possible number of bits.
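
As a rough, hypothetical illustration of the storage comparison described in the abstract (not the authors' construction: the Mexican-hat wavelet, the dyadic grid of scales and shifts, and the target function are all assumptions made here), the following Python sketch fits a one-hidden-layer wavelet network to a function of one variable and reports the achieved precision ε alongside the number of parameters that would have to be stored to reconstruct the approximant:

    import numpy as np

    def mexican_hat(t):
        # Mexican-hat mother wavelet (second derivative of a Gaussian);
        # a common wavelet-neuron activation, assumed here for illustration.
        return (1.0 - t**2) * np.exp(-t**2 / 2.0)

    # Hypothetical target function of one variable on [0, 1].
    xs = np.linspace(0.0, 1.0, 400)
    target = np.sin(2.0 * np.pi * xs) * np.exp(-xs)

    # Wavelet "neurons" psi((x - b)/a) on a fixed dyadic grid of
    # scales a and shifts b (a crude stand-in for a trained network).
    scales, shifts = [], []
    for level in range(4):
        a = 2.0 ** (-level)
        for k in range(2 ** level + 1):
            scales.append(a)
            shifts.append(k * a)

    # Fit only the outer weights w_k by least squares:
    #   f(x) ~ sum_k w_k * psi((x - b_k) / a_k)
    design = np.stack([mexican_hat((xs - b) / a)
                       for a, b in zip(scales, shifts)], axis=1)
    weights, *_ = np.linalg.lstsq(design, target, rcond=None)

    approx = design @ weights
    eps = np.max(np.abs(approx - target))
    # Storage proxy: each neuron needs its weight, scale, and shift.
    n_params = 3 * len(weights)
    print(f"max error eps = {eps:.3e} using {n_params} stored parameters")

Fitting, say, a polynomial with the same least-squares procedure and comparing errors at equal parameter counts gives a crude feel for the comparison; the paper's actual claim concerns the asymptotic bit count as ε tends to 0, which this toy experiment does not establish.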