Relaxed conditions for universal approximation by radial basis function neural networks of Hankel translates
| Published in | AIMS Mathematics Vol. 10; no. 5; pp. 10852 - 10865 |
|---|---|
| Main Author | |
| Format | Journal Article |
| Language | English |
| Published | AIMS Press, 01.05.2025 |
| Subjects | |
| ISSN | 2473-6988 |
| DOI | 10.3934/math.2025493 |
| Summary: | Radial basis function neural networks (RBFNNs) of Hankel translates of order $ \mu > -1/2 $ with varying widths whose activation function $ \sigma $ is a.e. continuous, such that $ z^{-\mu-1/2}\sigma(z) $ is locally essentially bounded and not an even polynomial, are shown to enjoy the universal approximation property (UAP) in appropriate spaces of continuous and integrable functions. In this way, the requirement that $ \sigma $ be continuous for this kind of networks to achieve the UAP is weakened, and some results that hold true for RBFNNs of standard translates are extended to RBFNNs of Hankel translates. |
|---|---|
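The summary above concerns the universal approximation property (UAP) of RBF networks built from Hankel translates, which generalizes the classical result for standard translates. As an illustration of that classical setting only (not of the paper's Hankel-translate construction), the following sketch fits a one-dimensional target with a Gaussian-activation RBF network via least squares; the centers, widths, and target function are arbitrary choices for demonstration.

```python
import numpy as np

# Illustrative sketch: a classical RBF network of standard translates with a
# Gaussian activation, approximating a smooth 1-D target by linear least
# squares. This mirrors the standard-translate UAP setting that the paper
# extends to Hankel translates; it does NOT implement the Hankel-translate
# networks studied there.

def rbf_design(x, centers, widths):
    """Design matrix Phi[i, j] = sigma(|x_i - c_j| / w_j) with Gaussian sigma."""
    return np.exp(-((x[:, None] - centers[None, :]) / widths[None, :]) ** 2)

x = np.linspace(0.0, 1.0, 200)
target = np.sin(2 * np.pi * x)            # target function to approximate

centers = np.linspace(0.0, 1.0, 25)       # translate locations
widths = np.full_like(centers, 0.08)      # varying widths are allowed; fixed here

Phi = rbf_design(x, centers, widths)
coef, *_ = np.linalg.lstsq(Phi, target, rcond=None)
approx = Phi @ coef

max_err = np.max(np.abs(approx - target))
print(max_err)
```

Increasing the number of centers (and shrinking the widths accordingly) drives the uniform error down, which is the behavior the UAP formalizes for suitable activations.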