Relaxed conditions for universal approximation by radial basis function neural networks of Hankel translates

Bibliographic Details
Published in: AIMS Mathematics, Vol. 10, no. 5, pp. 10852-10865
Main Author: Marrero, Isabel
Format: Journal Article
Language: English
Published: AIMS Press, 01.05.2025
ISSN: 2473-6988
DOI: 10.3934/math.2025493

Summary: Radial basis function neural networks (RBFNNs) of Hankel translates of order $ \mu > -1/2 $ with varying widths, whose activation function $ \sigma $ is a.e. continuous and such that $ z^{-\mu-1/2}\sigma(z) $ is locally essentially bounded and not an even polynomial, are shown to enjoy the universal approximation property (UAP) in appropriate spaces of continuous and integrable functions. In this way, the requirement that $ \sigma $ be continuous for networks of this kind to achieve the UAP is weakened, and some results that hold true for RBFNNs of standard translates are extended to RBFNNs of Hankel translates.
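
The record does not display the network class explicitly. As an illustrative sketch only, assuming the Hankel-translation framework usual in this line of work (the operator $ \tau_y $, the dilation $ \sigma_w $, and the parameter names below are not taken from the record), a network of Hankel translates of order $ \mu $ with varying widths can be pictured as
$$
N(t) = \sum_{k=1}^{n} c_k \, (\tau_{y_k} \sigma_{w_k})(t), \qquad \sigma_{w}(z) := \sigma(w z), \quad w_k > 0, \; y_k \geq 0, \; c_k \in \mathbb{R},
$$
where $ \tau_{y} $ denotes the Hankel translation of order $ \mu $, which plays the role of the ordinary shift $ f(\cdot - y) $ in this setting, and the varying widths are the parameters $ w_k $. The UAP then asserts that, under the stated conditions on $ \sigma $, finite sums of this form are dense in the relevant spaces of continuous and integrable functions on $ (0, \infty) $.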