Normalized LMS algorithm and data-selective strategies for adaptive graph signal estimation
| Published in | Signal Processing, Vol. 167, p. 107326 |
|---|---|
| Main Authors | , |
| Format | Journal Article |
| Language | English |
| Published | Elsevier B.V., 01.02.2020 |
| ISSN | 0165-1684, 1872-7557 |
| DOI | 10.1016/j.sigpro.2019.107326 |
Summary:

• A graph signal processing (GSP) normalized least-mean-squares (NLMS) adaptive algorithm for online estimation of bandlimited graph signals (GS) from a reduced number of noisy measurements is proposed.
• A range of values is derived for the underlying convergence factor that guarantees the algorithm's stability and its ability to provide asymptotically unbiased GS estimates.
• Closed-form expressions are derived for estimating the mean-squared error (MSE) and mean-squared deviation (MSD) of the resulting online estimator.
• The application of data-selective (DS) schemes to GSP adaptive filtering is proposed, along with two strategies for data novelty tests: one based on the values of the individual estimation-error components, namely the component-wise error constraint strategy, and another that uses the squared ℓ2-norm of the estimation-error vector as a reference, the so-called ℓ2-norm error constraint strategy.
• It is shown how to set the DS constraint parameters so that the update probability can be estimated accurately when using not only the proposed GSP NLMS algorithm but also the GSP LMS and RLS algorithms.
This work proposes a normalized least-mean-squares (NLMS) algorithm for online estimation of bandlimited graph signals (GS) using a reduced number of noisy measurements. As in the classical adaptive filtering framework, the resulting GS estimation technique converges faster than the least-mean-squares (LMS) algorithm while being less complex than the recursive least-squares (RLS) algorithm, both recently recast as adaptive estimation strategies for the GS framework. Detailed steady-state mean-squared error and deviation analyses are provided for the proposed NLMS algorithm, and are also employed to complement previous analyses on the LMS and RLS algorithms. Additionally, two different time-domain data-selective (DS) strategies are proposed to reduce the overall computational complexity by only performing updates when the input signal brings enough innovation. The parameter setting of the algorithms is performed based on the analysis of these DS strategies, and closed formulas are derived for an accurate evaluation of the update probability when using different adaptive algorithms. The theoretical results predicted in this work are corroborated with high accuracy by numerical simulations.
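As a rough illustration of the kind of estimator the abstract describes, the sketch below runs an NLMS-style update for a bandlimited graph signal observed on a sampled node set. The graph, bandwidth F, sampled set S, step size mu, and the exact form of the recursion are all assumptions made for this toy example; it is one plausible reading of a GSP NLMS update, not the paper's exact algorithm or its stability/MSD analysis.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's exact formulation):
# online NLMS-style estimation of a bandlimited graph signal from noisy
# samples taken on a subset S of nodes. U_F spans the (known) frequency
# support, D is the diagonal node-sampling matrix, and mu is the
# convergence factor whose stable range the paper derives.
rng = np.random.default_rng(0)

N, F = 60, 8                                   # nodes, bandwidth (toy values)
A = rng.random((N, N))
A = (A + A.T) / 2                              # toy symmetric adjacency
np.fill_diagonal(A, 0.0)                       # no self-loops
L = np.diag(A.sum(1)) - A                      # combinatorial Laplacian
_, U = np.linalg.eigh(L)
U_F = U[:, :F]                                 # low-frequency GFT basis

x_true = U_F @ rng.standard_normal(F)          # bandlimited graph signal
S = rng.choice(N, size=30, replace=False)      # sampled node set
D = np.zeros((N, N))
D[S, S] = 1.0                                  # sampling matrix

mu = 0.5                                       # convergence factor (assumed)
x_hat = np.zeros(N)
for k in range(200):
    y = D @ (x_true + 0.1 * rng.standard_normal(N))   # noisy sampled measurement
    e = y - D @ x_hat                                  # estimation error on S
    # NLMS-style step: normalize the frequency-domain update by the
    # Gram matrix of the sampled basis vectors.
    s_step = np.linalg.solve(U_F.T @ D @ U_F, U_F.T @ e)
    x_hat = x_hat + mu * (U_F @ s_step)

print("final MSD:", np.mean((x_hat - x_true) ** 2))
```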
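The data-selective strategies mentioned in the highlights can likewise be pictured with a small sketch. The two checks below (component-wise and ℓ2-norm error constraints) are hedged illustrations; the threshold names gamma_cw and gamma_l2 are hypothetical, and the paper itself derives how such constraint parameters should be set so that the update probability can be predicted for the GSP NLMS, LMS, and RLS algorithms.

```python
import numpy as np

# Sketch of the two DS novelty tests described in the summary, under
# assumed threshold choices (gamma_cw, gamma_l2 are hypothetical names).
def component_wise_check(e_S: np.ndarray, gamma_cw: float) -> bool:
    """Update only if some error component on the sampled set exceeds gamma_cw."""
    return bool(np.any(np.abs(e_S) > gamma_cw))

def l2_norm_check(e_S: np.ndarray, gamma_l2: float) -> bool:
    """Update only if the squared l2-norm of the error exceeds gamma_l2."""
    return bool(np.dot(e_S, e_S) > gamma_l2)

# Toy usage: skip the adaptive update when the test signals no innovation.
rng = np.random.default_rng(1)
e_S = 0.1 * rng.standard_normal(30)            # error vector on sampled nodes
if l2_norm_check(e_S, gamma_l2=0.5):
    print("enough innovation: perform the GSP NLMS/LMS/RLS update")
else:
    print("little innovation: keep the previous estimate, saving computation")
```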