NLMS algorithm with decreasing step size for adaptive IIR filters

Bibliographic Details
Published in: Signal Processing, Vol. 82, No. 10, pp. 1305-1316
Main Author: Lai, Ching-An
Format: Journal Article
Language: English
Published: Amsterdam: Elsevier B.V., 01.10.2002
ISSN: 0165-1684
EISSN: 1872-7557
DOI: 10.1016/S0165-1684(02)00275-X

Summary: In this paper, we modify the GLMS algorithm into the proposed LMS-SAS algorithm, which converges more effectively to the global minimum of the mean-square output error (MSE) objective function. We also derive the infinite-impulse-response normalized least-mean-square algorithm (IIR-NLMS), whose behavior is similar to that of the LMS-SAS algorithm. The GLMS algorithm achieves its global search capability by appending a random perturbing noise to the LMS algorithm. Similarly, in the proposed LMS-SAS algorithm we scale this perturbing noise by the MSE objective function. For the NLMS algorithm, we use the gradient estimation error, which arises naturally in the adaptive process, to act as perturbing noise. We show, theoretically and experimentally, that the LMS-SAS and NLMS algorithms converge to the global minimum.
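The summary above describes an NLMS-style update whose step size decreases over time. As a minimal illustration of that idea (not the paper's exact method), the sketch below adapts a simple FIR model under NLMS with a decaying step size; the paper itself treats adaptive IIR filters, and all parameter names (`mu0`, `decay`, `eps`) are hypothetical tuning choices.

```python
import numpy as np

def nlms_identify(x, d, num_taps=4, mu0=0.5, decay=1e-3, eps=1e-8):
    """Illustrative NLMS update with a decreasing step size.

    Adapts an FIR model for brevity; the cited paper addresses
    adaptive IIR filters. Parameters are illustrative only.
    """
    w = np.zeros(num_taps)
    errors = []
    for n in range(num_taps, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # regressor: [x[n], x[n-1], ...]
        e = d[n] - w @ u                      # a priori output error
        mu = mu0 / (1.0 + decay * n)          # step size shrinks with time
        w += mu * e * u / (u @ u + eps)       # normalized gradient step
        errors.append(e)
    return w, np.array(errors)

# usage: identify an unknown 4-tap system driven by white noise
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h_true = np.array([0.9, -0.4, 0.2, 0.1])
d = np.convolve(x, h_true)[:len(x)]           # noiseless desired signal
w_hat, err = nlms_identify(x, d)
```

In this noiseless system-identification setting the estimated taps `w_hat` approach `h_true` and the a priori error decays toward zero; the decreasing step size trades early convergence speed for a smaller steady-state misadjustment.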