A new class of self‐normalising LMS algorithms

Bibliographic Details
Published in: Electronics Letters, Vol. 58, No. 12, pp. 492–494
Main Authors: Ploder, Oliver; Lang, Oliver; Paireder, Thomas; Motz, Christian; Huemer, Mario
Format: Journal Article
Language: English
Published: Stevenage: John Wiley & Sons, Inc., 01.06.2022
ISSN: 0013-5194
eISSN: 1350-911X
DOI: 10.1049/ell2.12498

Summary: Many researchers and practitioners make heavy use of the least mean squares (LMS) algorithm as an efficient adaptive filter suitable for a multitude of problems. Despite its versatility and efficiency, a drawback of this algorithm is that the adaptation rate, i.e. the step-size, has to be chosen very carefully to achieve the desired result (an optimum compromise between fast adaptation and low steady-state error). This choice was simplified by the invention of the normalised LMS, which bounds the step-size and guarantees convergence. However, the optimum choice of the normalisation becomes non-trivial if the system to be approximated is part of a larger model, e.g. cascaded filters or linear paths followed by nonlinearities. Such cases usually require approximations or worst-case estimates to yield a normalised update algorithm, which may result in sub-optimal performance. To counteract this problem, a new class of LMS algorithms that automatically choose their own normalisation terms, the so-called self-normalising LMS, is introduced. Simulations show that this new algorithm not only outperforms state-of-the-art solutions in terms of steady-state performance in a cascaded filter scenario but also converges as fast as all other considered algorithms.
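For context, the classical normalised LMS (NLMS) update that the abstract builds on can be sketched as follows. This is a minimal system-identification example with an assumed filter length and step-size; it illustrates only the standard NLMS normalisation by the input power, not the paper's self-normalising variant.

```python
# Sketch of the classical NLMS update referenced in the abstract:
#   w <- w + mu / (eps + ||u||^2) * e * u
# Dividing by the regressor energy ||u||^2 bounds the effective step-size,
# which is the convergence guarantee the abstract mentions.
# N, mu, eps and the unknown system are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 8                                    # adaptive filter length (assumed)
h_true = rng.standard_normal(N)          # hypothetical unknown FIR system
x = rng.standard_normal(5000)            # white input signal
d = np.convolve(x, h_true)[:len(x)]      # desired signal (noise-free here)

w = np.zeros(N)                          # adaptive coefficients
mu, eps = 0.5, 1e-8                      # step-size and regularisation
for n in range(N, len(x)):
    u = x[n - N + 1:n + 1][::-1]         # current input regressor
    e = d[n] - w @ u                     # a-priori error
    w += mu / (eps + u @ u) * e * u      # normalised LMS update

print(np.allclose(w, h_true, atol=1e-3))
```

In the noise-free case above, the coefficients converge to the unknown system regardless of the input scaling; with a plain LMS update the step-size would instead have to be tuned to the input power, which is the difficulty the paper addresses for more complex (e.g. cascaded) models.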