On the convergence behavior of the LMS and the normalized LMS algorithms
| Published in | IEEE Transactions on Signal Processing, Vol. 41, no. 9, pp. 2811-2825 |
|---|---|
| Main Author | D. T. M. Slock |
| Format | Journal Article |
| Language | English |
| Published | New York, NY: IEEE (Institute of Electrical and Electronics Engineers), 01.09.1993 |
| ISSN | 1053-587X | 
| DOI | 10.1109/78.236504 | 
| Summary: | It is shown that the normalized least mean square (NLMS) algorithm is a potentially faster converging algorithm than the LMS algorithm when the design of the adaptive filter is based on the usually quite limited knowledge of its input signal statistics. A very simple model for the input signal vectors is proposed that greatly simplifies analysis of the convergence behavior of the LMS and NLMS algorithms. Using this model, answers can be obtained to questions for which no answers are currently available using other (perhaps more realistic) models. Examples are given to illustrate that, even quantitatively, the answers obtained can be good approximations. It is emphasized that the convergence of the NLMS algorithm can be speeded up significantly by employing a time-varying step size. The optimal step-size sequence can be specified a priori for the case of a white input signal with arbitrary distribution. (An illustrative sketch of the two update rules is given below.) |
|---|---|
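The two update rules contrasted in the abstract can be summarized in a few lines. The following is a minimal sketch, not taken from the paper, assuming a linear FIR adaptive filter identifying an unknown system from white input; the function names, the step sizes `mu`, and the regularization constant `eps` are illustrative assumptions only.

```python
import numpy as np

def lms_step(w, x, d, mu=0.05):
    """One LMS update with a fixed step size mu (illustrative value)."""
    e = d - w @ x                      # a-priori estimation error
    return w + mu * e * x, e

def nlms_step(w, x, d, mu=1.0, eps=1e-8):
    """One NLMS update: step size normalized by the input energy ||x||^2."""
    e = d - w @ x
    return w + (mu / (eps + x @ x)) * e * x, e

# Tiny identification example: estimate a 4-tap FIR system from white input.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -0.5, 0.25, 0.1])
w_lms = np.zeros(4)
w_nlms = np.zeros(4)
for _ in range(2000):
    x = rng.standard_normal(4)                      # white input vector
    d = w_true @ x + 0.01 * rng.standard_normal()   # noisy desired response
    w_lms, _ = lms_step(w_lms, x, d)
    w_nlms, _ = nlms_step(w_nlms, x, d)
print("LMS  weight error:", np.linalg.norm(w_true - w_lms))
print("NLMS weight error:", np.linalg.norm(w_true - w_nlms))
```

The normalization by `x @ x` in `nlms_step` makes the effective step size independent of the instantaneous input power, which is the mechanism behind the abstract's claim of potentially faster convergence; the time-varying optimal step-size sequence discussed in the paper would replace the fixed `mu` shown here.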