The steady-state of the (Normalized) LMS is Schur convex

Khaled A. Al-Hujaili, Tareq Y. Al-Naffouri, Muhammad Moinuddin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



In this work, we demonstrate how the theory of majorization and Schur-convexity can be used to assess the impact of input spread on the Mean Square Error (MSE) performance of adaptive filters. First, we show that the concept of majorization can be utilized to measure the spread in input regressors and subsequently order the input regressors according to their spread. Second, we prove that the MSE of the Least Mean Squares (LMS) and Normalized LMS (NLMS) algorithms is Schur-convex; that is, the MSE of the LMS and NLMS algorithms preserves the majorization order of the inputs. This provides an analytical justification for why, and by how much, the MSE performance of the LMS and NLMS algorithms deteriorates as the spread in the input increases. © 2016 IEEE.
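As a concrete illustration of the ordering concept in the abstract, the following minimal Python sketch implements the standard majorization test (descending partial sums of one vector dominating those of another, with equal totals). The eigenvalue vectors below are hypothetical examples, not data from the paper; they represent two input-correlation profiles with equal total input power but different spread:

```python
import numpy as np

def majorizes(a, b, tol=1e-12):
    """Return True if vector a majorizes vector b: the descending
    partial sums of a dominate those of b, and both sums are equal."""
    a_sorted = np.sort(np.asarray(a, dtype=float))[::-1]
    b_sorted = np.sort(np.asarray(b, dtype=float))[::-1]
    if abs(a_sorted.sum() - b_sorted.sum()) > tol:
        return False  # majorization requires equal total sums
    return bool(np.all(np.cumsum(a_sorted) >= np.cumsum(b_sorted) - tol))

# Hypothetical eigenvalue profiles of two input-correlation matrices
# with equal trace (equal input power); the first is "more spread".
lam_spread = np.array([3.0, 0.6, 0.3, 0.1])
lam_flat   = np.array([1.0, 1.0, 1.0, 1.0])

print(majorizes(lam_spread, lam_flat))  # True
print(majorizes(lam_flat, lam_spread))  # False
```

Under the paper's result, a Schur-convex MSE means the filter driven by the `lam_spread` profile would incur an MSE at least as large as the one driven by `lam_flat`.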
Original language: English (US)
Title of host publication: 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Number of pages: 5
ISBN (Print): 9781479999880
State: Published - Jun 24 2016


