Using the tools of nonlinear system theory, we examine several common nonlinear variants of the LMS algorithm and derive a persistence-of-excitation criterion for local exponential stability. The condition is tight when the inputs are periodic, and a generic counterexample is demonstrated which gives (local) instability for a large class of such nonlinear versions of LMS, specifically, those which utilize a nonlinear data function. The presence of a nonlinear error function is found to be relatively benign in that it does not affect the stability of the error system; rather, it defines the cost function the algorithm tends to minimize. Specific examples include the dead zone modification, the cubed data nonlinearity, the cubed error nonlinearity, the signed regressor algorithm, and a single-layer version of the backpropagation algorithm.
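The variants named above can all be viewed as one generalized LMS update, w ← w + μ f(e) g(x), where f is an error nonlinearity and g a data (regressor) nonlinearity. The following minimal sketch illustrates this framing; the function names, the dead-zone width, and the demo parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def lms_step(w, x, d, mu, f=lambda e: e, g=lambda x: x):
    """One update of a generalized LMS adaptive filter.

    f: error nonlinearity  (per the abstract, shapes the cost minimized)
    g: data nonlinearity   (per the abstract, can destabilize the error system)
    With f and g as identities this reduces to ordinary LMS.
    """
    e = d - w @ x                      # a-priori output error
    return w + mu * f(e) * g(x), e

# Illustrative instances of the variants mentioned in the abstract
# (the dead-zone width 0.1 is an arbitrary example value):
sign_regressor = dict(g=np.sign)                              # signed regressor
cubed_error    = dict(f=lambda e: e**3)                       # cubed error
cubed_data     = dict(g=lambda x: x**3)                       # cubed data
dead_zone      = dict(f=lambda e: e if abs(e) > 0.1 else 0.0) # dead zone

# Demo: ordinary LMS identifying a 2-tap system from noise-free data.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3])
w = np.zeros(2)
for _ in range(2000):
    x = rng.standard_normal(2)
    w, e = lms_step(w, x, w_true @ x, mu=0.05)
print(w)  # approaches w_true
```

Swapping `f`/`g` via the dictionaries above (e.g. `lms_step(w, x, d, mu, **sign_regressor)`) makes the structural distinction in the abstract concrete: the nonlinearity sits either on the error path or on the regressor path.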
This study of the (mis)behavior of adaptive algorithms first appeared in the IEEE Transactions on Signal Processing, September 1992.