Gradient algorithms are popular because they are simple, easy to understand, and solve a large class of problems. The performance function and the adaptive weights determine the nature of the performance surface. When performance is a quadratic function of the weight settings, the surface is bowl-shaped, with a single minimum at the 'bottom of the bowl.' In this case, local optimization methods, such as gradient methods, can find the bottom. When the performance surface is irregular, with several relative optima or saddle points, gradient-based minimum-seeking algorithms can get stuck in a local minimum. The gradient-based algorithms considered in this chapter are as follows: least mean square (LMS); Howells-Applebaum loop; differential steepest descent (DSD); accelerated gradient (AG); and steepest descent for power minimization.
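The quadratic-bowl case can be illustrated with a minimal sketch of steepest descent on a performance surface of the form J(w) = wᵀRw − 2pᵀw + const (the shape of a mean-square-error surface); the matrix R, vector p, step size, and iteration count below are illustrative assumptions, not values from this chapter:

```python
import numpy as np

# Illustrative quadratic bowl: J(w) = w^T R w - 2 p^T w + const.
# R and p are assumed example values (R positive definite, so the
# surface has exactly one minimum at the bottom of the bowl).
R = np.array([[2.0, 0.5],
              [0.5, 1.0]])
p = np.array([1.0, 0.5])

w = np.zeros(2)   # initial weight vector
mu = 0.1          # step size; convergence requires 0 < mu < 1/lambda_max(R)

for _ in range(500):
    grad = 2.0 * (R @ w - p)   # gradient of J with respect to w
    w = w - mu * grad          # steepest-descent update

# The bottom of the bowl solves R w = p (the Wiener solution).
w_opt = np.linalg.solve(R, p)
print(np.allclose(w, w_opt, atol=1e-6))  # prints True
```

Because the surface here is quadratic with a single minimum, the iteration converges to the optimum from any starting point; on an irregular surface with several relative optima, the same update would simply descend into whichever basin contains the initial weights.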