Adaptive Algorithm Performance Summary


Author(s): Robert A. Monzingo; Randy L. Haupt; Thomas W. Miller
Source: Introduction to Adaptive Arrays, 2011
Publication date: January 2011

Chapters 4 through 9 considered the transient response characteristics and implementation considerations associated with the different classes of algorithms that are widely used in adaptive array applications. This chapter summarizes the principal characteristics of each algorithm class before considering some practical problems associated with adaptive array system design. In each chapter of Part 2, the convergence speed of an algorithm representing a distinct adaptation philosophy was compared with that of the least mean squares (LMS) algorithm; this chapter compares the convergence speeds of the various algorithms for a selected example. Since the misadjustment versus rate-of-adaptation trade-offs for the random search algorithms, namely linear random search (LRS), accelerated random search (ARS), and guided accelerated random search (GARS), and for the differential steepest descent (DSD) algorithm of Chapter 4 are unfavorable compared with the LMS algorithm, recourse to these methods would be taken only if their meager instrumentation requirements were regarded as a cardinal advantage or if nonunimodal performance surfaces were of concern. By contrast, the Howells-Applebaum maximum signal-to-noise ratio (SNR) algorithm has a misadjustment versus convergence-speed trade-off that is nearly identical to that of the LMS algorithm.
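As a concrete illustration (not taken from the chapter), the sketch below shows a minimal complex LMS weight update for an array, where the step size `mu` is the parameter that governs the misadjustment versus convergence-speed trade-off the abstract refers to: a larger `mu` converges faster but settles with larger residual (misadjustment). All names here (`lms_update`, `w_opt`, the simulated data) are illustrative assumptions, not the book's notation.

```python
import numpy as np

def lms_update(w, x, d, mu):
    """One complex LMS iteration: y = w^H x, e = d - y, w <- w + mu * x * e*."""
    y = np.vdot(w, x)              # array output, w^H x (vdot conjugates w)
    e = d - y                      # error against the reference signal d
    return w + mu * x * np.conj(e), e

# Toy demonstration: adapt toward a fixed "optimal" weight vector.
rng = np.random.default_rng(0)
N = 4                              # number of array elements (assumed)
w_opt = rng.standard_normal(N) + 1j * rng.standard_normal(N)
w = np.zeros(N, dtype=complex)
errs = []
for _ in range(2000):
    x = rng.standard_normal(N) + 1j * rng.standard_normal(N)  # snapshot
    d = np.vdot(w_opt, x)          # noiseless reference from target weights
    w, e = lms_update(w, x, d, mu=0.01)
    errs.append(abs(e) ** 2)

print("initial MSE:", np.mean(errs[:100]), "final MSE:", np.mean(errs[-100:]))
```

In this noiseless toy case the squared error decays toward zero; with noisy snapshots it would instead floor at a level set by `mu`, which is the misadjustment behavior the chapter's comparisons quantify.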

Inspec keywords: random processes; adaptive signal processing; gradient methods; least mean squares methods; array signal processing

Other keywords: LRS algorithm; DSD algorithm; ARS algorithm; guided accelerated random search algorithm; linear random search algorithm; adaptive array system design; SNR; Howells-Applebaum maximum signal-to-noise ratio algorithm; nonunimodal performance surfaces; LMS algorithm; accelerated random search algorithm; differential steepest descent algorithm; distinct adaptation philosophy; least mean square algorithm; GARS algorithm

Subjects: Interpolation and function approximation (numerical analysis); Other topics in statistics; Signal processing theory; Signal processing and detection
