Gradient-Based Algorithms


Gradient algorithms are popular because they are simple, easy to understand, and solve a large class of problems. The performance measure and the adaptive weights determine the nature of the performance surface. When the performance measure is a quadratic function of the weight settings, the surface is bowl-shaped, with a single minimum at the 'bottom of the bowl'; in this case, local optimization methods such as gradient methods can find the bottom. When the performance surface is irregular, with several relative optima or saddle points, gradient-based minimum-seeking algorithms can become trapped in a local minimum. The gradient-based algorithms considered in this chapter are: least mean square (LMS); the Howells-Applebaum loop; differential steepest descent (DSD); accelerated gradient (AG); and steepest descent for power minimization.
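As a rough illustration of the gradient-descent idea behind the LMS algorithm listed above, the following sketch identifies an unknown FIR system by descending the quadratic (bowl-shaped) mean-square-error surface. The filter length, step size, and test signals here are illustrative assumptions, not values from the chapter:

```python
import numpy as np

# Minimal LMS sketch (illustrative assumptions: 4-tap FIR filter,
# noiseless desired signal, step size mu small enough for convergence).
rng = np.random.default_rng(0)
n_taps, n_samples, mu = 4, 2000, 0.01

h_true = np.array([0.5, -0.3, 0.2, 0.1])   # unknown system to identify
x = rng.standard_normal(n_samples)          # input signal
d = np.convolve(x, h_true)[:n_samples]      # desired (reference) signal

w = np.zeros(n_taps)                        # adaptive weights
for n in range(n_taps, n_samples):
    u = x[n - n_taps + 1:n + 1][::-1]       # recent samples, newest first
    e = d[n] - w @ u                        # error = desired - filter output
    w = w + mu * e * u                      # LMS update: step along -gradient

print(np.round(w, 3))                       # weights should approach h_true
```

Because the error surface is quadratic in the weights, each stochastic gradient step moves the weight vector toward the bottom of the bowl; with an irregular (non-quadratic) surface, the same update rule offers no such guarantee.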

Chapter Contents:

  • 4.1 Introductory Concepts
  • 4.2 The LMS Algorithm
  • 4.3 The Howells-Applebaum Adaptive Processor
  • 4.4 Introduction of Main Beam Constraints
  • 4.5 Constraint for the Case of Known Desired Signal Power Level
  • 4.6 The DSD Algorithm
  • 4.7 The Accelerated Gradient Approach (AG)
  • 4.8 Gradient Algorithm with Constraints
  • 4.9 Simulation Results
  • 4.10 Phase-Only Adaptive Nulling Using Steepest Descent
  • 4.11 Summary and Conclusions
  • 4.12 Problems
  • 4.13 References

Inspec keywords: antenna theory; adaptive antenna arrays; gradient methods

Other keywords: performance surface; local optimization methods; differential steepest descent; transient response; accelerated gradient; power minimization; adaptive weight; least mean square; gradient based minimum seeking algorithms; bottom of the bowl; adaptive arrays; Howells-Applebaum loop; gradient based algorithms

Subjects: Optimisation techniques; Antenna arrays; Antenna theory

