Appendix A: Least Squares Polynomials and Data Fitting

From: Sensors, Actuators, and their Interfaces: A Multidisciplinary Introduction

Least squares polynomial fitting, or polynomial regression, is a method of fitting a polynomial to a set of data. Suppose we have a set of n points (x_i, y_i) to which we wish to fit a polynomial of the form

y(x) = a_0 + a_1 x + a_2 x^2 + \cdots + a_m x^m. \quad (A.1)

Passing a polynomial through a set of data means selecting the coefficients so as to minimize, in a global sense, the distance between the values of the function y(x) and the data values y_i. This is done through the least squares method by first writing the "distance" function:

S = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m \right)^2. \quad (A.2)

To minimize this function we calculate the partial derivative with respect to each unknown coefficient and set it to zero. For the kth coefficient (k = 0, 1, 2, ..., m) we write

\frac{\partial S}{\partial a_k} = -2 \sum_{i=1}^{n} x_i^k \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m \right) = 0 \quad (A.3)

or

\sum_{i=1}^{n} x_i^k \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m \right) = 0. \quad (A.4)

Repeating this for all m + 1 coefficients results in m + 1 equations from which the coefficients a_0 through a_m can be evaluated. We show here how to derive the coefficients for a first-order (linear) and a second-order (quadratic) polynomial least squares fit, since these are the most commonly used forms. We assume n data points (x_i, y_i) as above.
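The procedure above can be sketched in code: writing equation (A.4) once for each k = 0, ..., m gives a linear system in the coefficients (the normal equations), which can be solved directly. The following is a minimal sketch, not from the book; the function name `fit_polynomial` and the sample data are illustrative assumptions.

```python
import numpy as np

def fit_polynomial(x, y, m):
    """Fit a degree-m polynomial to points (x_i, y_i) by solving the
    normal equations obtained from setting dS/da_k = 0, as in (A.4).
    Note: illustrative sketch, not the book's own code."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Design matrix: column k holds x_i^k, k = 0, 1, ..., m
    A = np.vander(x, m + 1, increasing=True)
    # Normal equations (A^T A) a = A^T y: one equation per coefficient a_k
    return np.linalg.solve(A.T @ A, A.T @ y)  # [a_0, a_1, ..., a_m]

# Example: recover a quadratic y = 1 + 2x + 3x^2 from exact samples
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1 + 2 * x + 3 * x**2
print(fit_polynomial(x, y, 2))  # approximately [1, 2, 3]
```

For ill-conditioned data, solving the normal equations directly can lose precision; a QR-based solver such as `numpy.linalg.lstsq` is the usual alternative, but the direct form shown here mirrors the derivation in the text.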

Chapter Contents:

  • Appendix A

Inspec keywords: least squares approximations; polynomial approximation; regression analysis; data handling; computational complexity


Subjects: Numerical approximation and analysis; Interpolation and function approximation (numerical analysis); Probability theory, stochastic processes, and statistics; Computational complexity; Statistics; Numerical analysis; Other topics in statistics
