Least-squares polynomial fitting, or polynomial regression, is a method of fitting a polynomial to a set of data. Suppose we have a set of n points (x_i, y_i) to which we wish to fit a polynomial of the form

    y(x) = a_0 + a_1 x + a_2 x^2 + \cdots + a_m x^m.    (A.1)

Passing a polynomial through a set of data means selecting the coefficients so as to minimize, in a global sense, the distance between the fitted values y(x_i) and the data values y_i. This is done through the least-squares method by first writing the "distance" function

    S = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m \right)^2.    (A.2)

To minimize this function we calculate the partial derivative with respect to each unknown coefficient and set it to zero. For the kth coefficient (k = 0, 1, 2, ..., m) we write

    \frac{\partial S}{\partial a_k} = -2 \sum_{i=1}^{n} x_i^k \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m \right) = 0,    (A.3)

or

    \sum_{i=1}^{n} x_i^k \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m \right) = 0.    (A.4)

Repeating this for all m + 1 coefficients results in m + 1 equations from which the coefficients a_0 through a_m can be evaluated. We show here how to derive the coefficients for a first-order (linear) and a second-order (quadratic) polynomial least-squares fit, since these are the most commonly used forms. We assume n data points (x_i, y_i) as above.
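The procedure above can be sketched in Python for the two cases named here. This is a minimal illustration, not code from the text: the function names fit_line and fit_quadratic are assumptions. For m = 1 the equations (A.4) reduce to the familiar closed-form slope and intercept formulas; for m = 2 they form a 3x3 linear system in the power sums of the x_i, solved below by Gaussian elimination.

```python
def fit_line(xs, ys):
    """Return (a0, a1) minimizing S in (A.2) for y(x) = a0 + a1*x."""
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # Closed-form solution of the two normal equations (A.4) for k = 0, 1.
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = (sy - a1 * sx) / n
    return a0, a1


def fit_quadratic(xs, ys):
    """Return (a0, a1, a2) by solving the 3x3 normal equations (A.4)."""
    # Power sums S_k = sum(x_i^k) for k = 0..4 and moments T_k = sum(x_i^k * y_i).
    S = [sum(x ** k for x in xs) for k in range(5)]
    T = [sum((x ** k) * y for x, y in zip(xs, ys)) for k in range(3)]
    # Augmented matrix of the normal equations:
    #   [S0 S1 S2 | T0]
    #   [S1 S2 S3 | T1]
    #   [S2 S3 S4 | T2]
    A = [[S[0], S[1], S[2], T[0]],
         [S[1], S[2], S[3], T[1]],
         [S[2], S[3], S[4], T[2]]]
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 4):
                A[r][c] -= f * A[col][c]
    # Back substitution.
    a = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        a[r] = (A[r][3] - sum(A[r][c] * a[c] for c in range(r + 1, 3))) / A[r][r]
    return tuple(a)
```

For example, fitting the points generated by y = 1 + 2x recovers a0 = 1, a1 = 2 exactly, and fitting points from y = 1 - x + 2x^2 recovers (1, -1, 2) up to rounding, since a polynomial that passes through all the data drives S in (A.2) to zero.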