Chebyshev Polynomial Curve fit

kepler
New member · Joined: Jul 14, 2015 · Messages: 26
Good evening,

Can someone please give an idea - a practical and numerical example would be great - on how to do a Chebyshev curve fit to a set of data points (x, y)?

I found much information on how these polynomials are calculated, but I can't seem to find an algorithm to apply the best fit. I don't want to fit a known function, just a set of data (with x between 0 and 1).

Any help will be appreciated.

Regards,

Kepler
 
The general idea of linear least squares would work well. "Linear" refers to the coefficients, not to the functions being used for the fit. For one-dimensional data, i.e. fitting a set of points \((x_j, y_j),\ j = 1, 2, 3, \ldots, m\), where x and y are scalars, we have: given a set of functions \(f_i(x)\) and a function
\(\displaystyle y(x)\, =\, a_n f_n(x)\, +\, a_{n-1} f_{n-1}(x)\, +\, a_{n-2} f_{n-2}(x)\, +\, \ldots\, +\, a_0 f_0(x)\)
assume the measurements \(y_j\) are given by
\(y_j\, =\, y(x_j)\, +\, e_j\)
where the \(e_j\) represent 'measurement errors'. If we define a function
\(\displaystyle E\, =\, E(a_n, a_{n-1}, a_{n-2}, \ldots, a_0)\, =\, \Sigma_{j=1}^{j=m}\, e_j^2\, =\, \Sigma_{j=1}^{j=m}\, [a_n f_n(x_j)\, +\, a_{n-1} f_{n-1}(x_j)\, +\, a_{n-2} f_{n-2}(x_j)\, +\, \ldots\, +\, a_0 f_0(x_j)\, -\, y_j]^2\)
we can now minimize E with respect to \(a_n, a_{n-1}, a_{n-2}, \ldots, a_0\) to get a system of equations which can be solved in the usual manner(s). That is, set
\(\displaystyle \frac{\partial\, E}{\partial\, a_i}\, =\, 2\, \Sigma_{j=1}^{j=m}\, f_i(x_j)\, e_j\, =\, 0;\quad i = 0, 1, 2, \ldots, n\)
to obtain n + 1 equations in the n + 1 unknowns.

In the normal polynomial fit, \(f_i(x) = x^i\). In your case the \(f_i\) would be the \(i\)th Chebyshev polynomial, \(T_i(x)\).

You might look at
http://www.extremeoptimization.com/Samples/CurveFitting.aspx
and/or do a web search on "Chebyshev regression analysis".

Oh, and the extension to more dimensions is easy.
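The derivation above maps directly onto code. A minimal sketch in Python/NumPy (the function names and sample data here are mine, for illustration): evaluate the Chebyshev basis by its recurrence, build the design matrix A whose columns are the T_i at the data points, and solve the normal equations A^T A a = A^T y.

```python
import numpy as np

def cheb_basis(x, n):
    """Evaluate Chebyshev polynomials T_0..T_n at points x via the recurrence
    T_k(x) = 2x T_{k-1}(x) - T_{k-2}(x)."""
    x = np.asarray(x, dtype=float)
    T = np.empty((len(x), n + 1))
    T[:, 0] = 1.0
    if n >= 1:
        T[:, 1] = x
    for k in range(2, n + 1):
        T[:, k] = 2.0 * x * T[:, k - 1] - T[:, k - 2]
    return T

def cheb_lsq_fit(x, y, n):
    """Least-squares coefficients a_0..a_n for y(x) = sum_i a_i T_i(x)."""
    A = cheb_basis(x, n)              # design matrix, columns are T_i(x_j)
    # Normal equations (A^T A) a = A^T y, as in the derivation above
    return np.linalg.solve(A.T @ A, A.T @ y)

# Illustrative data: exact samples of 2x^2 - 1 = T_2(x) on [0, 1],
# so the fit should recover a = (0, 0, 1)
x = np.linspace(0.0, 1.0, 9)
y = 2.0 * x**2 - 1.0
a = cheb_lsq_fit(x, y, 2)
```

NumPy also ships `numpy.polynomial.chebyshev.chebfit(x, y, deg)`, which performs the same least-squares fit in one call (and more stably, via `lstsq` rather than the normal equations).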
 
Hi Ishuda,

Sorry for the inconvenience...

That means, for instance, that instead of computing:

- the sum of x^2 in the matrix, I compute the sum of (2 * x^2 - 1) - the T(2,x) Chebyshev polynomial?
- the sum of x, I compute the sum of x?
- the sum of x^i, I compute the sum of T(i,x)?

And where does the T(0,x) = 1 polynomial fit in the equation?

Sorry for the trouble...

Kind regards,

Kepler
 
I think you have the idea. Suppose you had two points, (x1, y1) and (x2, y2) and decided to do a fit to the first two Chebyshev polynomials, T0(x) and T1(x), i.e.
y(x) = a T0(x) + b T1(x)
Then your matrix equation would look like
\(\displaystyle \begin{pmatrix}
T_0^2(x_1) + T_0^2(x_2)&&T_0(x_1)T_1(x_1) + T_0(x_2)T_1(x_2)\\
T_0(x_1)T_1(x_1) + T_0(x_2)T_1(x_2)&& T_1^2(x_1) + T_1^2(x_2)
\end{pmatrix}
\begin{pmatrix} a\\b \end{pmatrix}
\, =\,
\begin{pmatrix}
T_0(x_1) y_1 + T_0(x_2) y_2\\
T_1(x_1) y_1 + T_1(x_2) y_2
\end{pmatrix}\)

You would then solve for a and b.
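As a numerical check of that 2×2 system, here is a sketch in Python/NumPy (the two data points are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical data points (x1, y1), (x2, y2)
x = np.array([0.25, 0.75])
y = np.array([1.0, 2.0])

T0 = np.ones_like(x)   # T_0(x) = 1
T1 = x                 # T_1(x) = x

# The 2x2 normal-equations matrix and right-hand side from the post above
M = np.array([[T0 @ T0, T0 @ T1],
              [T0 @ T1, T1 @ T1]])
rhs = np.array([T0 @ y, T1 @ y])

a, b = np.linalg.solve(M, rhs)
# With two points and two basis functions the fit interpolates:
# a*T0(x_j) + b*T1(x_j) reproduces each y_j exactly
```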
 
Hi Ishuda,

Thanks again. I don't know if I've figured the system out... For a set of 3 points, are these the equations?

\(\displaystyle \begin{pmatrix}
T_0^2(x_1) + T_0^2(x_2) + T_0^2(x_3)&&T_0(x_1)T_1(x_1) + T_0(x_2)T_1(x_2) + T_0(x_3)T_1(x_3)&&T_0(x_1)T_2(x_1) + T_0(x_2)T_2(x_2) + T_0(x_3)T_2(x_3)\\
T_0(x_1)T_1(x_1) + T_0(x_2)T_1(x_2) + T_0(x_3)T_1(x_3)&&T_1^2(x_1) + T_1^2(x_2) + T_1^2(x_3)&&T_1(x_1)T_2(x_1) + T_1(x_2)T_2(x_2) + T_1(x_3)T_2(x_3)\\
T_0(x_1)T_2(x_1) + T_0(x_2)T_2(x_2) + T_0(x_3)T_2(x_3)&&T_1(x_1)T_2(x_1) + T_1(x_2)T_2(x_2) + T_1(x_3)T_2(x_3)&&T_2^2(x_1) + T_2^2(x_2) + T_2^2(x_3)
\end{pmatrix}
\begin{pmatrix} a\\b\\c \end{pmatrix}
\, =\,
\begin{pmatrix}
T_0(x_1) y_1 + T_0(x_2) y_2 + T_0(x_3) y_3\\
T_1(x_1) y_1 + T_1(x_2) y_2 + T_1(x_3) y_3\\
T_2(x_1) y_1 + T_2(x_2) y_2 + T_2(x_3) y_3
\end{pmatrix}\)

Regards,

Kepler
 
Hi,

I've generalized and got an algorithm. For 5 points (n = 5), for instance, I get:

Code:
[ SUM[T0.T0]  SUM[T0.T1]  SUM[T0.T2]  SUM[T0.T3]  SUM[T0.T4] ] [ C(0) ]   [ SUM[y.T0] ]
[ SUM[T1.T0]  SUM[T1.T1]  SUM[T1.T2]  SUM[T1.T3]  SUM[T1.T4] ] [ C(1) ]   [ SUM[y.T1] ]
[ SUM[T2.T0]  SUM[T2.T1]  SUM[T2.T2]  SUM[T2.T3]  SUM[T2.T4] ] [ C(2) ] = [ SUM[y.T2] ]
[ SUM[T3.T0]  SUM[T3.T1]  SUM[T3.T2]  SUM[T3.T3]  SUM[T3.T4] ] [ C(3) ]   [ SUM[y.T3] ]
[ SUM[T4.T0]  SUM[T4.T1]  SUM[T4.T2]  SUM[T4.T3]  SUM[T4.T4] ] [ C(4) ]   [ SUM[y.T4] ]

Here Ti stands for Ti(x), the ith Chebyshev polynomial, and each SUM runs over the data points. The coefficients C go from C(0) to C(n-1).
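That table can be built and solved directly. A sketch in Python/NumPy (function names and sample data are mine, for illustration):

```python
import numpy as np

def T(k, x):
    """k-th Chebyshev polynomial via T_k(x) = 2x T_{k-1}(x) - T_{k-2}(x)."""
    if k == 0:
        return np.ones_like(x)
    if k == 1:
        return x
    return 2.0 * x * T(k - 1, x) - T(k - 2, x)

def chebyshev_normal_equations(x, y, n):
    """Build and solve the n x n system from the table: returns C(0)..C(n-1)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    M = np.array([[np.sum(T(i, x) * T(j, x)) for j in range(n)]
                  for i in range(n)])
    rhs = np.array([np.sum(y * T(i, x)) for i in range(n)])
    return np.linalg.solve(M, rhs)

# 5 points and coefficients C(0)..C(4), as in the post (data made up)
x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y = np.array([1.0, 1.2, 0.9, 1.5, 2.0])
C = chebyshev_normal_equations(x, y, 5)
```

Note that with 5 points and 5 coefficients the "fit" passes through every point exactly; for a genuine least-squares fit you would use fewer polynomials than data points.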

Is this right?

Regards,

Kepler
 