Chebyshev polynomials are a set of orthogonal polynomials that are solutions
of a special kind of Sturm-Liouville differential equation called the Chebyshev
differential equation.
The equation is

$(1 - x^{2})\,\frac{d^{2}y}{dx^{2}} - x\,\frac{dy}{dx} + n^{2} y = 0.$
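As a quick illustration (not part of Isight), the sketch below uses NumPy's Chebyshev utilities to check numerically that a polynomial of the first kind satisfies this differential equation.

```python
# Numerical check that T_n satisfies (1 - x^2) y'' - x y' + n^2 y = 0.
import numpy as np
from numpy.polynomial import chebyshev as C

n = 5
x = np.linspace(-0.9, 0.9, 7)           # a few interior test points

c = np.zeros(n + 1)
c[n] = 1.0                              # coefficient vector selecting T_n

y = C.chebval(x, c)
dy = C.chebval(x, C.chebder(c, 1))
d2y = C.chebval(x, C.chebder(c, 2))

residual = (1 - x**2) * d2y - x * dy + n**2 * y
print(np.max(np.abs(residual)))         # ~1e-12, i.e. zero up to round-off
```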
Chebyshev polynomials can be of two kinds. In one dimension these
polynomials are defined as follows:

Polynomials of the first kind:
$T_{n}(x) = \cos\big(n \arccos x\big), \qquad n = 0, 1, 2, \ldots$

Polynomials of the second kind:
$U_{n}(x) = \dfrac{\sin\big((n + 1)\arccos x\big)}{\sin\big(\arccos x\big)}, \qquad n = 0, 1, 2, \ldots$
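For reference, a minimal sketch of the standard three-term recurrences that generate both kinds (this is textbook material, not Isight code):

```python
# T_0 = 1, T_1 = x,  T_{k+1} = 2x T_k - T_{k-1}   (first kind)
# U_0 = 1, U_1 = 2x, U_{k+1} = 2x U_k - U_{k-1}   (second kind)
import numpy as np

def chebyshev_polynomials(n, first_kind=True):
    x = np.poly1d([1.0, 0.0])                    # the monomial x
    polys = [np.poly1d([1.0]), x if first_kind else 2 * x]
    for _ in range(2, n + 1):
        polys.append(2 * x * polys[-1] - polys[-2])
    return polys[: n + 1]

print(chebyshev_polynomials(3, first_kind=True)[3])    # 4 x^3 - 3 x
print(chebyshev_polynomials(3, first_kind=False)[3])   # 8 x^3 - 4 x
```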
The roots of these polynomials are not equally spaced. Taguchi describes
a set of one-dimensional polynomials, which he calls Chebyshev, that
have equally spaced roots. When these equally spaced roots are assumed
to be the factor levels in an orthogonal array, a quadrature procedure
is available for approximating a response using Chebyshev polynomials
as individual terms.
In general, the quadrature method of fitting an approximation is more
efficient and stable than a regression-based approach. However,
the quadrature approach dictates that the function being approximated
be evaluated at pre-defined locations. For Chebyshev polynomials, these
positions correspond exactly to a sample obtained using an orthogonal
array.
The following equations show the first two Chebyshev polynomials with equally
spaced roots in one dimension:

$T_{1}(x) = x - \bar{x},$
$T_{2}(x) = (x - \bar{x})^{2} - \dfrac{k^{2} - 1}{12}\,h^{2},$

where $\bar{x}$ is the average value of the levels, $k$ is the number of levels,
and $h$ is the spacing between levels. Taguchi generates
multivariate polynomials by taking products of Chebyshev polynomials
in each variable as listed above. Taguchi also provides tables for computing
the coefficients of these terms for an orthogonal array.
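The sketch below (assuming the equally spaced-root forms of T1 and T2 given above; the normalization in Taguchi's published tables may differ) shows the property the quadrature method relies on: the terms are orthogonal over the equally spaced levels, so each coefficient reduces to a simple sum over the array points.

```python
import numpy as np

levels = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # k = 5 equally spaced levels
k, h = len(levels), levels[1] - levels[0]
xbar = levels.mean()

T1 = levels - xbar
T2 = (levels - xbar) ** 2 - (k**2 - 1) * h**2 / 12.0

print(T1.sum(), T2.sum(), np.dot(T1, T2))    # all zero: mutually orthogonal

# Because of this orthogonality, fitting a response sampled at the levels
# needs no matrix inversion: each coefficient is an inner product over the
# points divided by a norm (essentially the sums a coefficient table encodes).
y = 3.0 + 0.5 * T1 + 0.02 * T2               # synthetic response at the levels
a1 = np.dot(T1, y) / np.dot(T1, T1)          # recovers 0.5
a2 = np.dot(T2, y) / np.dot(T2, T2)          # recovers 0.02
print(a1, a2)
```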
For example, suppose we have three variables $x_1$, $x_2$, and $x_3$ to which we
want to fit a response $y$. We can generate a multivariate polynomial basis from
the one-dimensional terms in each variable, $T_1(x_1), T_2(x_1), \ldots$, together
with cross products of the form $T_1(x_1)\,T_1(x_2)$. Therefore, the function is
approximated as a linear combination of these basis terms,

$\hat{y}(x_1, x_2, x_3) = a_0 + a_1 T_1(x_1) + a_2 T_2(x_1) + \cdots + a_m T_1(x_2)\,T_1(x_3),$

where the $a_i$ are the coefficients to be determined.
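A hypothetical sketch of this construction (the term selection and fitting below are illustrative, not Isight's internal implementation): build the product basis for three variables on a three-level full factorial sample and fit the coefficients by least squares.

```python
import itertools
import numpy as np

def one_dim_terms(x, xbar, k, h):
    """First- and second-order equally spaced-root terms for one variable."""
    return [x - xbar, (x - xbar) ** 2 - (k**2 - 1) * h**2 / 12.0]

def build_basis(X, level_info):
    """Columns: constant, per-variable T1 and T2, and first-order cross products."""
    n, m = X.shape
    cols, firsts = [np.ones(n)], []
    for j in range(m):
        xbar, k, h = level_info[j]
        t1, t2 = one_dim_terms(X[:, j], xbar, k, h)
        cols += [t1, t2]
        firsts.append(t1)
    for a in range(m):
        for b in range(a + 1, m):
            cols.append(firsts[a] * firsts[b])   # e.g. T1(x1) * T1(x2)
    return np.column_stack(cols)

# Three variables, each at levels 1, 2, 3 (a 3^3 full factorial sample).
levels = [1.0, 2.0, 3.0]
X = np.array(list(itertools.product(levels, repeat=3)))
level_info = [(2.0, 3, 1.0)] * 3                 # (xbar, k, h) per variable

# Synthetic response: constant + linear in x1 + an x1*x2 interaction.
y = 1.0 + 2.0 * (X[:, 0] - 2.0) + 0.5 * (X[:, 0] - 2.0) * (X[:, 1] - 2.0)

A = build_basis(X, level_info)
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coeffs, 6))   # 1 and 2 for the constant/x1 terms, 0.5 for x1*x2
```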
Isight uses Taguchi’s tables to calculate the coefficients for orthogonal array
sampling and uses least squares regression to calculate the coefficients for all
other sampling techniques. A term-by-term ANOVA can also be computed
for a Chebyshev polynomial approximation when orthogonal array sampling
is used.
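A sketch of one way such a term-by-term decomposition can be formed when, as with orthogonal array sampling, the basis columns are orthogonal over the sample points (this is a generic sum-of-squares attribution, not necessarily the exact quantities Isight reports):

```python
import numpy as np

def term_sums_of_squares(A, y):
    """A: basis matrix whose first column is all ones and whose columns are
    mutually orthogonal over the sample; returns one SS value per non-constant term."""
    ss = []
    for j in range(1, A.shape[1]):               # skip the constant column
        col = A[:, j]
        coeff = np.dot(col, y) / np.dot(col, col)
        ss.append(coeff**2 * np.dot(col, col))   # variation attributed to this term
    return np.array(ss)

# Usage with A and y from the previous sketch:
# contributions = term_sums_of_squares(A, y)
# print(contributions / contributions.sum())     # fractional contribution per term
```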
Note:
All points in an orthogonal array have to be available to use Taguchi’s method.
When cross-validation is used, the sample set for fitting the approximation no
longer contains all the points, so the error value obtained from cross-validation
is not meaningful for orthogonal arrays. Therefore, Isight uses the regression
approach to build the approximations for cross-validation, whereas it uses
Taguchi’s approach for the model built with all the points.
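A minimal leave-one-out sketch consistent with this note (names and structure are illustrative, not Isight's internals): each fold refits by least squares on the remaining points, since the full orthogonal array is no longer available inside the loop.

```python
import numpy as np

def loo_cv_rmse(A, y):
    """Leave-one-out cross-validation error for a linear-in-coefficients basis A."""
    n = A.shape[0]
    errors = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        coeffs, *_ = np.linalg.lstsq(A[keep], y[keep], rcond=None)  # regression refit
        errors[i] = y[i] - A[i] @ coeffs
    return np.sqrt(np.mean(errors**2))

# Usage with the basis and response from the earlier sketch:
# print(loo_cv_rmse(A, y))
```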