
Software:

The method of LEAST SQUARES is used.

Linear Regression

To illustrate, if z = A_{0} + A_{1}*x + A_{2}*y is to be fitted to N data points, the normal equations are

A_{0}*N    + A_{1}*Σ x   + A_{2}*Σ y   = Σ z
A_{0}*Σ x  + A_{1}*Σ x*x + A_{2}*Σ x*y = Σ x*z
A_{0}*Σ y  + A_{1}*Σ y*x + A_{2}*Σ y*y = Σ y*z
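For concreteness, these normal equations can be assembled from the data sums and solved with elementary Gaussian elimination. The following Python sketch is illustrative only (the function names and the sample data are not part of the original description):

```python
# Fit z = a0 + a1*x + a2*y by least squares via the normal equations above.

def solve3(M, v):
    """Solve a 3x3 linear system M*a = v by Gaussian elimination with partial pivoting."""
    n = 3
    A = [row[:] + [v[i]] for i, row in enumerate(M)]  # augmented matrix
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(A[r][col]))  # pivot row
        A[col], A[p] = A[p], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    a = [0.0] * n
    for i in reversed(range(n)):  # back substitution
        a[i] = (A[i][n] - sum(A[i][j] * a[j] for j in range(i + 1, n))) / A[i][i]
    return a

def fit_plane(xs, ys, zs):
    """Build the normal equations for z = a0 + a1*x + a2*y and solve them."""
    N = len(xs)
    Sx, Sy, Sz = sum(xs), sum(ys), sum(zs)
    Sxx = sum(x * x for x in xs)
    Sxy = sum(x * y for x, y in zip(xs, ys))
    Syy = sum(y * y for y in ys)
    Sxz = sum(x * z for x, z in zip(xs, zs))
    Syz = sum(y * z for y, z in zip(ys, zs))
    M = [[N,  Sx,  Sy],
         [Sx, Sxx, Sxy],
         [Sy, Sxy, Syy]]
    return solve3(M, [Sz, Sxz, Syz])

# Sample data generated from z = 1 + 2x + 3y, so the fit should recover (1, 2, 3).
xs = [0, 1, 2, 3, 1]
ys = [0, 1, 0, 2, 3]
zs = [1 + 2 * x + 3 * y for x, y in zip(xs, ys)]
a0, a1, a2 = fit_plane(xs, ys, zs)
```

Any standard linear solver could replace `solve3`; it is written out here only to keep the sketch self-contained.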
Solving the above system gives the values of the coefficients A_{0}, A_{1} and A_{2}. The method may be generalized to any number of variables or parameters. The correlation coefficient is computed with
R = SQR( Σ [f_{estimated} - f_{mean}]^{2} / Σ [f - f_{mean}]^{2} )

The standard error of estimate is calculated using
S = SQR( Σ [f - f_{estimated}]^{2} / N )

Summations are over all N data points, f is the dependent variable, and f_{estimated} is found using the regression results.

Curvilinear Regression

If we wish to fit the curve

F(W) = A_{0}*G(x,y,z) + A_{1}*H(x,y,z) + A_{2}*J(x,y,z)

to N data points, then the normal equations are

A_{0}*Σ G*G + A_{1}*Σ G*H + A_{2}*Σ G*J = Σ G*F
A_{0}*Σ H*G + A_{1}*Σ H*H + A_{2}*Σ H*J = Σ H*F
A_{0}*Σ J*G + A_{1}*Σ J*H + A_{2}*Σ J*J = Σ J*F

where the summations are over all N data points. Solving the above system gives the values of the coefficients A_{0}, A_{1} and A_{2}. The method may be generalized to any number of variables or parameters. The correlation coefficient and the standard error of estimate are calculated as in LINEAR REGRESSION.

General Regression

To illustrate the general method, let's say we want to fit a curve with three parameters, A, B and C, to N data points, that is, the curve F(A,B,C,x), where x stands for any number of variables. If A_{0}, B_{0} and C_{0} are sufficiently close approximations to A, B and C, then we can use Taylor's theorem to put

F(A,B,C,x) ≈ F(A_{0},B_{0},C_{0},x) + (A-A_{0})*Fa + (B-B_{0})*Fb + (C-C_{0})*Fc

where we ignored terms of order higher than the first and where
Fa = ∂F/∂A ,  Fb = ∂F/∂B ,  Fc = ∂F/∂C

evaluated at (A_{0},B_{0},C_{0}). Now, let's call the residuals

R = F(A_{0},B_{0},C_{0},x) - F(A,B,C,x)

If we minimize the residuals using the method of least squares, we obtain the normal equations

(A-A_{0})*Σ Fa*Fa + (B-B_{0})*Σ Fa*Fb + (C-C_{0})*Σ Fa*Fc = - Σ Fa*R
(A-A_{0})*Σ Fb*Fa + (B-B_{0})*Σ Fb*Fb + (C-C_{0})*Σ Fb*Fc = - Σ Fb*R
(A-A_{0})*Σ Fc*Fa + (B-B_{0})*Σ Fc*Fb + (C-C_{0})*Σ Fc*Fc = - Σ Fc*R

where Fa, Fb and Fc are as defined before and the summations are over all N data points. If we solve this last set of linear equations for (A-A_{0}), (B-B_{0}) and (C-C_{0}), we obtain corrections to the initial approximation; that is, if the solutions to the system are DA, DB and DC, where

A-A_{0} = DA ;  B-B_{0} = DB ;  C-C_{0} = DC

then a closer approximation to the solution is

A_{1} = A_{0}+DA ;  B_{1} = B_{0}+DB ;  C_{1} = C_{0}+DC

We may now use A_{1}, B_{1} and C_{1} to again set up a system of linear equations and solve it to obtain A_{2}, B_{2} and C_{2}. This is repeated until all parameters in the current approximation differ from those of the previous approximation by less than their specified error and the change in R² is less than its specified error. Of course, the method may be generalized to any number of variables or parameters. The correlation coefficient and the standard error of estimate are calculated as in LINEAR REGRESSION.
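The iteration described above (a Gauss-Newton scheme) can be sketched in Python for an assumed two-parameter model F(A,B,x) = A*exp(B*x), whose partials Fa = exp(B*x) and Fb = A*x*exp(B*x) are known in closed form. The model, the data, and the function name are illustrative assumptions, not part of the original description:

```python
# Gauss-Newton iteration for the general regression scheme, illustrated on
# the assumed model F(A, B, x) = A*exp(B*x) with two parameters.
import math

def gauss_newton(xs, fs, A, B, tol=1e-10, max_iter=50):
    """Refine (A, B) until both corrections DA, DB fall below tol."""
    for _ in range(max_iter):
        # Residuals R = F(A0,B0,x) - f and the partials Fa, Fb at (A0, B0).
        R  = [A * math.exp(B * x) - f for x, f in zip(xs, fs)]
        Fa = [math.exp(B * x) for x in xs]
        Fb = [A * x * math.exp(B * x) for x in xs]
        # Normal equations for the corrections DA, DB (2x2 system).
        Saa = sum(a * a for a in Fa)
        Sab = sum(a * b for a, b in zip(Fa, Fb))
        Sbb = sum(b * b for b in Fb)
        ra = -sum(a * r for a, r in zip(Fa, R))
        rb = -sum(b * r for b, r in zip(Fb, R))
        det = Saa * Sbb - Sab * Sab
        DA = (ra * Sbb - Sab * rb) / det   # Cramer's rule
        DB = (Saa * rb - ra * Sab) / det
        A, B = A + DA, B + DB              # closer approximation
        if abs(DA) < tol and abs(DB) < tol:
            break
    return A, B

# Sample data generated from F = 2*exp(0.5*x); starting from a rough guess,
# the iteration should converge to A = 2, B = 0.5.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
fs = [2.0 * math.exp(0.5 * x) for x in xs]
A, B = gauss_newton(xs, fs, A=1.5, B=0.3)
```

Note that convergence depends on the starting values being, as the text says, "sufficiently close"; production code would typically add damping (as in Levenberg-Marquardt) to widen the basin of convergence.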

Copyright © 2001-2010 Numerical Mathematics. All rights reserved.
