Tuesday, May 14, 2013

LM fitting


A data set has values yi, each of which has an associated modelled value fi (also written ŷi). The values yi are called the observed values and the modelled values fi are called the predicted (or fitted) values.
The "variability" of the data set is measured through different sums of squares:
SS_\text{tot}=\sum_i (y_i-\bar{y})^2, the total sum of squares (proportional to the sample variance);
SS_\text{reg}=\sum_i (f_i -\bar{y})^2, the regression sum of squares, also called the explained sum of squares;
SS_\text{err}=\sum_i (y_i - f_i)^2, the sum of squares of residuals, also called the residual sum of squares.
In the above \bar{y} is the mean of the observed data:
\bar{y}=\frac{1}{n}\sum_{i=1}^n y_i
where n is the number of observations.
The notations SS_{R} and SS_{E} are best avoided, since in some texts their meanings are reversed: SS_{R} denotes the residual sum of squares and SS_{E} the explained sum of squares.
The most general definition of the coefficient of determination is
R^2 \equiv 1 - {SS_{\rm err}\over SS_{\rm tot}}.\,
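As a quick sketch of the definitions above, the snippet below fits a straight line by least squares and computes SS_tot, SS_reg, SS_err and R² directly. The data values here are made up for illustration; NumPy's polyfit stands in for whatever fitting routine you actually use.

```python
import numpy as np

# Hypothetical data set (illustrative values only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Least-squares straight-line fit: f_i are the predicted values
a, b = np.polyfit(x, y, 1)
f = a * x + b

y_bar = y.mean()                    # mean of the observed data
ss_tot = np.sum((y - y_bar) ** 2)   # total sum of squares
ss_reg = np.sum((f - y_bar) ** 2)   # explained (regression) sum of squares
ss_err = np.sum((y - f) ** 2)       # residual sum of squares

r_squared = 1.0 - ss_err / ss_tot
print(a, b, r_squared)
```

For a linear least-squares fit that includes an intercept, the decomposition SS_tot = SS_reg + SS_err holds exactly, so R² also equals SS_reg/SS_tot; for other models the two expressions can differ, which is why the SS_err form is taken as the general definition.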
