Relationship between the sum of squared errors and the variance

Mayor15 (New member · Joined Jan 12, 2015 · Messages: 2)
Hello,

What is the relationship between the sum of squared errors and the variance? If the sum of squared errors is scaled up by a factor, say 10, does that automatically mean the variance is scaled up by the same factor, 10? If so, what mathematical basis can be used to back it up?

I would appreciate your helpful reply. Many thanks. :)
 
In the discrete case, the variance is the unbiased, normalized sum of squared deviations from the mean [SS]. That is, let u be the mean of the sample {yi, i = 1, 2, 3, ..., N} and \(s^2\) the sample variance. Then
\(\displaystyle s^2 = \frac{SS}{N-1} = \frac{\Sigma (y_i - u)^2}{N-1}\)

So the answer to your question is yes: the variance is SS divided by the fixed constant N − 1, so if SS is scaled up by a factor of 10, the variance is scaled up by the same factor.
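Written out, the scaling argument is a single line:

\(\displaystyle s^2 = \frac{SS}{N-1} \;\Longrightarrow\; \frac{10\,SS}{N-1} = 10\cdot\frac{SS}{N-1} = 10\,s^2\)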

Note that in regression analysis, for example, the deviations \((y_i - u)\) are replaced by deviations from the fitted regression line (the residuals), so SS is then a sum of squared errors. The N − 1 (instead of N) is what makes the estimate unbiased, because one degree of freedom has already been used in estimating the mean; in a regression setting the divisor is adjusted the same way, to N minus the number of fitted parameters.
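A minimal numeric sketch (Python, using a small made-up sample purely for illustration): scaling the data by \(\sqrt{10}\) multiplies every squared deviation, and hence SS, by 10, and the sample variance comes out 10 times larger as well.

```python
import math

def sum_sq(data):
    """SS: sum of squared deviations from the sample mean."""
    mean = sum(data) / len(data)
    return sum((y - mean) ** 2 for y in data)

def sample_variance(data):
    """Unbiased sample variance: SS / (N - 1)."""
    return sum_sq(data) / (len(data) - 1)

y = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # made-up sample, mean = 5
y_scaled = [math.sqrt(10) * v for v in y]     # scales SS by a factor of 10

print(sum_sq(y), sample_variance(y))                # 32.0  4.571...
print(sum_sq(y_scaled), sample_variance(y_scaled))  # ~320.0  ~45.71 (10x each)
```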
 