# College Math Teaching

## February 18, 2019

### An easy fact about least squares linear regression that I overlooked

The background: I was making notes about the ANOVA table for “least squares” linear regression and reviewing how to derive the “sum of squares” equality:

Total Sum of Squares = Sum of Squares Regression + Sum of Squares Error, or:

If $y_i$ is the observed response, $\bar{y}$ the sample mean of the responses, and $\hat{y}_i$ the responses predicted by the best fit line (simple linear regression here), then: $\sum (y_i - \bar{y})^2 = \sum (\hat{y}_i -\bar{y})^2+ \sum (y_i - \hat{y}_i)^2$ (where each sum runs over the $n$ observations, $\sum_{i=1}^n$).

Now for each $i$ it is easy to see that $(y_i - \bar{y}) = (\hat{y}_i -\bar{y}) + (y_i - \hat{y}_i)$, but the equation still holds when these terms are squared, provided you sum over all $i$!
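A quick numerical check makes this concrete. Here is a minimal sketch (with made-up data, purely for illustration) that fits a simple least squares line with NumPy and confirms that the total sum of squares equals the regression sum of squares plus the error sum of squares:

```python
import numpy as np

# Hypothetical data: a linear trend plus noise, for illustration only.
rng = np.random.default_rng(0)
x = np.arange(10.0)
y = 2.0 + 3.0 * x + rng.normal(size=x.size)

# Least squares fit: np.polyfit returns coefficients highest degree first.
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x
y_bar = y.mean()

ss_total = np.sum((y - y_bar) ** 2)
ss_regression = np.sum((y_hat - y_bar) ** 2)
ss_error = np.sum((y - y_hat) ** 2)

# The identity holds only after summing; the individual squared terms differ.
print(np.isclose(ss_total, ss_regression + ss_error))  # True
```

Note that the decomposition would fail for an arbitrary line through the data; it depends on $\hat{y}_i$ coming from the least squares fit, which is exactly the point of the derivation below.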

And it was going over the derivation of this that reminded me about an important fact about least squares that I had overlooked when I first presented it.

If you go in to the derivation and calculate: $\sum ( (\hat{y}_i -\bar{y}) + (y_i - \hat{y}_i))^2 = \sum ((\hat{y}_i -\bar{y})^2 + (y_i - \hat{y}_i)^2 +2 (\hat{y}_i -\bar{y})(y_i - \hat{y}_i))$

This equals $\sum (\hat{y}_i -\bar{y})^2 + \sum (y_i - \hat{y}_i)^2 + 2\sum (\hat{y}_i -\bar{y})(y_i - \hat{y}_i)$, and the proof is completed by showing that $\sum (\hat{y}_i -\bar{y})(y_i - \hat{y}_i) = \sum \hat{y}_i(y_i - \hat{y}_i) - \sum \bar{y}(y_i - \hat{y}_i)$ and that BOTH of these sums are zero.

But why?

Let’s go back to how the least squares equations were derived:

Given that $\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$, setting $\frac{\partial}{\partial \hat{\beta}_0} \sum (\hat{y}_i -y_i)^2 = 2\sum (\hat{y}_i -y_i) = 0$ yields $\sum (\hat{y}_i -y_i) = 0$. That is, under the least squares equations, the sum of the residuals is zero.

Now $\frac{\partial}{\partial \hat{\beta}_1} \sum (\hat{y}_i -y_i)^2 = 2\sum x_i(\hat{y}_i -y_i) = 0$, which yields $\sum x_i(\hat{y}_i -y_i) = 0$.

That is, the sum of the residuals, weighted by the corresponding $x$ values (inputs), is also zero. Note: this holds for multiple linear regression as well.
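Both facts are easy to verify numerically. The following sketch (again with invented data) fits a least squares line and checks that the residuals sum to zero and that the $x$-weighted residuals sum to zero, up to floating point error:

```python
import numpy as np

# Hypothetical data for illustration.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 5.0, 20)
y = 1.0 - 2.0 * x + rng.normal(size=x.size)

# Least squares fit (coefficients returned highest degree first).
b1, b0 = np.polyfit(x, y, 1)
residuals = y - (b0 + b1 * x)

# The two first-order conditions of least squares:
print(np.isclose(residuals.sum(), 0.0))        # sum of residuals is zero
print(np.isclose((x * residuals).sum(), 0.0))  # x-weighted sum is zero
```

The sign convention ($y_i - \hat{y}_i$ versus $\hat{y}_i - y_i$) does not matter here, since either way the sums are zero.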

Really, that is what the least squares process does: it sets the sum of the residuals and the sum of the weighted residuals equal to zero.

Yes, there is a linear algebra formulation of this.
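In that formulation, with design matrix $X$ whose columns are the all-ones vector and the vector of $x_i$, the normal equations $X^T X \hat{\beta} = X^T y$ say exactly that $X^T(y - X\hat{\beta}) = 0$: the residual vector is orthogonal to each column of $X$, which recovers both zero-sum facts at once. A minimal sketch of this view:

```python
import numpy as np

# Hypothetical data for illustration.
rng = np.random.default_rng(2)
x = np.linspace(-1.0, 1.0, 15)
X = np.column_stack([np.ones_like(x), x])  # design matrix: [1, x_i] rows
y = 0.5 + 4.0 * x + rng.normal(size=x.size)

# Solve the least squares problem min ||y - X beta||^2.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat

# Normal equations: X^T (y - X beta_hat) = 0, so the residual vector is
# orthogonal to the ones column (sum of residuals = 0) and to the x
# column (x-weighted sum of residuals = 0).
print(np.allclose(X.T @ resid, 0.0))  # True
```

This is also why the same argument carries over to multiple regression: each column of the design matrix contributes one orthogonality condition.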

Anyhow, returning to our sum: $\sum \bar{y}(y_i - \hat{y}_i) = \bar{y}\sum(y_i - \hat{y}_i) = 0$. Now for the other term: $\sum \hat{y}_i(y_i - \hat{y}_i) = \sum (\hat{\beta}_0+\hat{\beta}_1 x_i)(y_i - \hat{y}_i) = \hat{\beta}_0\sum (y_i - \hat{y}_i) + \hat{\beta}_1 \sum x_i (y_i - \hat{y}_i)$

Now $\hat{\beta}_0\sum (y_i - \hat{y}_i) = 0$, as it is a constant multiple of the sum of residuals, and $\hat{\beta}_1 \sum x_i (y_i - \hat{y}_i) = 0$, as it is a constant multiple of the sum of residuals weighted by the $x_i$.

That was pretty easy, wasn’t it?

But the role that the basic least squares equations played in this derivation went right over my head!