ESS

If you have, say, 20 observations in your regression, ESS is the sum of twenty squared differences. Each squared difference is the square of the difference between the regression line (the "predicted" Y) and the average Y (always a flat line). So, ESS = sum of [predicted Y - average Y]^2. A silly extreme: if the regression line is flat, then X explains Y not at all and ESS = 0.
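Here is a minimal numerical sketch of that sum (the made-up data and the use of numpy's polyfit are my own illustration, not Gujarati's):

```python
import numpy as np

# 20 made-up observations for illustration
rng = np.random.default_rng(0)
x = np.arange(20, dtype=float)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, size=20)

# Fit the regression line, then compute the predicted Y for each observation
slope, intercept = np.polyfit(x, y, deg=1)
y_hat = intercept + slope * x

# ESS: the sum of twenty squared differences, predicted Y minus average Y
ess = np.sum((y_hat - y.mean()) ** 2)
print(ess)  # a flat fitted line (slope = 0) would give y_hat = average Y, so ESS = 0
```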

Each of ESS, RSS, and TSS is the summation of n items, where n is the number of observations, and the "items" are squared differences. TSS is just the combination of ESS and RSS (see Gujarati p. 186). It is best to visualize the vertical line from the observed Y to the average Y: that full line is the TSS component. The regression line then serves to "slice" the TSS into two pieces, an ESS piece and an RSS piece. These are the ingredients in the ANOVA table.
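Using the same made-up data as above, you can see the slicing numerically, i.e., that TSS = ESS + RSS (again my own sketch, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(20, dtype=float)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, size=20)

slope, intercept = np.polyfit(x, y, deg=1)
y_hat = intercept + slope * x

tss = np.sum((y - y.mean()) ** 2)      # observed Y to average Y (the full vertical line)
ess = np.sum((y_hat - y.mean()) ** 2)  # predicted Y to average Y
rss = np.sum((y - y_hat) ** 2)         # observed Y to predicted Y

# The regression line slices TSS into an ESS piece and an RSS piece
assert np.isclose(tss, ess + rss)
print(ess / tss)  # and R^2 = ESS/TSS
```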

David
 
Oh, I see. That would be hard to intuit: the differences are in the small x, y (Gujarati's deviation-from-the-mean form), and the squares are "hidden" in there via the params b1, b2. But as 8.34 is a manipulation to get to R^2 via ESS, I don't immediately see the intuition. Frankly, this expression of ESS will *not* be tested... I advise you to focus on the more basic relationship between ESS, RSS, and TSS. In fact, 8.34 to 8.36 are far less important than what surrounds them (Gujarati is here merely "on the way" to showing the math for multiple R^2)...David
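For anyone who wants to see it anyway, here is a sketch of what I take 8.34 to be doing, assuming it is the deviation-form ESS for the two-regressor model, ESS = b2*sum(y*x2) + b3*sum(y*x3), with lowercase letters as deviations from the means (my reconstruction; please check it against the text):

```python
import numpy as np

# Made-up data for a two-regressor model
rng = np.random.default_rng(1)
n = 20
X2 = rng.normal(size=n)
X3 = rng.normal(size=n)
Y = 1.0 + 2.0 * X2 - 1.5 * X3 + rng.normal(0.0, 1.0, size=n)

# OLS fit with an intercept (b1 = intercept, b2 and b3 = slopes)
X = np.column_stack([np.ones(n), X2, X3])
b1, b2, b3 = np.linalg.lstsq(X, Y, rcond=None)[0]
Y_hat = X @ np.array([b1, b2, b3])

# The "small" letters: deviations from the means
y = Y - Y.mean()
x2 = X2 - X2.mean()
x3 = X3 - X3.mean()

# ESS two ways: directly, and via the deviation-form identity
ess_direct = np.sum((Y_hat - Y.mean()) ** 2)
ess_deviation_form = b2 * np.sum(y * x2) + b3 * np.sum(y * x3)
assert np.isclose(ess_direct, ess_deviation_form)
```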
 