Regression R^2 (coefficient of determination)

RiskNoob

Active Member
Hi David and BT folks,

In Chapter 4, S&W (Stock & Watson) state the following:

"Finally, the R^2 of the regression of Y on the single regressor X is the square of the correlation coefficient between Y and X."

There was no further explanation of this, but it was not so trivial to me. :confused: There were a couple of questions in the PQ using the above fact. Could you help me clarify this statement?

Thanks,
 

David Harper CFA FRM

Subscriber
Hi RiskNoob,

In a univariate regression (one independent variable), it is always true that R^2 = r^2. Examples will bear this out.

Here is my reconciliation, which requires borrowing ingredients from Gujarati ... I doubt this is the most efficient reconciliation:

let:
xi = X(i) - average X
yi = Y(i) - average Y

OLS regression is such that, since yi = b*xi + ei,
Σ(yi^2) = b^2*Σ(xi^2) + Σ(ei^2); I sourced this step from Gujarati 7.35,
and this implies that the squared slope is given by:
b^2 = [ Σ(yi^2) - Σ(ei^2) ] / Σ(xi^2)

Since b = covariance(x,y)/variance(x) = correlation(x,y)*volatility(y)/volatility(x), we can see that: correlation = r = b*StdDev(x)/StdDev(y)

let's square the correlation: r^2 = [b*StdDev(x)/StdDev(y)]^2 = b^2 * variance(x)/variance(y).
since variance(x) = Σ(xi^2)/n and variance(y) = Σ(yi^2)/n --> variance(x)/variance(y) = Σ(xi^2)/Σ(yi^2)

So r^2 = b^2 * Σ(xi^2)/Σ(yi^2). And now using the b^2 from above:
r^2 = [ Σ(yi^2) - Σ(ei^2) ] / Σ(xi^2) * Σ(xi^2)/Σ(yi^2)
r^2 = [ Σ(yi^2) - Σ(ei^2) ] / Σ(yi^2) = 1 - Σ(ei^2)/Σ(yi^2) = 1 - RSS/TSS = R^2
(Stock and Watson refer to a "sum of squared residuals," SSR, instead of a residual sum of squares, RSS; i.e., SSR and RSS are the same thing)

I'm not sure the proof is useful per se; you can see that I agree with you, in the sense that I cannot find a way to characterize the link as trivial when it comes to a proof ... but we can always trust that correlation = SQRT(R^2), with the sign matching the sign of the slope, if there is only one independent variable. Thanks,
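To see examples bear this out, here is a minimal Python sketch (illustrative simulated data; any sample would do) that compares R^2 = 1 - RSS/TSS against the squared sample correlation:

import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(size=500)
y = 2.0 + 1.5 * x + rng.normal(scale=2.0, size=500)

b, a = np.polyfit(x, y, 1)         # univariate OLS: slope b, intercept a
y_hat = a + b * x

rss = np.sum((y - y_hat) ** 2)     # residual sum of squares (S&W's SSR)
tss = np.sum((y - y.mean()) ** 2)  # total sum of squares
r_squared = 1.0 - rss / tss

r = np.corrcoef(x, y)[0, 1]        # sample correlation between x and y
print(r_squared, r ** 2)           # identical up to floating-point error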
 

RiskNoob

Active Member
Thank you David for the derivation of the correlation in terms of R^2. It makes sense now. Wish there were selection options for Like x 2 or Like x 3! :D
 

Aleksander Hansen

Well-Known Member
Here are some useful exercises (known favorites in job interviews).

A friend of mine was given the following question when interviewing with Société Générale for an IBank position, and was asked to solve it with pen and paper while also explaining what it "meant":
Prove that correlation is bounded within [-1, 1].

Another simple question (that nevertheless a lot of people get wrong):
The R^2 from a linear regression is calculated to be 0.625. How much does the model explain?
 

Aleksander Hansen

Well-Known Member
David Harper CFA FRM said:
[derivation quoted in full above]

For real-life applications, you should also be able to solve the minimization problem using matrix notation, with an intercept and at least one explanatory variable.
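For instance, a minimal numpy sketch (illustrative setup) of that matrix solution: stack an intercept column into the design matrix X, and the minimization of ||y - X*beta||^2 gives the normal equations X'X * beta = X'y:

import numpy as np

rng = np.random.default_rng(7)
x1 = rng.normal(size=200)
y = 0.5 + 2.0 * x1 + rng.normal(size=200)

X = np.column_stack([np.ones_like(x1), x1])  # intercept column + one regressor
beta = np.linalg.solve(X.T @ X, X.T @ y)     # solves X'X beta = X'y
print(beta)                                  # approximately [0.5, 2.0]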
 

RiskNoob

Active Member
Aleksander Hansen said:
[exercises quoted above]

Hi Aleks,

As for the second one, with the intermediate value: it can be said that around 62.5% of the variation of the dependent variable is explained by the independent variable, with the remaining percentage attributed to the error term. I wonder whether I fell into the trap?

As for the first one, I was not able to prove the boundedness (from the basic definition) on my first try. Still trying to polish up my rusty analytic skills (I need to wake up for work in a few hours!). Maybe I will give it another try tomorrow.

RiskNoob
 

RiskNoob

Active Member
Thanks for the big hint, and here is my second try:

Consider the vector space of random variables; it can be shown that E(XY) defines an inner product on it, with the norm of X defined by norm(X) = sqrt(E(X^2)).

Then the Cauchy-Schwarz (CS) inequality says:
| E(XY) | <= sqrt(E(X^2)) * sqrt(E(Y^2))
Specifically, applying it to the demeaned variables:
| E[ (X - mu_x)(Y - mu_y) ] | <= sqrt(E[(X - mu_x)^2]) * sqrt(E[(Y - mu_y)^2]) = stdev(X)*stdev(Y)

Thus,

|ρ| = |cov(X, Y)| / [stdev(X)*stdev(Y)] = | E[ (X - mu_x)(Y - mu_y) ] | / [stdev(X)*stdev(Y)]
<= [stdev(X)*stdev(Y)] / [stdev(X)*stdev(Y)] = 1
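One more step completes the "explain what it means" part: equality in Cauchy-Schwarz holds exactly when the two vectors are proportional, i.e.

\[
|\rho| = 1 \iff Y - \mu_Y = c\,(X - \mu_X) \text{ for some constant } c \neq 0,
\]

so |ρ| = 1 means Y is an exact linear function of X, with the sign of ρ given by the sign of c.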

I searched Wikipedia to review some linear algebra definitions; it was a nice exercise to polish up on them. Thanks Aleks. :)
 