Coefficient of Correlation in Multiple Regression Analysis?

Shazam023

New Member
@David Harper CFA FRM CIPM
In one of your videos on Multiple Regression Analysis, you said that, unlike in univariate regression analysis where (rho) = √R^2, this does not hold in multiple regression analysis.
Can you explain the reasoning behind this?
 

Dr. Jayanthi Sankaran

Well-Known Member
Hi @Shazam023,

I am not sure that I understand your question correctly. Nevertheless, I will give it a shot:

In Multiple Regression Analysis

(1) An increase in R^2 does not necessarily mean that an added variable is statistically significant: to ascertain whether an added variable is statistically significant, you need to perform a hypothesis test using the t-statistic.
(2) A high R^2 does not mean that the regressors are a true cause of the dependent variable: the relationship may not be causal.
(3) A high R^2 does not mean that there is no omitted variable bias: omitted variable bias can occur in regressions with a low, moderate, or high R^2. Conversely, a low R^2 does not imply that there is necessarily an omitted variable bias.
(4) A high R^2 does not necessarily mean you have the most appropriate set of regressors, nor does a low R^2 necessarily mean you have an inappropriate set of regressors: decisions about the regressors must weigh issues of omitted variable bias, data availability, data quality, and, most importantly, economic theory and the nature of the substantive questions being addressed.
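To illustrate point (1) mechanically, here is a minimal NumPy sketch (the data-generating process and sample size are illustrative assumptions, not from the thread): adding a regressor of pure noise can never lower R^2, which is why an increase in R^2 by itself says nothing about statistical significance.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
y = 2.0 * x1 + rng.normal(size=n)        # y truly depends only on x1
noise = rng.normal(size=n)               # pure noise, unrelated to y

def r_squared(X, y):
    """R^2 from an OLS fit of y on X (intercept included)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_one = r_squared(x1, y)                              # x1 only
r2_two = r_squared(np.column_stack([x1, noise]), y)    # x1 plus noise

# R^2 can only rise (or stay flat) when a regressor is added,
# even though the added regressor is junk.
assert r2_two >= r2_one
```

The t-test on the added coefficient, not the change in R^2, is what tells you whether the new regressor belongs in the model.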

Hope that helps:)
Thanks!
Jayanthi
 

Deepak Chitnis

Active Member
Subscriber
Hi @Shazam023, customarily, the degree to which two or more predictors (independent or X variables) are related to the dependent (Y) variable is expressed in the correlation coefficient R, which is the square root of R-square. In multiple regression, R can assume values between 0 and 1. To interpret the direction of the relationship between variables, look at the signs (plus or minus) of the regression or B coefficients. If a B coefficient is positive, then the relationship of this variable with the dependent variable is positive (e.g., the greater the IQ, the better the grade point average); if the B coefficient is negative, then the relationship is negative (e.g., the smaller the class size, the better the average test scores). Of course, if a B coefficient is equal to 0, then there is no relationship between the variables. Hope that helps:)!
Thank you,
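The sign interpretation above can be sketched in a few lines of NumPy, reusing the IQ and class-size illustration (all numbers here are made-up assumptions for demonstration, not real data):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
iq = rng.normal(100, 15, size=n)
class_size = rng.normal(25, 5, size=n)
# Hypothetical model: GPA rises with IQ and falls with class size.
gpa = 0.02 * iq - 0.05 * class_size + rng.normal(0, 0.2, size=n)

# OLS fit of GPA on an intercept, IQ, and class size.
X = np.column_stack([np.ones(n), iq, class_size])
beta, *_ = np.linalg.lstsq(X, gpa, rcond=None)
b_iq, b_class = beta[1], beta[2]

assert b_iq > 0     # positive B: higher IQ goes with higher GPA
assert b_class < 0  # negative B: larger classes go with lower GPA
```

The recovered B coefficients carry the signs of the relationships, which is exactly the directional reading described above.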
 

Shazam023

New Member
@Deepak Chitnis I understand what you are trying to say, but my point is that R = SQRT(R^2) for single-variable regression. How would you calculate R (rho) when R^2 is given for a multiple regression line? Would you still take SQRT(R^2)? That is what I am asking @David Harper CFA FRM CIPM. He mentioned in the video that we can't do that, so I am looking for the reasoning behind that statement.
 

Dr. Jayanthi Sankaran

Well-Known Member
Hi @Shazam023,

I think this has something to do with perfect multicollinearity between the regressors, where one regressor can be written as a perfect linear function of the other regressors. Because of this, it is impossible to compute the OLS estimates of the regression, and therefore R = SQRT(R^2) cannot be used in multiple regression... However, this can be sorted out under the assumption of imperfect multicollinearity, which does not pose a problem for the theory of OLS estimators. The purpose of OLS is to sort out the independent influences of the various regressors when they are potentially correlated.

Thanks:rolleyes:
Jayanthi
 

David Harper CFA FRM

Subscriber
I think @Deepak Chitnis is asking: in a univariate regression, the coefficient of determination is equal to the square of the correlation coefficient; i.e., R^2 = r^2, so you can retrieve the correlation with SQRT(R^2), or you can retrieve the R^2 by squaring the correlation. But I'm sure what I said in the video is: in a multivariate regression, it's not true that R^2 = r^2. The question is whether you can retrieve the correlation (i.e., r) from the given multivariate R^2. Without taking a closer look, I don't see the obvious math. You can "unlever" a multivariate adjusted R^2 into a multivariate R^2, but taking the square root still gives you R (aka, the coefficient of multiple correlation), not r (the Pearson correlation coefficient). The values are all present, so it seems like it should be solvable; I just don't see it easily, sorry. Thanks,
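If it helps, the distinction can be demonstrated with a minimal NumPy sketch (the data-generating process is an illustrative assumption): the square root of the multivariate R^2 recovers the multiple correlation coefficient R, i.e., the correlation between y and the fitted values, not the pairwise Pearson correlation between y and any single regressor.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)

# OLS fit of y on an intercept, x1, and x2.
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ beta

ss_res = ((y - yhat) ** 2).sum()
ss_tot = ((y - y.mean()) ** 2).sum()
r2 = 1.0 - ss_res / ss_tot

multiple_R = np.sqrt(r2)                 # SQRT(R^2)
r_y_x1 = np.corrcoef(y, x1)[0, 1]        # pairwise Pearson r with one regressor
r_y_yhat = np.corrcoef(y, yhat)[0, 1]    # correlation of y with fitted values

# SQRT(R^2) matches corr(y, yhat) — the multiple correlation —
# but does NOT match the pairwise corr(y, x1).
assert np.isclose(multiple_R, r_y_yhat)
assert abs(multiple_R - abs(r_y_x1)) > 0.05
```

So SQRT(R^2) is still a well-defined quantity in multiple regression, but it answers a different question than any single pairwise rho does.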
 

Shazam023

New Member

So we can't get (rho) from multiple regression analysis, is that it?
Well, my reasoning was that because univariate regression has only the variables X and Y, we can conclude their relationship with each other, i.e. (rho), whereas in multivariate regression we have more than one Xi, so finding each Xi's individual relation with Y is easy, but a combined correlation of the Xi with Y is not possible, because (rho) gives us information about the linear relationship of only two variables, not of multiple variables. @David Harper CFA FRM CIPM
 