P1.T2.220. OLS estimators in a multiple regression

David Harper CFA FRM

AIMs: Define, calculate, and interpret measures of fit in multiple regression. Explain the concept of imperfect and perfect multicollinearity and their implications.

Questions:

220.1. Each of the following statements about the adjusted R^2 is true EXCEPT which is false?

a. Adjusted R^2 = 1 - (SSR/TSS)*[(n-1)/(n-k-1)]
b. Adding a regressor (independent variable) always causes the adjusted R^2 to decrease
c. Adjusted R^2 is always less than R^2
d. The adjusted R^2 can be negative

220.2. A multiple regression model, on a small sample of monthly returns for one year, has two regressors and is given by: Y(i) = 10.0 + 1.46*X(1,i) - 0.82*X(2,i) + u(i). The number of observations (n) is 12. The sum of squared residuals (SSR) is 106.0. The total sum of squares (TSS) is 166.0. What are, respectively, the standard error of the regression (SER) and the adjusted R^2?

a. SER = 0.89 and Adjusted R^2 = -0.11
b. SER = 2.25 and Adjusted R^2 = 0.64
c. SER = 3.43 and Adjusted R^2 = 0.22
d. SER = 11.87 and Adjusted R^2 = 0.64
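The two statistics asked for can be sketched as short functions; to avoid giving away the answer, the demonstration below uses deliberately different, hypothetical inputs rather than the question's n, SSR, and TSS:

```python
import math

def ser(ssr, n, k):
    # Standard error of the regression: SER = sqrt(SSR / (n - k - 1))
    return math.sqrt(ssr / (n - k - 1))

def adjusted_r_squared(ssr, tss, n, k):
    # Adjusted R^2 = 1 - (SSR/TSS) * (n - 1)/(n - k - 1)
    return 1 - (ssr / tss) * (n - 1) / (n - k - 1)

# Illustrative inputs: n = 20 observations, k = 2 regressors,
# SSR = 68.0, TSS = 100.0 (not the question's numbers)
print(ser(68.0, 20, 2))                        # 2.0
print(adjusted_r_squared(68.0, 100.0, 20, 2))  # ~0.24
```

Note that both formulas divide by the degrees of freedom n - k - 1, where the "- 1" accounts for the intercept in addition to the k slope coefficients.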

220.3. With respect to a linear regression with multiple regressors, each of the following statements is true EXCEPT which is false?

a. Imperfect multicollinearity implies that we cannot estimate precisely ANY of the partial effects (slope coefficients)
b. Imperfect multicollinearity means that two or more of the regressors are highly correlated
c. In contrast to perfect multicollinearity, imperfect multicollinearity is not necessarily an error but likely just a feature of the OLS
d. The dummy variable trap is an example of perfect multicollinearity
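The dummy variable trap in (d) can be illustrated numerically. In this hypothetical setup, an intercept column plus an exhaustive pair of group dummies (d1 + d2 = 1 in every row) makes one column an exact linear combination of the others:

```python
import numpy as np

# Hypothetical design matrix: intercept plus two exhaustive group dummies
intercept = np.ones(6)
d1 = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])
d2 = 1.0 - d1  # d1 + d2 = 1 = intercept in every row

X = np.column_stack([intercept, d1, d2])

# Perfect multicollinearity: X has 3 columns but only rank 2, so
# X'X is singular and OLS cannot identify unique coefficients
print(np.linalg.matrix_rank(X))  # 2
```

The standard fix is to drop either the intercept or one of the dummies, which restores full column rank.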

Answers:
 