T stat in linear regression

afterworkguinness

Active Member
I am probably missing something obvious here, but I can't see it. My notes say the t statistic is calculated as (xbar - mu)/[s/sqrt(n)], but in hypothesis tests for the significance of an intercept in linear regression I've seen it calculated as intercept/standard error. This may be a dumb question, but why the difference?
 

ShaktiRathore

Well-Known Member
Subscriber
In regression analysis, we need to test whether the intercept is significantly different from zero. This is a null hypothesis test with H0: intercept = 0 and Ha: intercept != 0. We calculate the t statistic to see how many standard errors the estimated intercept is away from 0, just as the usual t statistic measures how many standard errors xbar is away from mu. Both cases use the same template: t = (estimate - hypothesized value)/standard error. For the sample mean, the standard error of xbar is s/sqrt(n); for the intercept, the hypothesized value under H0 is 0, so the numerator reduces to the intercept itself, and we divide by the standard error of the intercept estimate.

So the larger the t statistic, the farther the intercept is from 0, and we reject the null and accept Ha that the intercept is significantly different from 0. On the other side, a low t statistic suggests the intercept is close to 0, so we fail to reject H0 that the intercept is approximately 0, which tells us the intercept is insignificant in the regression. The same goes for the slope coefficients in the regression. I hope you understood.
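If it helps to see the parallel in numbers, here is a minimal Python sketch (made-up data; OLS done by hand with numpy, not any particular textbook's example) showing that both t-stats come from the same template (estimate - hypothesized value)/SE:

```python
import numpy as np

rng = np.random.default_rng(42)

# --- One-sample t-stat for a mean: t = (xbar - mu0) / (s / sqrt(n)) ---
x = rng.normal(loc=5.0, scale=2.0, size=50)
mu0 = 0.0                                   # hypothesized mean under H0
xbar, s, n = x.mean(), x.std(ddof=1), len(x)
t_mean = (xbar - mu0) / (s / np.sqrt(n))    # s/sqrt(n) is the SE of xbar

# --- t-stat for a regression intercept: t = (b0 - 0) / SE(b0) ---
X = rng.normal(size=100)
y = 1.5 + 0.8 * X + rng.normal(scale=0.5, size=100)

# OLS via the normal equations: beta = (A'A)^-1 A'y, with A = [1, X]
A = np.column_stack([np.ones_like(X), X])
beta = np.linalg.solve(A.T @ A, A.T @ y)
resid = y - A @ beta
sigma2 = resid @ resid / (len(y) - 2)       # residual variance, df = n - 2
cov_beta = sigma2 * np.linalg.inv(A.T @ A)  # covariance of the estimates
se_b0 = np.sqrt(cov_beta[0, 0])             # SE of the intercept
t_intercept = (beta[0] - 0.0) / se_b0       # same template: (estimate - H0) / SE

print(f"mean t-stat:      {t_mean:.3f}")
print(f"intercept t-stat: {t_intercept:.3f}")
```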

thanks
 

Bryon

Member
Just to add that you can actually test the intercept/coefficients against values other than zero, not just zero. So the standard t-stat formula for testing the mean remains the same: t = (estimate - hypothesized value)/SE. Note also that s/sqrt(n) is just the standard error of the sample mean, so it is consistent with the SE notation.
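For instance, a quick sketch (made-up data and a made-up benchmark) testing a slope against 1 rather than 0:

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=100)
y = 2.0 + 1.1 * X + rng.normal(scale=0.5, size=100)

# Fit OLS via the normal equations, A = [1, X]
A = np.column_stack([np.ones_like(X), X])
beta = np.linalg.solve(A.T @ A, A.T @ y)
resid = y - A @ beta
sigma2 = resid @ resid / (len(y) - 2)
cov_beta = sigma2 * np.linalg.inv(A.T @ A)

# H0: slope = 1 (a non-zero benchmark), Ha: slope != 1
b1_0 = 1.0
t_slope = (beta[1] - b1_0) / np.sqrt(cov_beta[1, 1])
print(f"t-stat vs hypothesized slope of 1: {t_slope:.3f}")
```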
 