afterworkguinness
Active Member
I am probably missing something obvious here, but I can't see it. My notes say the t statistic is calculated as (xbar - mu)/[s/sqrt(n)], but when doing hypothesis tests for the significance of an intercept in linear regression, I've seen it calculated as intercept/standard error. This may be a dumb question, but why the difference?
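To make what I mean concrete, here's a quick numeric sketch of the two calculations as I understand them (all data made up, both statistics computed by hand with numpy):

```python
import numpy as np

# Version 1 from my notes: one-sample t = (xbar - mu) / (s / sqrt(n))
x = np.array([5.1, 4.8, 5.3, 5.0, 4.9, 5.2])
mu = 5.0                                   # hypothesized mean
n = len(x)
t_one_sample = (x.mean() - mu) / (x.std(ddof=1) / np.sqrt(n))

# Version 2 from regression output: t = intercept / standard error
# Simple OLS fit by hand on made-up data
xr = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
b1 = np.cov(xr, y, ddof=1)[0, 1] / np.var(xr, ddof=1)   # slope
b0 = y.mean() - b1 * xr.mean()                          # intercept
resid = y - (b0 + b1 * xr)
s2 = (resid @ resid) / (len(xr) - 2)                    # residual variance
se_b0 = np.sqrt(s2 * (1 / len(xr)
                      + xr.mean() ** 2 / ((xr - xr.mean()) ** 2).sum()))
t_intercept = b0 / se_b0               # no "- mu" term appears here
```

In both cases the denominator is a standard error of the estimate; what puzzles me is why the intercept version has no subtracted hypothesized value in the numerator.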