Hi,
The mean is the first moment about 0 (zero), so mean = E[(y - 0)^1] = E[y].
The variance is the second moment about the mean: variance = E[(y - m)^2] = E[y^2] - (E[y])^2.
Thanks
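Those two identities are easy to check numerically. Here is a minimal sketch (the normal sample and its parameters are just illustrative assumptions) showing that the first raw moment is the mean, and that E[(y - m)^2] agrees with the shortcut E[y^2] - (E[y])^2:

```python
import random

# Hypothetical sample, purely to illustrate the identities above.
random.seed(0)
y = [random.gauss(10, 2) for _ in range(100_000)]
n = len(y)

mean = sum(v ** 1 for v in y) / n                   # first raw moment: E[(y - 0)^1] = E[y]
e_y2 = sum(v ** 2 for v in y) / n                   # second raw moment: E[y^2]
var_shortcut = e_y2 - mean ** 2                     # E[y^2] - (E[y])^2
var_central = sum((v - mean) ** 2 for v in y) / n   # second central moment: E[(y - m)^2]

# The two expressions for the variance agree up to floating-point rounding.
print(mean, var_shortcut, var_central)
```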
@luccacerf I think Miller is a bit unclear on this, but the k-th central moment (aka the moment about the mean, as in centered on the mean) is given by E[(X - µ)^k], so the first central moment (k = 1) is E[(X - µ)^1] = E[X - µ]. Now since µ is by definition E[X], we get E[X - µ] = E[X] - µ = 0. I hadn't actually thought about it this way, but while the variance is the second central moment, the mean is actually the first raw moment (the first central moment is always zero, as shown; see https://en.wikipedia.org/wiki/Moment_(mathematics)). I guess while a central moment is "about the mean", a raw moment is "about zero", so to speak.
Right: just as the k-th central moment is E[(X - µ)^k], the k-th raw moment is E[(X - 0)^k] = E[X^k], so the first raw moment is E[X^1] = E[X] = mean. Thanks!
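The distinction above can be sketched in a few lines; the sample below is a made-up normal draw, and the two helper functions are just hypothetical names for the definitions in this thread. It shows that the first raw moment recovers the mean while the first central moment comes out (numerically) zero:

```python
import random

# Illustrative sample, parameters chosen arbitrarily.
random.seed(1)
x = [random.gauss(5, 3) for _ in range(100_000)]
mu = sum(x) / len(x)  # sample mean

def raw_moment(data, k):
    # k-th raw moment: E[(X - 0)^k] = E[X^k]
    return sum(v ** k for v in data) / len(data)

def central_moment(data, k):
    # k-th central moment: E[(X - mu)^k]
    m = sum(data) / len(data)
    return sum((v - m) ** k for v in data) / len(data)

print(raw_moment(x, 1))      # first raw moment = the mean
print(central_moment(x, 1))  # first central moment ≈ 0, up to rounding
print(central_moment(x, 2))  # second central moment = the variance
```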