Moments of distribution

luccacerf

New Member
Hi, I'm a little bit confused about how the first moment (k = 1) equals the mean.

i.e., for k = 1: isn't E[(y - m)^1] = E(y) - E(y) = 0 ≠ m?
If you do k = 2, it works out perfectly to the variance formula:

k = 2: E[(y - m)^2] = E(y^2) - [E(y)]^2

Could you guys help me please?

Thanks.
 

ShaktiRathore

Well-Known Member
Subscriber
Hi,
mean is first moment around 0(zero),thus mean=E[(y-0)^1] =E[y]
variance is the second moment around the mean= E[(y-m)^2] = E(y^2) - [E(y)]^2
thanks
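A quick numeric check of both identities above, using plain Python (the sample data is made up purely for illustration):

```python
# Verify: mean = E[y] (the first moment around zero), and
# variance = E[(y - m)^2] = E(y^2) - [E(y)]^2 on a small sample.
y = [2.0, 3.0, 5.0, 7.0, 11.0]
n = len(y)

m = sum(y) / n                                  # first raw moment: E[y]
central_2 = sum((v - m) ** 2 for v in y) / n    # E[(y - m)^2]
raw_2 = sum(v ** 2 for v in y) / n              # E[y^2]

print(m)                # 5.6
print(central_2)        # 10.24 (population variance)
print(raw_2 - m ** 2)   # 10.24, same value via the shortcut formula
```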
 

David Harper CFA FRM

Subscriber
@luccacerf I think Miller is a bit unclear on this, but the central moment (aka moment about the mean, as in centered on the mean) is given by: k-th central moment = E[(X - µ)^k], such that the first central moment (k = 1) is E[(X - µ)^1] = E[X - µ]. Now, an unbiased estimator, by definition, is one where E(X) = µ, so if the estimator is unbiased then E[X - µ] = 0. I hadn't actually thought about it this way, but while the variance is the second central moment, the mean is actually the first raw moment (because the first central moment is zero by definition; see https://en.wikipedia.org/wiki/Moment_(mathematics)). I guess while the central moment is "about the mean," the raw moment is "about the zero," so to speak :rolleyes:

Because the k-th central moment = E[(X - µ)^k], the k-th raw moment = E[(X - 0)^k] = E[X^k], so the 1st raw moment = E[X] = mean. Thanks!
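To see the first central moment vanish in practice, here is a minimal sketch on a simulated sample (the seed, sample size, and Gaussian parameters are arbitrary choices for illustration):

```python
import random

random.seed(42)
x = [random.gauss(10.0, 2.0) for _ in range(100_000)]
n = len(x)

mean = sum(x) / n                          # first raw moment: E[X]
central_1 = sum(v - mean for v in x) / n   # first central moment: E[X - mean]

# By construction, E[X - mean] = 0 up to floating-point error,
# which is why the mean is reported as the first *raw* moment,
# while the first *central* moment is identically zero.
print(mean)       # close to 10
print(central_1)  # ~0
```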
 