sailakshmisuresh
New Member
There was an example in GARP stating "Suppose that an asset price is $60 and that its daily volatility is 2%. This means that a one-standard-deviation move in the asset price over one day would be 60*0.02, or $1.20. If we assume that the change in the asset price is normally distributed, we can be 95% certain that the asset price will be between 60 - 1.96*1.20 = $57.65 and 60 + 1.96*1.20 = $62.35."
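For reference, here is a minimal Python sketch of the arithmetic in the quoted example only (1.96 is the two-sided 95% critical value of the standard normal):

```python
# Arithmetic from the quoted GARP example: 95% interval for a one-day move
price = 60.0          # current asset price ($)
daily_vol = 0.02      # daily volatility (2%)
z_95 = 1.96           # two-sided 95% standard normal critical value

one_sd_move = price * daily_vol          # 60 * 0.02 = $1.20 dollar move per one standard deviation
lower = price - z_95 * one_sd_move       # 60 - 1.96 * 1.20 = 57.648 -> ~$57.65
upper = price + z_95 * one_sd_move       # 60 + 1.96 * 1.20 = 62.352 -> ~$62.35
print(f"95% interval: ${lower:.2f} to ${upper:.2f}")
```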
We have studied that, when building confidence intervals, the standard error is multiplied by the respective critical value. In the example above, since the standard deviation is 2%, shouldn't we divide it by the square root of n to get the standard deviation of the sampling distribution (the standard error, i.e. something like the sampling distribution of the sample standard deviation)? So whatever the sample size might be, shouldn't its square root be used to divide the 2%? I am also unable to understand why the 2% is multiplied by $60. There should be no need to multiply, right, since 0.02 is the standard deviation we already have?
Can we instead write it more like:
60 ± 1.96 × (0.02 / √n)
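For comparison, a sketch that simply evaluates the proposed expression above for an illustrative sample size (n = 10 is purely hypothetical, not from the example); whether this formulation is appropriate here is exactly the question being asked:

```python
import math

# Evaluate the proposed expression 60 +/- 1.96 * (0.02 / sqrt(n))
price = 60.0
daily_vol = 0.02
z_95 = 1.96
n = 10                                   # hypothetical sample size, chosen only for illustration

std_error = daily_vol / math.sqrt(n)     # proposed standard error of the 2%
lower = price - z_95 * std_error
upper = price + z_95 * std_error
print(f"Proposed interval: {lower:.4f} to {upper:.4f}")
```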
I would be grateful if someone could clarify this doubt!
Thanks