EWMA model returns


I am reviewing the EWMA model section and found that returns were calculated on a straight (Pt+1/Pt)-1 basis; shouldn't the returns be calculated on a log basis? Also, what's the assumption for the test per se?
Additionally, under Hull's chapter, why would more weight be assigned to recent observations than to older ones? The weight declines as more recent observations are added, and changing λ from 0.95 to 0.85 shouldn't allocate more weight; it should be less, right? Please guide me, as I probably didn't understand completely.

Question 10.10

A company uses an EWMA model for forecasting volatility. It decides to change the parameter λ from 0.95 to 0.85. Explain the likely impact on the forecasts.


Reducing λ from 0.95 to 0.85 means that more weight is given to recent observations and less weight is given to older observations. Volatilities calculated with λ = 0.85 will react more quickly to new information and will bounce around much more than volatilities calculated with λ = 0.95.

David Harper CFA FRM
Staff member
Hi @ziminli1228 Yes, indeed, use of log returns is arguably more consistent (and would be fine). But Hull explains (in both textbooks, I think) that the simple (aka, arithmetic) return approximates the log return. For one-day (daily) periods, the difference tends to be negligible; it does matter for longer periods (per Kevin Dowd's point). So I do not believe we can say that either must be used. It is an issue of distributional assumption and, as a practical matter, log returns can be added over time while simple returns can be summed across the portfolio. Re: what's the assumption for the test per se? The exam must state the assumption. This is the definition on which we have given GARP the most feedback over the past decade (literally dozens of examples have been used). They are well aware that this assumption needs to be given, along with the related issue of compound frequency, which is also an assumption that must be stated.
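To see why the simple return approximates the log return over daily horizons, here is a minimal sketch (the prices are hypothetical, just to illustrate the magnitudes):

```python
import math

# Hypothetical consecutive daily closing prices (not from the thread)
p0, p1 = 100.0, 101.0

simple_ret = p1 / p0 - 1        # arithmetic return: (P[t+1]/P[t]) - 1
log_ret = math.log(p1 / p0)     # continuously compounded (log) return

# Taylor expansion: log(1 + r) = r - r^2/2 + ..., so for a ~1% daily
# move the gap between the two definitions is on the order of r^2/2.
diff = simple_ret - log_ret
```

For a 1% daily move the difference is about 0.005%, which is why the distinction rarely matters at the daily frequency but compounds for longer periods.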

In regard to Hull's question 10.10, EWMA volatility is a recursive formula that is the solution to a series in which the weights (assigned to squared returns) are declining. If λ = 0.95, then in reverse chronological order, the weights are
  • 1 day ago (the previous day) the weight is (1 - 0.95) = 5.0%
  • 2 days ago = 5%*0.95 = 4.75%
  • 3 days ago = 4.75%*0.95 = 4.513%; or using the direct formula for the weight that can be used for any prior (t - n) day: (1-λ)*λ^(n-1) = (1-0.95)*0.95^2 = 4.513%
  • etc ... notice that consecutive weights have a ratio = λ.
If we dial down λ to 0.85, then the weights are:
  • 1 day ago = 15.0%
  • 2 days ago = 15%*0.85 = 12.75%
  • 3 days ago (directly) = .15*.85^2 = 10.84%
Comparing the two, λ = 0.85 assigns greater weights to the more recent returns. This is the point of a lower "smoothing parameter": it is more reactive to the newest returns and clings less persistently to older returns. Effectively, the lower lambda captures a smaller window; e.g., at λ = 0.85, over half the weight is assigned to only the five most recent returns, but λ = 0.95 does not get to half the weight until the 14th return.
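The weight schedule and the "half-weight window" above can be reproduced with a short sketch (function names are mine, just for illustration):

```python
def ewma_weights(lam, n_days):
    """Weight on the n-th most recent squared return: (1 - lam) * lam^(n-1)."""
    return [(1 - lam) * lam ** (n - 1) for n in range(1, n_days + 1)]

def days_to_half_weight(lam):
    """Count how many recent returns it takes for cumulative weight > 50%."""
    total, n = 0.0, 0
    while total < 0.5:
        n += 1
        total += (1 - lam) * lam ** (n - 1)
    return n

w95 = ewma_weights(0.95, 3)   # [0.05, 0.0475, 0.045125] per the list above
w85 = ewma_weights(0.85, 3)   # [0.15, 0.1275, 0.108375]
```

Running `days_to_half_weight(0.85)` returns 5 while `days_to_half_weight(0.95)` returns 14, matching the comparison above: the lower lambda concentrates its weight in a much smaller window of recent returns.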

(Minor note: Hull's use of "forecast" is confusing here. Unlike GARCH, EWMA does not really forecast. I think better here is "updating volatility" and "impact on updates".) I hope that helps,
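The "update" framing can be made concrete with the EWMA recursion itself, σ²(t) = λσ²(t-1) + (1-λ)r²(t-1). A minimal sketch, with hypothetical values chosen to show the lower lambda reacting more to a shock:

```python
def ewma_update(sigma2_prev, ret_prev, lam):
    """One-step EWMA variance update: lam*sigma2 + (1-lam)*r^2."""
    return lam * sigma2_prev + (1 - lam) * ret_prev ** 2

sigma2 = 0.0001   # prior daily variance (i.e., 1% daily volatility)
shock = 0.03      # a large 3% daily return arrives

new95 = ewma_update(sigma2, shock, 0.95)  # 0.00014 -> vol ~1.18%
new85 = ewma_update(sigma2, shock, 0.85)  # 0.00022 -> vol ~1.48%
```

The same shock moves the λ = 0.85 volatility roughly twice as far as the λ = 0.95 volatility, which is exactly the "bounces around much more" behavior in the answer to 10.10.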