DRIFT MODEL: Underlying Logic of Formula for Random Simulated Process

ABSMOGHE

New Member
Hi David,
I am currently studying Tuckman's Art of Term Structure Models: Drift chapter in the Part 2 syllabus.
While taking a look at the spreadsheet you have prepared, I happened to come across the formula for dw in your random simulated process for MODEL 1. The formula was =NORM.S.INV(RAND())*SQRT($D$24). As I understand it, this formula picks a random number as a probability and finds its z-value. However, I do not understand the underlying logic for picking the probability as a random number. Would it not be acceptable to use the formula =RANDBETWEEN(-2.33,2.33)*SQRT($D$24) instead?
Since all we need is a random number of standard deviations around the drift, why do we need to find a random probability and then its z-value, instead of directly randomising the standard deviations?

Also, just to clarify: the reason the rate tree formula uses \( \sigma \times \sqrt{t} \) is because we assume dw = 1, i.e., one standard deviation around the mean, right?


I apologise if the question is a bit stupid. I appreciate your answer on this.
Please find a screen capture of the spreadsheet attached below.

Temp.png
 

ABSMOGHE

New Member
Hi.
I wanted to ask why the square root of dt is multiplied by the standard deviation in drift models.
I was under the impression that the reason for this was the same principle guiding the calculation of VaR for n days, i.e., variance is directly proportional to n, and hence σ would be proportional to the square root of the number of days.
However, if that principle held true, then the formula for the drift after 2 months would be
\[ dr = \frac{2}{12}\lambda + \sigma\sqrt{\frac{2}{12}} \]

However, in actuality the following is the case:
\[ dr = \frac{2}{12}\lambda + 2\,\sigma\sqrt{\frac{1}{12}} \]

Hence, I wanted to inquire: what is actually the case?
Awaiting your response. Thanks :)
 

David Harper CFA FRM

Subscriber
Hi @ABSMOGHE Sure, it's not a stupid question at all. If we use =RANDBETWEEN(-2.33, 2.33), then we get the outcome of a random uniform distribution between -2.33 and 2.33 (strictly, RANDBETWEEN returns only whole numbers, but treat it as continuous for the sake of argument); for example, a quantile near zero is just as likely as a value near 2.33 (!). But the normal is bell-shaped: outliers should be less likely. Also, by the way, this function will never return a value like 2.50, which is beyond the interval, yet the normal distribution can return extreme values, albeit unlikely ones (the normal is asymptotic to the X axis, such that any value is possible).

Instead, the function =NORM.S.INV(RAND()) performs an inverse transformation. The RAND() returns a continuous uniform probability from 0 to 1.0; this represents the cumulative distribution function (aka, CDF). Effectively, we first randomize the area under the probability distribution curve because a random value on [0,1] is "easy"; then we infer the associated normal quantile with NORM.S.INV(probability). But we could assume a different distribution; e.g., =LOGNORM.INV(...).
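If it helps to see the same inverse-transform idea outside of Excel, here is a minimal Python sketch (my own illustration, not part of the learning spreadsheet; it assumes numpy and scipy are available, and uses dt = 1/12 as a stand-in for cell $D$24):

```python
# Minimal sketch of inverse-transform sampling vs. a bounded uniform draw.
# Assumes numpy and scipy; dt = 1/12 is an illustrative monthly step.
import numpy as np
from scipy.stats import norm

dt = 1 / 12                        # illustrative monthly time step (stand-in for $D$24)

# Excel: =RAND() -> a continuous uniform probability on [0, 1); i.e., a random CDF value
p = np.random.rand()

# Excel: =NORM.S.INV(p) -> the standard normal quantile associated with that CDF value
z = norm.ppf(p)

# Excel: =NORM.S.INV(RAND())*SQRT($D$24) -> Tuckman's dw, a normal draw with sd = sqrt(dt)
dw = z * np.sqrt(dt)

# By contrast, a uniform draw on [-2.33, 2.33] (the idea behind the RANDBETWEEN proposal)
# makes a quantile near 2.33 just as likely as one near zero, and can never exceed 2.33
u = np.random.uniform(-2.33, 2.33)
print(dw, u)
```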

With respect to σ*SQRT(T), the square root of Δt is applying the square root rule (aka, SRR): variance scales with time, such that volatility scales with the square root of time. As a general rule, input assumptions are given in per annum terms, including volatility, so the input assumption of 1.60% per annum needs to be translated (scaled) into a monthly volatility with 1.60% * sqrt(1/12). This is a bit confusing because Tuckman actually scales the standard random normal variable first, so that his dw is a random, time-scaled normal variable. But I explain this further here at https://forum.bionicturtle.com/thre...-dw-by-sqrt-1-12-every-month.13745/post-59068 i.e.,
Hi @QuantFFM dw is how Tuckman specifies the models; he scales the random standard normal rather than scaling the annual basis point volatility input (which would be more intuitive to me, too!). But I'm not sure it matters because the essential random shock is the product of three variables:

[random normal Z = N^(-1)(random p)] * σ[annual basis point volatility] * sqrt(Δt/12_months); i.e.,
  • Random normal Z: a random standard normal, by definition µ = 0, σ = 1.0
  • The annual basis point volatility; e.g., 1.60% or 160 basis points per annum. As usual, inputs should be in per annum terms
  • Scaling factor per the usual square root rule (SRR), which assumes i.i.d. Notice I elaborated the full SRR to sqrt(Δt/12_months) because the denominator is whatever the time dimension of the volatility input is; in this case, as usual, per annum = 12 months. So to your second point, of course the 1-year volatility input can be scaled to monthly with 1.60% * SQRT(1 month/12 months) when our tree step is one month (i.e., the numerator) and our input volatility is annual, i.e., 12 months (the denominator). Because we can assume this re-scaled monthly volatility is normal, it is randomized by multiplying by a random normal Z.
It's just the case that the models are specified by (random normal Z) * σ[annual basis point volatility] * sqrt(Δt) = [(random normal Z) * sqrt(Δt)] * σ[annual basis point volatility] = dw * σ[annual basis point volatility], so that dw is not a random standard normal but instead a random normal, with standard deviation of sqrt(Δt), that scales (i.e., is a multiplier on) the annual basis point volatility. To me it's no different from itemizing all three with Z*σ*sqrt(Δt), and if he gains an advantage by using dw, I don't really know what it is! To your point, I don't know why his specification is better than: (random normal Z) * σ[annual basis point volatility] * sqrt(Δt) = (random normal Z) * [sqrt(Δt) * σ(annual basis point volatility)] = Z * σ[t-period basis point volatility].
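To make the equivalence of those groupings concrete, here is a minimal Python sketch (my own illustration under hypothetical inputs, not Tuckman's or the spreadsheet's code: r0 = 5.00%, λ = 24 bps per annum, σ = 160 bps per annum, monthly steps) of Model 1, dr = λ*dt + σ*dw, simulated month by month:

```python
# Month-by-month simulation of dr = lambda*dt + sigma*dw, showing that the three
# groupings of the random shock are the same number. All inputs are hypothetical.
import numpy as np

rng = np.random.default_rng(42)    # seed only so the illustration is reproducible
r0 = 0.05                          # hypothetical initial short rate
lam = 0.0024                       # hypothetical annual drift (24 bps per year)
sigma = 0.0160                     # annual basis point volatility, 160 bps
dt = 1 / 12                        # monthly step
months = 12

r = r0
for _ in range(months):
    z = rng.standard_normal()                  # random standard normal Z
    shock_a = z * sigma * np.sqrt(dt)          # Z * sigma * sqrt(dt)
    shock_b = (z * np.sqrt(dt)) * sigma        # dw * sigma, Tuckman's grouping
    shock_c = z * (sigma * np.sqrt(dt))        # Z * (monthly volatility)
    assert np.isclose(shock_a, shock_b) and np.isclose(shock_a, shock_c)
    r += lam * dt + shock_a                    # dr = lambda*dt + sigma*dw
print(f"simulated 1-year rate: {r:.4%}")
```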


I hope that's helpful!
 

ABSMOGHE

New Member

Hi David,
Thanks a lot for your reply. That was extremely helpful. It seems I had a lucky guess regarding the reason for multiplying by the square root of t, i.e., variance is directly proportional to n, not the standard deviation, as confirmed by you.
However, if that's the case, shouldn't the formula for, say, 2 months from now be

\[ dr = \frac{2}{12}\lambda + \sigma\sqrt{\frac{2}{12}} \]

Instead of

\[ dr = \frac{2}{12}\lambda + 2\,\sigma\sqrt{\frac{1}{12}} \]


The reason I feel this is that if we were calculating VaR, then given a month's VaR, the VaR for 2 months would amount to
\[ VaR_{\text{monthly}} \times \sqrt{2} \]

Really appreciate your reply :)
 

David Harper CFA FRM

Subscriber
Sure, no problem @ABSMOGHE. Yes, you are correct about scaling volatility/VaR under i.i.d., but please note that the monthly simulation (the lower panel in your screenshot above) is a month-by-month simulation (i.e., one month at a time). It could be altered to simulate in two-month intervals (where each step is two months), in which case I do agree with you that we would scale per the square root rule with σ*sqrt(2/12).

However, this is not to be confused with the interest rate tree. Each rate model in Tuckman has both:
  • an illustrated rate tree that shows a sequence of possible one sigma jumps (up or down); and
  • a model for simulating the rate over T-periods by randomizing the rate change, dr
For example, assume σ = 160 bps per above, and assume drift is zero to simplify. The one-year rate tree with twelve monthly steps finishes at its highest node, r[date = 12, state = 12], at r(0) + 12*[0.0160*sqrt(1/12)] = r(0) + 5.543%. Yet the volatility is 1.60%; how can this be? It is because 5.543% is an extremely unlikely outcome; it is much further from the mean than one sigma! The rate tree is merely illustrating step-by-step possibilities.

The square root rule (SRR) very importantly assumes i.i.d.; i.e., independence. You correctly show that the 2-month VaR = sqrt(2) * 1-month VaR, but this (i) assumes independence and (ii) is a two-period measure. The rate tree displays no awareness that a month-by-month simulation (aka, a path through the tree) is very unlikely to go to either extreme, due to the independence, and is more likely to end up nearer to r(0).

(But there is a further nuance which can be confusing: each value is associated with a period. The rate tree is jumping one month at a time; the scaled volatility/VaR is an n-month volatility/VaR. The 1.60% is associated with an annual period, but it is not clear to me that we can even call the 5.543% an annual value.) I hope that's helpful!
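As a small numerical check (my own sketch, not from the learning spreadsheet; it assumes zero drift, σ = 1.60% per annum, and twelve monthly steps as in the example above), the highest node sits about 5.54% above r(0), yet the terminal distribution of month-by-month paths has a standard deviation of about 1.60% and reaches that extreme node with probability 0.5^12 ≈ 0.024%:

```python
# Contrast the tree's extreme node with the distribution of month-by-month paths.
# Assumes zero drift, sigma = 1.60% per annum, twelve monthly +/- one-sigma jumps.
import numpy as np

rng = np.random.default_rng(0)
sigma, dt, steps = 0.0160, 1 / 12, 12
step_vol = sigma * np.sqrt(dt)            # one-month jump size, ~0.4619%

# Highest node of the tree: twelve consecutive up-jumps above r(0)
highest = steps * step_vol                # ~5.543%, a very unlikely path

# Terminal distribution of many simulated paths (each month an independent +/- jump)
paths = rng.choice([-1.0, 1.0], size=(100_000, steps)).sum(axis=1) * step_vol
print(f"highest node:        +{highest:.3%}")
print(f"terminal std dev:    {paths.std():.3%}   (about sigma = 1.60%, per the SRR)")
print(f"P(reaching highest): {(paths >= highest - 1e-12).mean():.6f}   (= 0.5^12, about 0.000244)")
```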
 

ABSMOGHE

New Member


Now it's crystal clear. Thanks a lot David, your reply was really helpful and I appreciate it. Have a good day.
 