P2 Focus Review 2nd of 8 (Market Risk)

David Harper CFA FRM

P2 Focus Review, 2nd of 8 (Market Risk): Videos, Practice Questions and Learning Spreadsheets
  • Associated learning spreadsheets are:
    • 2012.T5.c.1. VaR Backtest Jorion
    • 2012.T5.c.2. VaR Mapping Jorion
    • 2012.T5.d.1. Expected Shortfall
    • 2012.T5.d.2. Gaussian Copula
Concepts:
  • Value at Risk (VaR)
  • Backtesting
  • Mapping
  • Risk Metrics (including non-parametric HS VaR) and dependence (Dowd)
  • Extreme Value (EVT)
VaR Backtest
You should understand the mechanics of the backtest: it is a hypothesis test on the observed number of exceptions (each day the VaR is either exceeded or not), a variable that follows a binomial distribution (a series of i.i.d. Bernoulli trials is a binomial). A minimal sketch of the test is below.
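For illustration, here is a minimal sketch (in Python, with hypothetical numbers that are not from Jorion's text) of the normal-approximation version of the binomial backtest on the exception count:

```python
from math import sqrt

def backtest_var(exceptions, days, var_confidence=0.99, z_critical=1.96):
    """Binomial backtest of a VaR model using the normal approximation.
    Rejects the model if the observed exception count is too far from the
    expected count (two-tailed; z_critical = 1.96 for 95% test confidence)."""
    p = 1.0 - var_confidence                     # expected daily exception rate
    expected = p * days                          # expected number of exceptions
    z = (exceptions - expected) / sqrt(p * (1.0 - p) * days)
    return expected, z, abs(z) > z_critical

# Hypothetical example: 10 exceptions over 250 days against a 99% daily VaR
expected, z, reject = backtest_var(exceptions=10, days=250)
print(f"expected ~{expected:.1f} exceptions, z = {z:.2f}, reject the model? {reject}")
```

With 250 days and a 99% VaR we expect about 2.5 exceptions, so 10 exceptions gives a z-score near 4.8 and the model is rejected at 95% test confidence.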

VaR Mapping
Historically, the exam has tested this deep topic only shallowly. As Jorion uses bond mapping as the primary use case, I would be familiar with duration versus cash-flow mapping. Beyond that, a qualitative understanding *might* be sufficient. Notice the examples mostly start with a valuation (pricing) model and use the key pricing variables as risk factors; e.g., the risk factors in an FX futures contract are the same ones we'd need to value it. A small sketch of cash-flow mapping follows.
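As an illustration, here is a minimal sketch of Jorion-style cash-flow mapping (the vertices, present values, vertex risk figures, and correlation are hypothetical, not from the text): each cash flow is present-valued onto a risk-factor vertex, and diversified VaR combines the vertex dollar VaRs through the correlation matrix.

```python
import numpy as np

# Hypothetical mapping of a bond's cash flows onto two zero-rate vertices.
# vertex_var_pct is the VaR per $1 of present value at each vertex.
pv_by_vertex = np.array([4_900_000.0, 93_500_000.0])    # mapped present values ($)
vertex_var_pct = np.array([0.0020, 0.0150])              # hypothetical vertex VaR%
corr = np.array([[1.00, 0.90],
                 [0.90, 1.00]])                           # hypothetical correlation

dollar_var = pv_by_vertex * vertex_var_pct                # dollar VaR per vertex
undiversified_var = dollar_var.sum()                      # assumes perfect correlation
diversified_var = np.sqrt(dollar_var @ corr @ dollar_var) # sqrt(x' R x)

print(f"undiversified VaR = {undiversified_var:,.0f}")
print(f"diversified VaR   = {diversified_var:,.0f}")
```

The point is simply that the mapped position's diversified VaR is smaller than the sum of the vertex VaRs whenever the vertices are less than perfectly correlated.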

Risk Metrics (including non-parametric HS VaR) and Dependence
I did not go deep on non-parametric methods because, frankly, the exam is unlikely to go beyond the superficial with respect to HS variants. Don't get me wrong: I would be generally familiar with why they seek to overcome the limitations of a naive, equally-weighted HS. But more important are:
  • Facility with VaR calculation. Please notice Dowd has a lognormal VaR (see the sketch after this list).
  • Coherence, and in particular: VaR (a quantile) is not subadditive and therefore not coherent, but ES (a conditional average) is both subadditive and coherent.
  • The acute limitations of linear correlation and, conceptually, how copulas overcome them, and at what cost (of course, you won't be asked to calculate a copula). Please make sure you understand why a lack of correlation does not imply independence, although independence does imply zero correlation.
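As referenced above, here is a minimal sketch (with hypothetical parameters) of parametric normal VaR, Dowd's lognormal VaR, and normal expected shortfall, each expressed as a fraction of portfolio value:

```python
from math import exp, sqrt, pi
from statistics import NormalDist

def normal_var(mu, sigma, alpha):
    """Parametric (arithmetic-return) VaR: -mu + sigma * z_alpha."""
    z = NormalDist().inv_cdf(alpha)
    return -mu + sigma * z

def lognormal_var(mu, sigma, alpha):
    """Dowd-style lognormal VaR: losses cannot exceed 100% of value."""
    z = NormalDist().inv_cdf(alpha)
    return 1.0 - exp(mu - sigma * z)

def normal_es(mu, sigma, alpha):
    """Expected shortfall under normality: -mu + sigma * pdf(z_alpha) / (1 - alpha)."""
    z = NormalDist().inv_cdf(alpha)
    pdf_z = exp(-0.5 * z * z) / sqrt(2.0 * pi)
    return -mu + sigma * pdf_z / (1.0 - alpha)

# Hypothetical parameters: mu = 10%, sigma = 30%, 95% confidence
mu, sigma, alpha = 0.10, 0.30, 0.95
print(f"normal VaR    = {normal_var(mu, sigma, alpha):.4f}")
print(f"lognormal VaR = {lognormal_var(mu, sigma, alpha):.4f}")
print(f"normal ES     = {normal_es(mu, sigma, alpha):.4f}")
```

Notice the lognormal VaR is lower than the normal VaR (the lognormal bounds the loss at 100%), and the ES, as the average loss conditional on exceeding the VaR quantile, is larger than the VaR.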
EVT
I never know if/what the FRM will test here; historically, EVT is way over-assigned. I regret AIMs like "Compute VaR and expected shortfall using the POT approach, given various parameter values" because I doubt they will be asked. In my opinion, your essentials are:
  • Why: Why do we need EVT? Why can't we use HS in the extreme tail?
  • Understand that the CLT does not apply, and that the two approaches (POT and block maxima) imply two tail-only limiting distributions (GPD and GEV, respectively) for an i.i.d. dataset of extremes; e.g., the GPD is the distribution implied, in the limit, by characterizing the dataset of losses above some threshold.
  • In regard to the formulas, candidly: I'd save them for later and study them as time permits. And, even then, I'd probably only work with the POT formulas (see the sketch after this list). So far, going deep on these highly technical formulas has not really paid off w.r.t. the exam. There is a good chance you will only need a conceptual understanding; e.g., what is the difference between the block maxima and POT approaches?
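If you do want to practice the POT formulas, here is a minimal sketch of Dowd's POT VaR and expected shortfall; the parameter values below are hypothetical:

```python
def pot_var(u, beta, xi, n, n_u, alpha):
    """Peaks-over-threshold VaR (Dowd):
    u: threshold, beta: GPD scale, xi: GPD shape (tail index),
    n: total observations, n_u: observations exceeding the threshold,
    alpha: VaR confidence level (e.g., 0.99)."""
    return u + (beta / xi) * (((n / n_u) * (1.0 - alpha)) ** (-xi) - 1.0)

def pot_es(u, beta, xi, n, n_u, alpha):
    """Peaks-over-threshold expected shortfall (requires xi < 1)."""
    var = pot_var(u, beta, xi, n, n_u, alpha)
    return var / (1.0 - xi) + (beta - xi * u) / (1.0 - xi)

# Hypothetical parameters: threshold 2%, scale 0.83, shape 0.25,
# 1,000 observations of which 40 exceed the threshold, 99% confidence
var = pot_var(u=2.0, beta=0.83, xi=0.25, n=1000, n_u=40, alpha=0.99)
es = pot_es(u=2.0, beta=0.83, xi=0.25, n=1000, n_u=40, alpha=0.99)
print(f"POT VaR = {var:.3f}%, POT ES = {es:.3f}%")
```

Even if you never memorize these, the structure is worth a glance: the VaR starts at the threshold and adds a GPD tail adjustment, and the ES scales the VaR up by 1/(1 - xi) plus a constant.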
 