The incoherence of risk coherence

RiskNoob

Active Member
Yes, an interesting post indeed.

VaR, which is (still) the most widely used risk measure for various purposes, would already have been replaced by an alternative (coherent) measure such as ES if coherency added a definite advantage in all scenarios.
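Just for concreteness, here is a minimal sketch (mine, not from any text) of how the two measures compare on the same simulated P&L; the 95% level and the Student-t data are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(42)
pnl = rng.standard_t(df=4, size=100_000)  # heavy-tailed P&L, arbitrary units

q = np.quantile(pnl, 0.05)       # 5th percentile of P&L
var = -q                         # 95% VaR: loss exceeded with 5% probability
es = -pnl[pnl <= q].mean()       # 95% ES: average loss beyond the VaR threshold

print(f"95% VaR = {var:.3f}, 95% ES = {es:.3f}")  # ES >= VaR by construction
```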

RiskNoob
 

David Harper CFA FRM

Subscriber
Very interesting, thanks Aleks!

I think it's interesting, first, because of how difficult it can be to grasp the meaning of the coherence rules. For example, I actually think his before-update description of monotonicity is more accurate than his revision; i.e., it is essentially correct that monotonicity is: "If the [my add: expected future] value of portfolio X is always bigger than the value of portfolio Y, then the risk of X is less than or equal to the risk of Y." Monotonicity is apparently tricky enough that even Wilmott, in my opinion, has it mis-stated in 22.9 (Coherence). Dowd has it correctly: if Y ≥ X, then rho(Y) ≤ rho(X). That is, if the expected future value of Y is at least that of X, then Y is no riskier, as we never need to add more cash to Y than to X to make its risk acceptable.
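For reference, here are the four coherence axioms in standard notation (my transcription, following the convention above that X and Y are expected future portfolio values and rho(.) is the cash that must be added):

```latex
\begin{align*}
\text{Monotonicity:} \quad & X \le Y \;\Rightarrow\; \rho(Y) \le \rho(X) \\
\text{Sub-additivity:} \quad & \rho(X + Y) \le \rho(X) + \rho(Y) \\
\text{Positive homogeneity:} \quad & \rho(\lambda X) = \lambda\,\rho(X) \quad (\lambda \ge 0) \\
\text{Translation invariance:} \quad & \rho(X + c) = \rho(X) - c
\end{align*}
```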

I don't see his point, even updated, against homogeneity, which is (I think) the idea that, under an assumption of no frictions (liquidity is excluded), risk scales with leverage. As Wilmott says, "double the position, double the risk."

Most importantly, where is his case against sub-additivity, really? I don't see it. Dowd gives a few reasons why sub-additivity matters in practice, of which I think the most important is: "But if risks are not subadditive, adding them together gives us an underestimate of combined risks, which makes the sum of risks treacherous and therefore effectively useless as a back-of-the-envelope measure." In practice, any time we are aggregating, this could be a big deal.
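To make the aggregation danger concrete, a toy example of my own (not from Dowd): two independent bonds, each with a 4% default probability and a loss of 100 on default. Each bond's 95% VaR is zero, yet the two-bond portfolio's 95% VaR is 100, so summing the stand-alone VaRs understates the combined risk:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000
p_default, loss = 0.04, 100.0

# Independent default indicators for two bonds; P&L is -loss on default, else 0
bond_a = np.where(rng.random(n) < p_default, -loss, 0.0)
bond_b = np.where(rng.random(n) < p_default, -loss, 0.0)

def var95(pnl):
    """95% VaR as the 95th percentile of the loss distribution."""
    return np.quantile(-pnl, 0.95)

print(var95(bond_a))           # ~0: default prob (4%) sits inside the 5% tail
print(var95(bond_b))           # ~0
print(var95(bond_a + bond_b))  # ~100: P(at least one default) = 7.84% > 5%
```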
 

Aleksander Hansen

Well-Known Member
I see your last point about sub-additivity; however, when it comes to homogeneity, I think the assumption of no frictions renders it useless. Double the position, more than double the risk in real life.
 

David Harper CFA FRM

Subscriber
But if that is your criterion, you'd have to deem useless the majority of basic finance models, yes? e.g., BSM, futures COC, SLB CAPM. Never mind risk and performance measures.

The premise of the risk measure, rho(.), is that it answers: "how much cash would need to be added to the position to make the risk acceptable?" Against that premise, the linear scaling in homogeneity seems to me the least problematic of the criteria. The assumption of marketability is par for the course, even if, under homogeneity, the obvious problem is clearly endogenous liquidity. But I don't know why the coherence criteria should bear this burden of realism. As Portfolio Probe concedes, "It seems hard to make a general purpose risk measure that violates this axiom [i.e., he refers here to homogeneity] — but perhaps it is worth a try."
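By way of illustration (my own toy construction, not Portfolio Probe's), a liquidity-adjusted measure that charges a quadratic exit cost does violate homogeneity, which is exactly the endogenous-liquidity point:

```python
Z95 = 1.6449  # standard normal 95th percentile

def normal_var(position, sigma=0.02):
    """Parametric 95% VaR of a linear position: homogeneous of degree one."""
    return position * sigma * Z95

def liquidity_adjusted_var(position, sigma=0.02, impact=1e-8):
    """Toy LVaR: adds an exit cost quadratic in position size (endogenous liquidity)."""
    return normal_var(position, sigma) + impact * position ** 2

for size in (1_000_000, 2_000_000):
    print(f"size={size:>9,}  VaR={normal_var(size):>10,.0f}  "
          f"LVaR={liquidity_adjusted_var(size):>10,.0f}")
# VaR doubles exactly when the position doubles; LVaR more than doubles.
```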
 

Aleksander Hansen

Well-Known Member
I agree that a lot of models would not work under that assumption; however, BSM is not one of them for most ATM options, and the same goes for Black's model. There is often a tradeoff between precision in pricing and the window of opportunity to execute a trade.
 

Aleksander Hansen

Well-Known Member
In real life, though, I have never once had the diversified VaR be less than the sum of the undiversified VaRs. It depends on your portfolio, of course; the most important thing is that you have a deep intuition about how the metrics apply to your portfolio, so as not to get fooled by any one number.
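For anyone who wants to run that comparison themselves, a minimal two-asset sketch (jointly normal returns are my simplifying assumption; with elliptical distributions like the normal, the diversified VaR can never exceed the sum):

```python
import numpy as np

rng = np.random.default_rng(0)
n, corr = 500_000, 0.3
cov = np.array([[0.02**2,            corr * 0.02 * 0.03],
                [corr * 0.02 * 0.03, 0.03**2]])
returns = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
weights = np.array([0.5, 0.5])

def var95(pnl):
    """95% VaR as the 95th percentile of losses."""
    return np.quantile(-pnl, 0.95)

undiversified = sum(var95(w * returns[:, i]) for i, w in enumerate(weights))
diversified = var95(returns @ weights)

print(f"sum of stand-alone VaRs: {undiversified:.4%}")
print(f"diversified VaR:         {diversified:.4%}")
```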

Building financial models that work is hard, since the laws of physics don't apply. One has to embrace the imperfections, but in academia deviating from the microeconomic description of rationality is generally frowned upon, and the behavioral finance guys are no better: if there's a hole in their model, they just find some data that fits the story and plug it in retroactively. When I talked to Derman before the release of his excellent book Models.Behaving.Badly, he made fun of the fact that the "Fundamental theorem of asset pricing" is 10 sentences long, whilst the fundamental theorem of algebra is two lines long. :)
 