
Model failures

14 December 2015


Models always fail at inflexion points, says Anthony Hilton

An investment banker who lost his job following the 2008 crash said that he thought his sacking had been justified because he was the pilot at the controls when his bank flew into the mountain. He blamed no one else, but he did have an excuse. He was flying by his instruments.

He was relying on the bank’s value-at-risk models to control the trading desks’ market exposure. Unfortunately, the models proved inadequate when the storm broke. They failed to warn him of the potential exposures, and the losses wiped out the bank’s equity. Were it not for a government bailout, the bank would have failed.
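
Value-at-risk models come in several variants, but the common weakness is visible even in the simplest of them, historical simulation: the estimate is only as good as the window of past returns fed into it. A minimal sketch (in Python; the function and the numbers are purely illustrative, not any bank’s actual model):

    import numpy as np

    def historical_var(returns, confidence=0.99):
        """One-day value-at-risk by historical simulation: the loss
        threshold exceeded on (1 - confidence) of days in the window."""
        # The 1st percentile of past returns, reported as a positive loss.
        return -np.percentile(np.asarray(returns), 100 * (1 - confidence))

    # Two calm years of small daily moves: the model sees nothing else.
    rng = np.random.default_rng(0)
    calm = rng.normal(0.0, 0.01, 500)          # ~1% daily volatility
    print(f"99% one-day VaR: {historical_var(calm):.2%}")   # roughly a 2-3% loss

    # A 6% crash day lies far outside the calm window, so nothing in the
    # model's history would have flagged an exposure on that scale.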

A few months later the then chief financial officer of another investment bank was testifying to a US Senate committee looking into the causes of the financial crisis. Perhaps sensing that his audience of politicians would not follow the detail, he took refuge in the jargon of six-sigma variations – the language of statistics and probability – to explain how his bank had lost money. Sigma here stands for one standard deviation from the mean of a normal distribution, the familiar bell curve: a move of one sigma happens all the time, but a move of six sigma is so improbable it is almost impossible to imagine.

A six-sigma deviation has a probability of roughly one in a billion; for a daily measure, that is something you would expect to see about once in four million years. Yet according to the finance director, during the crisis the bank had such an event not once but on four successive days. The implication was that the bank could not have foreseen something like this happening. Ergo it was blameless.
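
It is worth doing the sums to see how extreme that claim is. A short calculation (a Python sketch, assuming the normal distribution the model itself asserts, independent trading days and roughly 252 of them a year):

    from math import erfc, sqrt

    # One-tailed probability of a move beyond six standard deviations
    # under a normal distribution: P(Z > 6) = erfc(6 / sqrt(2)) / 2.
    p = erfc(6 / sqrt(2)) / 2
    print(f"P(six-sigma day) = {p:.2e}")        # ~9.9e-10, about 1 in a billion

    # Expected wait for a single such day, at ~252 trading days a year.
    print(f"once every {1 / p / 252:,.0f} years")    # ~4 million years

    # Four in succession, if the model were right and days independent:
    print(f"P(four in a row) = {p ** 4:.1e}")   # ~9.5e-37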

There is another explanation, which he did not put forward – that the model was simply wrong when it rated such an event at one in a billion. Perhaps the banks’ officers were at fault after all for relying too much on their models and not thinking for themselves.

Richard Sharp, a former banker and now an external member of the Bank of England’s Financial Policy Committee, appears to share some of these concerns. In October he spoke about some of the things worrying his stability committee: that the world is more in debt now than it was at the time of the crash, and that there appears to be little liquidity in financial markets, which makes them prone to disorderly price swings. His major departure from the usual litany of worries was a comment that the financial world is over-reliant on models and has too much faith in their predictive ability.

Those who design and construct mathematical models do not seem deterred by what is unknown. The result tends to be models that claim a more universal validity for their outputs than is prudent.

All models rely on historic data and the patterns and trends within it. The algorithmic models that drive the smart beta strategies of pension funds, and much of the share trading carried out by hedge funds and similar investors, are all about the continuation of patterns established yesterday and today – none looks to the future. They are therefore very good at trend following, and they can make money for many years, as long as the trend remains intact.
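
The simplest such rule, a moving-average crossover, makes the point that every input is a past price. A minimal sketch (in Python; the window lengths and the synthetic prices are illustrative, not any fund’s actual parameters):

    import numpy as np

    def crossover_signal(prices, fast=20, slow=100):
        """Hold the asset (signal 1) while the short-run average of past
        prices sits above the long-run average, i.e. while the trend is up."""
        prices = np.asarray(prices, dtype=float)
        fast_ma = np.convolve(prices, np.ones(fast) / fast, mode="valid")
        slow_ma = np.convolve(prices, np.ones(slow) / slow, mode="valid")
        n = min(len(fast_ma), len(slow_ma))    # align on the latest prices
        return (fast_ma[-n:] > slow_ma[-n:]).astype(int)

    # A steady, gently noisy uptrend: the rule stays long almost throughout
    # and captures the trend, using nothing but yesterday's prices.
    rng = np.random.default_rng(1)
    up = 100 + 0.1 * np.arange(1000) + rng.normal(0, 0.5, 1000)
    print(f"fraction of days long: {crossover_signal(up).mean():.2f}")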

They are, however, hopeless at inflexion points, when the market or the economic indicators fundamentally change direction. This is why those who run them are so mesmerised by the Federal Reserve and its eventual decision to raise interest rates, which could precipitate just such a shift. Models always fail at inflexion points, and the losses then incurred will most likely wipe out years of gains.
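
Running the same illustrative rule over a series that rises and then turns shows the cost of that lag (again synthetic data; this continues the sketch above and reuses its hypothetical crossover_signal function):

    import numpy as np

    rng = np.random.default_rng(2)
    up = 100 + 0.1 * np.arange(500)
    down = up[-1] - 0.3 * np.arange(1, 401)
    prices = np.concatenate([up, down]) + rng.normal(0, 0.5, 900)

    signal = crossover_signal(prices)              # 1 = long, 0 = out
    returns = np.diff(prices)[-len(signal):]
    pnl = np.cumsum(signal[:-1] * returns[1:])     # trade on yesterday's signal
    print(f"peak P&L {pnl.max():.1f}, final P&L {pnl[-1]:.1f}")
    # The averages lag the turn, so the rule stays long into the fall and
    # surrenders a slice of its gains before the crossover finally exits.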

Strategies that have looked good for five years of low interest rates could implode if rates rise faster than expected, currencies change direction or inflation returns. At this point the nation’s pension funds – and no doubt others – will discover that when strategies underperform, they underperform by a lot. That is a timely reminder that mathematical models do not ‘do’ near misses – which is why those who rely too much on them end up flying into the mountain.

