Andy Haldane once again. How does he keep coming up with one terrific speech after another? In his latest speech/paper Haldane looks at the history of the normal distribution. He goes into historical detail on how the distribution came about, how and why it was given the name "normal", how it moved from the sciences to the social sciences and finally into economics. And he links all this in typical Haldane style, connecting finance/economics to many things in nature.
Just like his other papers, this one is so good that it is difficult to summarise. At a very broad level, the idea is not to be blinded by the beauty of the normal distribution curve. This has been suggested by many other economists, most notably Taleb.
In fact, there is very little in economics and finance that resembles a normal distribution with thin tails. What we mostly have are fat tails, which make the probability of extreme events much higher. Once finance starts using fat tails, we realise we have been underestimating the costs of many products, and that overall costs and capital ratios should be much higher.
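To put a rough number on that, here is a minimal Python sketch (my own illustration, not from the paper) comparing the tail probabilities of a standard normal with a fat-tailed Student-t with 3 degrees of freedom, which has a closed-form CDF:

```python
import math

def normal_tail(x):
    """P(|Z| > x) for a standard normal variable."""
    return math.erfc(x / math.sqrt(2))

def t3_tail(x):
    """P(|T| > x) for a Student-t with 3 degrees of freedom (closed-form CDF)."""
    u = x / math.sqrt(3)
    cdf = 0.5 + (u / (1 + u * u) + math.atan(u)) / math.pi
    return 2.0 * (1.0 - cdf)

# Beyond four units, the fat-tailed t(3) assigns hundreds of times more
# probability to an extreme move than the normal does.
for k in (2, 4, 6):
    print(f"|x| > {k}: normal {normal_tail(k):.2e} vs t(3) {t3_tail(k):.2e}")
```

The thresholds are in raw units (the t(3) has variance 3, not 1), so this is purely illustrative, but the order-of-magnitude gap in the tails is the point: a "once in a millennium" normal event can be a routine one under fat tails.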
For almost a century, the world of economics and finance has been dominated by randomness. Much of modern economic theory describes behaviour by a random walk, whether financial behaviour such as asset prices (Cochrane (2001)) or economic behaviour such as consumption (Hall (1978)). Much of modern econometric theory is likewise underpinned by the assumption of randomness in variables and estimated error terms (Hayashi (2000)).
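As a toy illustration of that random-walk assumption (mine, not Haldane's), an asset price modelled as a geometric random walk is just an accumulation of independent shocks, so the best forecast of tomorrow's price is today's:

```python
import math
import random

def random_walk_prices(p0=100.0, sigma=0.01, steps=250, seed=1):
    """Simulate a geometric random walk for a price: each day's log-return
    is an independent normal shock, so the path carries no predictable
    pattern for the next step."""
    rng = random.Random(seed)
    prices = [p0]
    for _ in range(steps):
        prices.append(prices[-1] * math.exp(rng.gauss(0.0, sigma)))
    return prices

path = random_walk_prices()  # one hypothetical year of daily prices
```

The parameters here (1% daily volatility, 250 trading days) are placeholder values for illustration only.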
But as Nassim Taleb reminded us, it is possible to be Fooled by Randomness (Taleb (2001)). For Taleb, the origin of this mistake was the ubiquity in economics and finance of a particular way of describing the distribution of possible real world outcomes. For non-nerds, this distribution is often called the bell-curve. For nerds, it is the normal distribution. For nerds who like to show-off, the distribution is Gaussian.
The normal distribution provides a beguilingly simple description of the world. Outcomes lie symmetrically around the mean, with a probability that steadily decays. It is well-known that repeated games of chance deliver random outcomes in line with this distribution: tosses of a fair coin, sampling of coloured balls from a jam-jar, bets on a lottery number, games of paper/scissors/stone. Or have you been fooled by randomness?
He points to this fascinating tale where Sotheby's was fooled by randomness:
In 2005, Takashi Hashiyama faced a dilemma. As CEO of Japanese electronics corporation Maspro Denkoh, he was selling the company’s collection of Impressionist paintings, including pieces by Cézanne and van Gogh. But he was undecided between the two leading houses vying to host the auction, Christie’s and Sotheby’s. He left the decision to chance: the two houses would engage in a winner-takes-all game of paper/scissors/stone.
Recognising it as a game of chance, Sotheby’s randomly played “paper”. Christie’s took a different tack. They employed two strategic game-theorists – the 11-year old twin daughters of their international director Nicholas Maclean. The girls played “scissors”. This was no random choice. Knowing “stone” was the most obvious move, the girls expected their opponents to play “paper”. “Scissors” earned Christie’s millions of dollars in commission.
As the girls recognised, paper/scissors/stone is no game of chance. Played repeatedly, its outcomes are far from normal. That is why many hundreds of complex algorithms have been developed by nerds (who like to show off) over the past twenty years. They aim to capture regularities in strategic decision-making, just like the twins. It is why, since 2002, there has been an annual international world championship organised by the World Rock-Paper-Scissors Society.
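Back to the genuine games of chance in the quote: the sum of many fair coin tosses really does come out approximately normal, by the central limit theorem. A quick sketch (my own illustration):

```python
import random

rng = random.Random(0)

# Sum of 100 fair-coin tosses, repeated 10,000 times. The binomial outcome
# clusters symmetrically around 50, roughly bell-shaped, with standard
# deviation sqrt(100 * 0.5 * 0.5) = 5.
sums = [sum(rng.randint(0, 1) for _ in range(100)) for _ in range(10000)]
mean = sum(sums) / len(sums)
within_one_sd = sum(45 <= s <= 55 for s in sums) / len(sums)
print(f"mean {mean:.1f}, share within one sd of 50: {within_one_sd:.2f}")
```

Strategic play like paper/scissors/stone, by contrast, leaves exploitable regularities, which is exactly what the twins cashed in on.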
He starts by discussing how normality came up in physical systems, then in social systems, and finally in economic systems:
Early models of the economic system developed by Classical economists were qualitative and deterministic. This followed the tradition in Newtonian physics of explaining the world using Classical deterministic laws. Jevons, Walras, Edgeworth and Pareto “transmuted the physics of energy into the social mechanics of utility” (Mirowski (1989)). But in the early part of the 20th century, physics was in the throes of its own intellectual revolution. The emergence of quantum physics suggested that even simple systems had an irreducible random element. In physical systems, Classical determinism was steadily replaced by statistical laws. The natural world was suddenly ruled by randomness.
Economics followed in these footsteps, shifting from models of Classical determinism to statistical laws. The foundations for this shift were laid by Evgeny Slutsky (1927) and Ragnar Frisch (1933). They divided the dynamics of the economy into two elements: an irregular random element or impulse and a regular systematic element or propagation mechanism. This impulse/propagation paradigm remains the centrepiece of macro-economics to this day.
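The impulse/propagation idea is easy to sketch. A first-order autoregression is a hypothetical but standard minimal example (my own, not from the paper): i.i.d. shocks are the irregular impulses, and a persistence coefficient is the regular propagation mechanism that smooths them into cycle-like swings:

```python
import random

def slutsky_frisch_path(rho=0.9, steps=200, seed=42):
    """Minimal impulse/propagation model: y_t = rho * y_{t-1} + e_t.
    The shocks e_t are the irregular random impulses; the coefficient
    rho is the systematic propagation mechanism that turns isolated
    shocks into persistent, cycle-like dynamics."""
    rng = random.Random(seed)
    y = [0.0]
    for _ in range(steps):
        y.append(rho * y[-1] + rng.gauss(0.0, 1.0))
    return y
```

With rho near 1, uncorrelated shocks produce output that looks strikingly like business-cycle data, which was Slutsky's and Frisch's insight.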
At the core of the macro-economic models developed by the Cowles Foundation were two features: least-squares methods for estimation of the economy’s propagation mechanisms and normality of the random impulses to this system. Both would have been familiar to Gauss and Galton. These two elements were in principle separable – the least squares method does not in fact rely on any distributional assumption about errors. But in the years that followed, econometric estimation and normality became inseparable.
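The separability point is worth seeing concretely: least squares is just a minimisation, with no distributional assumption anywhere in it. A self-contained sketch of the simple-regression formulas:

```python
def ols_fit(x, y):
    """Least-squares intercept and slope for y ~ a + b*x. Nothing here
    assumes normal errors: it is pure minimisation of squared residuals.
    Normality only enters later, if one builds exact inference on top."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b
```

The estimator is perfectly well defined under fat-tailed errors too; it is the standard-error and test-statistic machinery layered on top that historically leaned on normality.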
And to finance:
This line of thinking moved effortlessly from economics to finance. Harry Markowitz was a member of the Cowles Commission. In 1952, he wrote a paper which laid the foundations for modern portfolio theory (Markowitz (1952)). In line with his Cowles contemporaries, Markowitz assumed financial returns could be characterised by mean and variance alone – conveniently consistent with normality. That assumption was crucial, for from it followed Markowitz’s mean-variance optimal portfolio rule.
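In the two-asset case the mean-variance machinery collapses to one line. A sketch (mine, with hypothetical numbers) of the minimum-variance portfolio weight:

```python
def min_variance_weight(var1, var2, cov12):
    """Weight on asset 1 in the two-asset minimum-variance portfolio,
    from minimising w^2*var1 + (1-w)^2*var2 + 2*w*(1-w)*cov12 over w."""
    return (var2 - cov12) / (var1 + var2 - 2.0 * cov12)

def portfolio_variance(w, var1, var2, cov12):
    """Variance of a portfolio with weight w on asset 1, 1-w on asset 2."""
    return w * w * var1 + (1 - w) ** 2 * var2 + 2 * w * (1 - w) * cov12

# Hypothetical variances: a risky asset (var 0.04) and a calmer one (var 0.01).
w = min_variance_weight(0.04, 0.01, 0.0)
```

Note that only means, variances and covariances appear anywhere, which is exactly why the framework sits so comfortably with normality: a normal distribution is fully described by those first two moments, and fat tails are invisible to it.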
At around the same time, Kenneth Arrow and Gerard Debreu (1954) were developing the first genuinely general equilibrium economic model. In this Arrow-Debreu world, future states of the world were assumed to have knowable probabilities. Agents’ behaviour was also assumed to be known. The Arrow-Debreu model thereby allowed an explicit price to be put on risk, while ignoring uncertainty. Risky (Arrow) securities could now be priced with statistical precision. These contingent securities became the basic unit of today’s asset pricing models.
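The pricing logic can be sketched in a line: with a full set of Arrow securities, any payoff is priced by summing state prices times state payoffs. The numbers below are my own illustrative placeholders:

```python
def arrow_debreu_price(state_prices, payoffs):
    """Price a security from its payoffs across states of the world, each
    valued at the price of the Arrow security paying 1 in that state."""
    return sum(q * x for q, x in zip(state_prices, payoffs))

# Three hypothetical states. A riskless bond pays 1 in every state, so its
# price is just the sum of the state prices (the discount factor).
q = [0.30, 0.25, 0.40]
bond = arrow_debreu_price(q, [1.0, 1.0, 1.0])
risky = arrow_debreu_price(q, [2.0, 1.0, 0.0])
```

The precision is the point Haldane is making: the construction works only if the states and their probabilities are knowable in advance, i.e. if risk can be quantified and Knightian uncertainty ignored.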
In the period since, the models of Markowitz and Arrow/Debreu, with embedded assumptions of normality, have dominated asset-pricing in economics and finance. In economics, the Arrow/Debreu equilibrium model is the intellectual antecedent of today’s real business cycle models, the dominant macro-economic framework for the past 20 years (for example, Kiyotaki (2011)). Typically, these models have Gaussian-distributed impulses powering a Quetelet-inspired representative agent.
In finance, the dominant pricing models are built on Markowitz mean-variance foundations and the Arrow-Debreu principle of quantifiable risk. They, too, are typically underpinned by normality. For example, the feted Black and Scholes (1973) options-pricing formula, itself borrowed from statistical physics, is firmly rooted in normality. So too are off-the-shelf models of credit risk, such as Vasicek (2002). Whether by accident or design, finance theorists and practitioners had by the end of the 20th century evolved into fully paid-up members of the Gaussian sect.
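The normality is visible right in the formula: the standard normal CDF appears twice in the Black-Scholes call price. A self-contained sketch:

```python
import math

def norm_cdf(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(spot, strike, rate, sigma, maturity):
    """Black-Scholes European call price. Log-normal prices (i.e. normally
    distributed log-returns) are baked in through the two norm_cdf terms."""
    d1 = (math.log(spot / strike) + (rate + 0.5 * sigma ** 2) * maturity) \
        / (sigma * math.sqrt(maturity))
    d2 = d1 - sigma * math.sqrt(maturity)
    return spot * norm_cdf(d1) - strike * math.exp(-rate * maturity) * norm_cdf(d2)

# Textbook parameters: at-the-money call, 5% rate, 20% vol, one year.
price = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)
```

Swap the normal CDF for a fat-tailed alternative and deep out-of-the-money options become far more valuable, which is roughly what post-1987 volatility smiles have been telling practitioners ever since.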
He then goes on to show how we see fat tails in all kinds of distributions, and how that creates plenty of problems. He also discusses ways for finance to get out of this normality obsession. I am not getting into the details; each example is a gem.
In the end:
Normality has been an accepted wisdom in economics and finance for a century or more. Yet in real-world systems, nothing could be less normal than normality. Tails should not be unexpected, for they are the rule. As the world becomes increasingly integrated – financially, economically, socially – interactions among the moving parts may make for potentially fatter tails. Catastrophe risk may be on the rise.
If public policy treats economic and financial systems as though they behave like a lottery – random, normal – then public policy risks itself becoming a lottery. Preventing public policy catastrophe requires that we better understand and plot the contours of systemic risk, fat tails and all. It also means putting in place robust fail-safes to stop chaos emerging, the sand pile collapsing, the forest fire spreading. Until then, normal service is unlikely to resume.
Very very good.
Haldane in top form post-crisis...