Randomness, fat tails and ergodicity – a Keynesian perspective on Knightian uncertainty

9 Jun, 2012 at 16:32 | Posted in Economics, Statistics & Econometrics | 4 Comments

Bank of England’s Andrew G. Haldane and Benjamin Nelson yesterday presented a paper titled Tails of the unexpected at the conference “The Credit Crisis Five Years On: Unpacking the Crisis”, held at the University of Edinburgh Business School.

The main message of the paper was that we should not let ourselves be fooled by randomness:

For almost a century, the world of economics and finance has been dominated by randomness. Much of modern economic theory describes behaviour by a random walk, whether financial behaviour such as asset prices (Cochrane (2001)) or economic behaviour such as consumption (Hall (1978)). Much of modern econometric theory is likewise underpinned by the assumption of randomness in variables and estimated error terms (Hayashi (2000)).

But as Nassim Taleb reminded us, it is possible to be Fooled by Randomness (Taleb (2001)). For Taleb, the origin of this mistake was the ubiquity in economics and finance of a particular way of describing the distribution of possible real world outcomes. For non-nerds, this distribution is often called the bell-curve. For nerds, it is the normal distribution. For nerds who like to show-off, the distribution is Gaussian.

The normal distribution provides a beguilingly simple description of the world. Outcomes lie symmetrically around the mean, with a probability that steadily decays. It is well-known that repeated games of chance deliver random outcomes in line with this distribution: tosses of a fair coin, sampling of coloured balls from a jam-jar, bets on a lottery number, games of paper/scissors/stone. Or have you been fooled by randomness?

In 2005, Takashi Hashiyama faced a dilemma. As CEO of Japanese electronics corporation Maspro Denkoh, he was selling the company’s collection of Impressionist paintings, including pieces by Cézanne and van Gogh. But he was undecided between the two leading houses vying to host the auction, Christie’s and Sotheby’s. He left the decision to chance: the two houses would engage in a winner-takes-all game of paper/scissors/stone.

Recognising it as a game of chance, Sotheby’s randomly played “paper”. Christie’s took a different tack. They employed two strategic game-theorists – the 11-year old twin daughters of their international director Nicholas Maclean. The girls played “scissors”. This was no random choice. Knowing “stone” was the most obvious move, the girls expected their opponents to play “paper”. “Scissors” earned Christie’s millions of dollars in commission.

As the girls recognised, paper/scissors/stone is no game of chance. Played repeatedly, its outcomes are far from normal. That is why many hundreds of complex algorithms have been developed by nerds (who like to show off) over the past twenty years. They aim to capture regularities in strategic decision-making, just like the twins. It is why, since 2002, there has been an annual international world championship organised by the World Rock-Paper-Scissors Society.

The interactions which generate non-normalities in children’s games repeat themselves in real world systems – natural, social, economic, financial. Where there is interaction, there is non-normality. But risks in real-world systems are no game. They can wreak havoc, from earthquakes and power outages, to depressions and financial crises. Failing to recognise those tail events – being fooled by randomness – risks catastrophic policy error.

So is economics and finance being fooled by randomness? And if so, how did that happen?

Normality has been an accepted wisdom in economics and finance for a century or more. Yet in real-world systems, nothing could be less normal than normality. Tails should not be unexpected, for they are the rule. As the world becomes increasingly integrated – financially, economically, socially – interactions among the moving parts may make for potentially fatter tails. Catastrophe risk may be on the rise.

If public policy treats economic and financial systems as though they behave like a lottery – random, normal – then public policy risks itself becoming a lottery. Preventing public policy catastrophe requires that we better understand and plot the contours of systemic risk, fat tails and all. It also means putting in place robust fail-safes to stop chaos emerging, the sand pile collapsing, the forest fire spreading. Until then, normal service is unlikely to resume.

Since I think this is a great paper, it merits a couple of comments.

To understand real world “non-routine” decisions and unforeseeable changes in behaviour, ergodic probability distributions are of no avail. In a world full of genuine uncertainty – where real historical time rules the roost – the probabilities that ruled the past are not those that will rule the future.

Time is what prevents everything from happening at once. Simply to assume that economic processes are ergodic and to concentrate on ensemble averages – which are, a fortiori, in any relevant sense timeless – is not a sensible way of dealing with the kind of genuine uncertainty that permeates open systems such as economies.

When you assume economic processes to be ergodic, ensemble and time averages are identical. Let me give an example: assume we have a market with an asset priced at 100 €. Then imagine the price first goes up by 50% and later falls by 50%. The ensemble average for this asset would be 100 €, because here we envision two parallel universes (markets): in one the asset price falls by 50% to 50 €, and in the other it rises by 50% to 150 €, giving an average of 100 € ((150 + 50)/2). The time average for this asset would be 75 €, because here we envision one single universe (market) in which the asset price first rises by 50% to 150 € and then falls by 50% to 75 € (0.5 × 150).
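The arithmetic is easy to check. Here is a minimal sketch in Python – nothing here is from the Haldane & Nelson paper, it simply spells out the example above and then repeats the up-then-down step to show how quickly a single time path drifts away from the ensemble mean:

```python
# Ensemble average: two parallel markets, one where the 100 € asset
# rises by 50% and one where it falls by 50%.
ensemble_average = (100 * 1.5 + 100 * 0.5) / 2
print(ensemble_average)   # 100.0 – "on average nothing happens"

# Time average: one single market where the price first rises by 50%
# and then falls by 50%.
single_path = 100 * 1.5 * 0.5
print(single_path)        # 75.0 – following one history, a quarter is gone

# Repeating the up-then-down pair makes the gap dramatic: the ensemble
# mean stays at 100 €, while each pair multiplies a single history by 0.75.
price = 100.0
for _ in range(25):
    price *= 1.5   # up 50%
    price *= 0.5   # down 50%
print(round(price, 2))    # ≈ 0.08 € after 25 up/down pairs
```

The single history ends up nowhere near the ensemble mean – which is precisely the point: only in an ergodic world could the two be used interchangeably.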

From the ensemble perspective nothing really, on average, happens. From the time perspective lots of things really, on average, happen.

Assuming ergodicity, there would have been no difference at all. What is important about the fact that real social and economic processes are nonergodic is that uncertainty – not risk – rules the roost. That was something both Keynes and Knight basically said in their 1921 books. Thinking about uncertainty in terms of “rational expectations” and “ensemble averages” has had seriously bad repercussions on the financial system.
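Stated a bit more compactly (my notation, not taken from the Haldane & Nelson paper): if the price is multiplied each period by a factor r that is either 1.5 or 0.5 with equal probability, the two perspectives give

\[
\text{ensemble: } \mathbb{E}[r] = \tfrac{1}{2}(1.5 + 0.5) = 1,
\qquad
\text{time: } (1.5 \times 0.5)^{1/2} = \sqrt{0.75} \approx 0.866 .
\]

The ensemble mean is unchanged period after period, while a typical single history shrinks by roughly 13% per period. Only when these two growth factors coincide – the ergodic case – can ensemble averages stand in for what happens along an actual path through time.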

Knight’s uncertainty concept has an epistemological foundation, while Keynes’s definitely has an ontological one. Of course this also has repercussions for the issue of ergodicity in a strict methodological and mathematical-statistical sense. I think Keynes’s view is the more warranted of the two.

The most interesting and far-reaching difference between the epistemological and the ontological view is that if one subscribes to the former, Knightian view – as Taleb, Haldane & Nelson and “black swan” theorists basically do – one opens the door to the mistaken belief that, with better information and greater computing power, we should somehow always be able to calculate probabilities and describe the world as an ergodic universe. As Keynes convincingly argued, that is ontologically just not possible.

To Keynes the source of uncertainty was in the nature of the real – nonergodic – world. It had to do not only – or primarily – with the epistemological fact of our not knowing the things that today are unknown, but rather with the much deeper and more far-reaching ontological fact that there often is no firm basis on which we can form quantifiable probabilities and expectations at all.

4 Comments

  1. Very good piece. I wrote a response here which I think is provocative:

    Born Blind: Lars Syll, Uncertainty and the Question of Truth Versus Relativism

    • Nice blogpost indeed, Philip.

      I especially like your rephrased characterization of the Knight/Keynes divide:

      “The Knightian universe is one in which the possibility for knowledge of truly uncertain entities potentially exists, while the Keynesian universe is one in which the possibility for knowledge of truly uncertain entities is a priori ruled out. The Knightian perspective implicitly postulates that somewhere out there is a place or an entity that contains such knowledge (one might point out that this is somewhat similar to a concept of an omnipotent God) while the Keynesian perspective postulates that this place or entity either does not exist or is ontologically inaccessible to us mortals.”

      Although I have no objections at all to the first part of your post, I’m more dubious about the second part. I’m not convinced that the Hegelian/Rationalist dichotomy is the most elucidating one for understanding Keynes’s thought on economics and probability. As I have argued in my book “John Maynard Keynes” (SNS 2007 – in Swedish only, unfortunately …), Keynes was in many regards an economist with close affinities to critical realist views (as was arguably also Marx).

      Even if I consider myself a sort-of-Chapter-12-Keynesian, I think one also has to remember that Keynes’s insistence on conventions, norms, animal spirits and systems of beliefs, when it comes to investment, was not meant as a substitute for calculations based on anticipations and profit expectations, but rather as an important complement (as a rule sidestepped by mainstream economics).

      • Lars, I would be interested in reading your book. Unfortunately, I do not speak Swedish!

        A few comments though. While the Keynes question might be up in the air, I think that there’s a strong indication that Marx was a social constructivist. He was quite manifestly also a Hegelian (although I don’t think that he was a very good one). Nevertheless, he did seem to take the view that people were “determined” by the overarching social structures of which they form a part.

        I should say from the outset that although I believe this is broadly the correct approach, I think Marx and those who followed him were too radical in their belief that humans are infinitely mutable. I think there’s a great deal of misunderstanding here in the current debates over so-called postmodernism. Radicals, like Althusser and Deleuze, followed Marx on this in what I would argue to be a sort of total social constructivist viewpoint. Non-radicals, however, like for example Lacan and Levi-Strauss, did not. You find a lot of biological determinism in Lacan and Levi-Strauss, as well as Universals like the incest taboo. I think that the radicals have come to represent postmodernism for many, even though there is an entirely different current within this movement.

        Regarding the anticipations etc., this all comes back to something like the investment multiplier (accelerator). The investment multiplier gives us a fantastic explanation for why firms are investing — until it doesn’t. Taken on its own the investment multiplier should extend to infinity, but it never does. There is always some external “event”, and this is usually tied to expectations. It seems to me — as someone who does empirical research and follows the financial markets — that expectations are by far the most important determinant of economic activity.

  2. […] I am still making my way through them I have come across one in particular that I think raises a very interesting issue, namely the difference between Keynesian and Knightian […]


