On the non-equivalence of Keynesian and Knightian uncertainty (wonkish)

5 February, 2013 at 22:30 | Posted in Economics, Theory of Science & Methodology | 2 Comments

Last year the Bank of England’s Andrew G Haldane and Benjamin Nelson presented a paper with the title Tails of the unexpected. The main message of the paper was that we should not let ourselves be fooled by randomness:

For almost a century, the world of economics and finance has been dominated by randomness. Much of modern economic theory describes behaviour by a random walk, whether financial behaviour such as asset prices (Cochrane (2001)) or economic behaviour such as consumption (Hall (1978)). Much of modern econometric theory is likewise underpinned by the assumption of randomness in variables and estimated error terms (Hayashi (2000)).

But as Nassim Taleb reminded us, it is possible to be Fooled by Randomness (Taleb (2001)). For Taleb, the origin of this mistake was the ubiquity in economics and finance of a particular way of describing the distribution of possible real world outcomes. For non-nerds, this distribution is often called the bell-curve. For nerds, it is the normal distribution. For nerds who like to show-off, the distribution is Gaussian.

The normal distribution provides a beguilingly simple description of the world. Outcomes lie symmetrically around the mean, with a probability that steadily decays. It is well-known that repeated games of chance deliver random outcomes in line with this distribution: tosses of a fair coin, sampling of coloured balls from a jam-jar, bets on a lottery number, games of paper/scissors/stone. Or have you been fooled by randomness?

In 2005, Takashi Hashiyama faced a dilemma. As CEO of Japanese electronics corporation Maspro Denkoh, he was selling the company’s collection of Impressionist paintings, including pieces by Cézanne and van Gogh. But he was undecided between the two leading houses vying to host the auction, Christie’s and Sotheby’s. He left the decision to chance: the two houses would engage in a winner-takes-all game of paper/scissors/stone.

Recognising it as a game of chance, Sotheby’s randomly played “paper”. Christie’s took a different tack. They employed two strategic game-theorists – the 11-year old twin daughters of their international director Nicholas Maclean. The girls played “scissors”. This was no random choice. Knowing “stone” was the most obvious move, the girls expected their opponents to play “paper”. “Scissors” earned Christie’s millions of dollars in commission.

As the girls recognised, paper/scissors/stone is no game of chance. Played repeatedly, its outcomes are far from normal. That is why many hundreds of complex algorithms have been developed by nerds (who like to show off) over the past twenty years. They aim to capture regularities in strategic decision-making, just like the twins. It is why, since 2002, there has been an annual international world championship organised by the World Rock-Paper-Scissors Society.

The interactions which generate non-normalities in children’s games repeat themselves in real world systems – natural, social, economic, financial. Where there is interaction, there is non-normality. But risks in real-world systems are no game. They can wreak havoc, from earthquakes and power outages, to depressions and financial crises. Failing to recognise those tail events – being fooled by randomness – risks catastrophic policy error.

So is economics and finance being fooled by randomness? And if so, how did that happen?

Normality has been an accepted wisdom in economics and finance for a century or more. Yet in real-world systems, nothing could be less normal than normality. Tails should not be unexpected, for they are the rule. As the world becomes increasingly integrated – financially, economically, socially – interactions among the moving parts may make for potentially fatter tails. Catastrophe risk may be on the rise.

If public policy treats economic and financial systems as though they behave like a lottery – random, normal – then public policy risks itself becoming a lottery. Preventing public policy catastrophe requires that we better understand and plot the contours of systemic risk, fat tails and all. It also means putting in place robust fail-safes to stop chaos emerging, the sand pile collapsing, the forest fire spreading. Until then, normal service is unlikely to resume.

Since I think this is a great paper, it merits a couple of comments.

To understand real-world “non-routine” decisions and unforeseeable changes in behaviour, ergodic probability distributions are of no avail. In a world full of genuine uncertainty – where real historical time rules the roost – the probabilities that ruled the past are not those that will rule the future.

Time is what prevents everything from happening at once. To simply assume that economic processes are ergodic and concentrate on ensemble averages – thereby treating them as, a fortiori, timeless in any relevant sense – is not a sensible way of dealing with the kind of genuine uncertainty that permeates open systems such as economies.

When you assume economic processes to be ergodic, ensemble and time averages are identical. Let me give an example: assume we have a market with an asset priced at 100 €, and imagine the price first goes up by 50% and then falls by 50%. The ensemble average for this asset would be 100 €, because here we envision two parallel universes (markets): in one the asset price falls by 50% to 50 €, and in the other it rises by 50% to 150 €, giving an average of 100 € ((150 + 50)/2). The time average for this asset would be 75 €, because here we envision one universe (market) in which the asset price first rises by 50% to 150 € and then falls by 50% to 75 € (0.5 × 150).
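
For readers who prefer to see the numbers, here is a minimal Python sketch (a hypothetical illustration, not taken from the Haldane–Nelson paper) of the same 50%-up-or-down asset. The ensemble average across many parallel one-period markets stays close to 100 €, while a single market followed through time – 100 € to 150 € to 75 € and onwards – almost surely dwindles towards zero:

```python
import random

random.seed(1)

START = 100.0     # initial asset price in €
STEPS = 1_000     # number of periods followed in a single market
PATHS = 100_000   # number of parallel one-period markets for the ensemble view

def one_move(price):
    """One period: the price rises 50% or falls 50% with equal probability."""
    return price * (1.5 if random.random() < 0.5 else 0.5)

# Ensemble average: many parallel universes (markets), each making ONE move.
ensemble_avg = sum(one_move(START) for _ in range(PATHS)) / PATHS
print(f"ensemble average after one period: {ensemble_avg:.1f} €")   # close to 100 €

# Time average: ONE universe (market), followed through many successive moves.
price = START
for _ in range(STEPS):
    price = one_move(price)
print(f"price in a single market after {STEPS} periods: {price:.3g} €")  # close to 0 €
```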

From the ensemble perspective nothing really, on average, happens. From the time perspective lots of things really, on average, happen.

Assuming ergodicity, there would have been no difference at all. What is important about the fact that real social and economic processes are nonergodic is that uncertainty – not risk – rules the roost. That is something both Keynes and Knight basically said in their 1921 books. Thinking about uncertainty in terms of “rational expectations” and “ensemble averages” has had seriously bad repercussions on the financial system.

Knight’s uncertainty concept has an epistemological foundation, while Keynes’s is definitely ontological. Of course this also has repercussions for the issue of ergodicity in a strict methodological and mathematical-statistical sense. I think Keynes’s view is the more warranted of the two.

The most interesting and far-reaching difference between the epistemological and the ontological view is that if one subscribes to the former, Knightian view – as Taleb, Haldane & Nelson and “Black Swan” theorists basically do – one opens the door to the mistaken belief that, with better information and greater computing power, we should somehow always be able to calculate probabilities and describe the world as an ergodic universe. As Keynes convincingly argued, that is ontologically just not possible.

To Keynes, the source of uncertainty lay in the nature of the real – nonergodic – world. It had to do not only, or primarily, with the epistemological fact that we do not know the things that today are unknown, but rather with the much deeper and more far-reaching ontological fact that there often is no firm basis on which we can form quantifiable probabilities and expectations at all.


2 Comments


  1. It is interesting to speculate what Keynes would have thought of economics and chaos theory, because his perception of uncertainty would certainly fit a non-linear dynamic system capable of chaotic behavior, such as the weather, in which the ability to project future events is barred by sensitivity to initial conditions.

    For a comparison of economic and weather forecasting, please see ‘Cloudy with a Chance of Default’ at http://somewhatlogically.com/?p=785. I also strongly recommend George Dyson’s excellent book, ‘Turing’s Cathedral’, a fascinating history of the development of the digital computer.

    It is quite amazing how von Neumann and his colleagues ran simulations of shockwaves for nuclear explosions, began the first weather-forecasting simulations, and simulated evolution at the genetic level, all on the 1950s-era Institute for Advanced Study binary computer with a 40-bit word, storing two 20-bit instructions in each word. The memory was 1024 words (5.1 kilobytes).

  2. Your example with stock prices doesn’t make much sense. In a geometric random-walk model (which is what I assume you’re describing), prices are not stationary, so naturally they cannot be ergodic. However, returns are iid over time, which is the clearest example of an ergodic time series you could get. The time average of such returns will be the same as their ensemble average.

    Regarding expectations, nonergodicity, etc. – sure, the world is a complicated place. Yet people must make their decisions somehow, and for that they must form subjective beliefs about the likelihoods of future events, even if they cannot estimate them from past data. If we reject the formalism of probability theory, what is the alternative way to analyze how people decide under uncertainty?
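
A minimal sketch of the point in the comment’s first paragraph, assuming the same 50%-up-or-down moves as in my example above: the log-returns of such a geometric random walk are iid, so their time average along one long history and their ensemble average across many parallel histories converge to the same number, even though the price level itself is non-stationary.

```python
import math
import random

random.seed(2)

STEPS = 100_000   # periods in one long history (time view)
PATHS = 100_000   # parallel one-period histories (ensemble view)

def log_return():
    """One period's log-return: log(1.5) or log(0.5) with equal probability."""
    return math.log(1.5) if random.random() < 0.5 else math.log(0.5)

# Time average of log-returns along ONE long history.
time_avg = sum(log_return() for _ in range(STEPS)) / STEPS

# Ensemble average of one-period log-returns across many parallel histories.
ensemble_avg = sum(log_return() for _ in range(PATHS)) / PATHS

# Both converge on 0.5*log(1.5) + 0.5*log(0.5) = 0.5*log(0.75), roughly -0.144.
print(f"time average of log-returns:     {time_avg:+.4f}")
print(f"ensemble average of log-returns: {ensemble_avg:+.4f}")
```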

