## Probabilistic reductionism

8 September, 2013 at 10:30 | Posted in Theory of Science & Methodology | 3 Comments

Probabilistic reasoning in science – especially Bayesianism – reduces questions of rationality to questions of internal consistency (coherence) of beliefs. But even granting this questionable reductionism, it is not self-evident that rational agents really have to be probabilistically consistent. There is no strong warrant for believing so. Rather, there is strong evidence that we run into serious problems if we let probabilistic reasoning become the dominant method for doing research in the social sciences on problems that involve risk and uncertainty.

In many of the situations that are relevant to economics, one could argue that there is simply not enough adequate and relevant information to ground beliefs of a probabilistic kind, and that in those situations it is not really possible, in any relevant way, to represent an individual’s beliefs in a single probability measure.

Say you have come to learn (based on your own experience and tons of data) that the probability of becoming unemployed in Sweden is 10%. Having moved to another country (where you have no experience of your own and no data) you have no information on unemployment and a fortiori nothing on which to base a probability estimate. A Bayesian would, however, argue that, if you are rational, you have to assign probabilities to the mutually exclusive alternative outcomes and that these have to add up to 1. That is, in this case – and based on symmetry – a rational individual would have to assign probability 50% to becoming unemployed and 50% to becoming employed.

That feels intuitively wrong, though, and I guess most people would agree. Bayesianism cannot distinguish between symmetry-based probabilities derived from information and symmetry-based probabilities derived from an absence of information. In these kinds of situations most of us would rather say that it is simply irrational to be a Bayesian, and better instead to admit that we “simply do not know” or that we feel ambiguous and undecided. Arbitrary and ungrounded probability claims are more irrational than being undecided in the face of genuine uncertainty, so if there is not sufficient information to ground a probability distribution it is better to acknowledge that simpliciter, rather than pretending to possess a certitude that we simply do not possess.
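The conflation can be made concrete with a small sketch (my own illustration, with assumed numbers, not anything from the post): two agents both report a probability of 0.5 for unemployment, but one has updated a uniform Beta prior on a large sample while the other has seen no data at all. The single number is identical in both cases; the spread of the posterior – something close to what Keynes called the “weight” of an argument – is very different.

```python
# Two agents both report P(unemployed) = 0.5, but for different reasons.
# Agent 1: has observed lots of data (say 500 unemployed in a sample of 1000).
# Agent 2: symmetry from pure ignorance (a uniform Beta(1, 1) prior, no data).
# The point estimate cannot show the difference; the full posterior can.

def beta_mean_sd(a: float, b: float) -> tuple[float, float]:
    """Mean and standard deviation of a Beta(a, b) distribution."""
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var ** 0.5

# Informed agent: uniform prior updated on 500 "successes" in 1000 trials.
informed = beta_mean_sd(1 + 500, 1 + 500)
# Ignorant agent: uniform prior, no observations at all.
ignorant = beta_mean_sd(1, 1)

print(informed)  # mean 0.5, small spread (~0.016)
print(ignorant)  # mean 0.5, large spread (~0.289)
```

The single-number report 0.5 is the same for both agents, which is exactly the conflation the post objects to when probabilities are read off “based on symmetry”.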

I think this critique of Bayesianism is in accordance with the views **Keynes** expressed in *A Treatise on Probability* (1921) and *The General Theory* (1936). According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but rational expectations. Sometimes we “simply do not know.” Keynes would not have accepted the view of Bayesian economists, according to whom expectations “tend to be distributed, for the same information set, about the prediction of the theory.” Keynes, rather, thinks that we base our expectations on the confidence or “weight” we put on different events and alternatives. To Keynes, expectations are a question of weighing probabilities by “degrees of belief”, beliefs that have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents modeled by probabilistically reasoning Bayesian economists.

In an interesting article on his blog, **John Kay** shows that these strictures on probabilistic-reductionist reasoning apply not only to everyday life and science, but also to the law:

> English law recognises two principal standards of proof. The criminal test is that a charge must be “beyond reasonable doubt”, while civil cases are decided on “the balance of probabilities”.
>
> The meaning of these terms would seem obvious to anyone trained in basic statistics. Scientists think in terms of confidence intervals – they are inclined to accept a hypothesis if the probability that it is true exceeds 95 per cent. “Beyond reasonable doubt” appears to be a claim that there is a high probability that the hypothesis – the defendant’s guilt – is true. Perhaps criminal conviction requires a higher standard than the scientific norm – 99 per cent or even 99.9 per cent confidence is required to throw you in jail. “On the balance of probabilities” must surely mean that the probability the claim is well founded exceeds 50 per cent.
>
> And yet a brief conversation with experienced lawyers establishes that they do not interpret the terms in these ways. One famous illustration supposes you are knocked down by a bus, which you did not see (that is why it knocked you down). Say Company A operates more than half the buses in the town. Absent other evidence, the probability that your injuries were caused by a bus belonging to Company A is more than one half. But no court would determine that Company A was liable on that basis.
>
> A court approaches the issue in a different way. You must tell a story about yourself and the bus. Legal reasoning uses a narrative rather than a probabilistic approach, and when the courts are faced with probabilistic reasoning the result is often a damaging muddle …
>
> When I have raised these issues with people with scientific training, they tend to reply that lawyers are mostly innumerate and with better education would learn to think in the same way as statisticians. Probabilistic reasoning has become the dominant method of structured thinking about problems involving risk and uncertainty – to such an extent that people who do not think this way are derided as incompetent and irrational …
>
> It is possible – common, even – to believe something is true without being confident in that belief. Or to be sure that, say, a housing bubble will burst without being able to attach a high probability to any specific event, such as “house prices will fall 20 per cent in the next year”. A court is concerned to establish the degree of confidence in a narrative, not to measure a probability in a model.
>
> Such narrative reasoning is the most effective means humans have developed of handling complex and ill-defined problems … Probabilistic thinking … often fails when we try to apply it to idiosyncratic events and open-ended problems. We cope with these situations by telling stories, and we base decisions on their persuasiveness. Not because we are stupid, but because experience has told us it is the best way to cope. That is why novels sell better than statistics texts.
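Kay’s bus illustration reduces to a one-line base-rate calculation (the 60/40 split below is my assumed number, standing in for his “more than half”):

```python
# Assumed market shares: Company A runs 60% of the buses in town.
bus_shares = {"Company A": 0.6, "Company B": 0.4}

# Absent any other evidence, the probability that a given company's bus
# caused the injury is simply that company's share of the buses.
p_company_a = bus_shares["Company A"]

print(p_company_a > 0.5)  # True - "more than one half", yet no court
                          # would find Company A liable on this alone
```

The arithmetic is trivial; Kay’s point is precisely that the court refuses to treat this base rate as proof “on the balance of probabilities”.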

## 3 Comments

[…] Lars Syll has (once again) directed me to a fascinating piece, this time by John Kay. Kay starts the piece by noting that in a recent legal case in Britain the judge was asked to define the term “beyond reasonable doubt” and, as is typical in such cases, refused to do so. The question, however, as Kay notes was not a silly one: in English law criminal cases are decided only if the evidence is “beyond reasonable doubt” while civil cases are decided based on “the balance of probabilities”. […]

Pingback by Probabilities: Keynesian Legal Versus Bayesian Mathematical | Fixing the Economists— 8 September, 2013 #

[…] Grasselli goes on the war path against Lars Syll. Here he simply has not read Syll’s interesting piece at all. He has merely scanned it to pick out easy targets — targets he himself constructs. You see, […]

Pingback by A Response to Matheus Grasselli on Probability and Law | Fixing the Economists— 9 September, 2013 #

Your cartoon shows some reductionist scientists and a reductionist (or maybe just ignorant) mathematician. It seems to predate the insights of Whitehead and his students, Russell and Keynes, and hence what many would think of as ‘mathematics proper’. Turing’s work on morphogenesis, for example, is a mathematical model of emergence, contradicting the previous mechanical reductionist view. (E.g. http://djmarsay.wordpress.com/bibliography/debates/which-type-of-mathematics-in-finance/ ).

You are – unfortunately – correct that the mainstream in many disciplines is still reductionist, and on my blog I try to explain why this is ‘a bad thing’. Your unemployment example is flawed, but I agree with you and Keynes that it is a little odd to use the same representation irrespective of the quality of the evidence for your estimate. For example, prior to the financial crash most people thought that the economic assessments were just as sound as ever, but they were not. Certainly, there is more to my beliefs than just numbers.

Comment by Dave Marsay— 9 September, 2013 #