Bayesianism — a scientific cul-de-sac

24 October, 2015 at 20:27 | Posted in Statistics & Econometrics | 2 Comments

One of my favourite “problem situating lecture arguments” against Bayesianism goes something like this: Assume you’re a Bayesian turkey and hold a nonzero probability belief in the hypothesis H that “people are nice vegetarians who do not eat turkeys, and every day I see the sun rise confirms my belief.” For every day you survive, you update your belief according to Bayes’ Rule

P(H|e) = [P(e|H)P(H)]/P(e),

where evidence e stands for “not being eaten” and P(e|H) = 1. Given that there do exist other hypotheses than H, P(e) is less than 1 and a fortiori P(H|e) is greater than P(H). Every day you survive increases your probability belief that you will not be eaten. This is totally rational according to the Bayesian definition of rationality. Unfortunately — as Bertrand Russell famously noticed — for every day that goes by, the traditional Christmas dinner also gets closer and closer …
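The daily update described above can be sketched in a few lines of Python. This is a minimal illustration, not anything from the post itself: the prior of 0.5 and the assumption that survival has probability 0.9 under the alternative hypotheses are hypothetical numbers chosen purely to make the arithmetic concrete.

```python
def update(prior_h, p_e_given_h=1.0, p_e_given_not_h=0.9):
    """One application of Bayes' Rule: P(H|e) = P(e|H) P(H) / P(e).

    P(e) is computed by the law of total probability over H and not-H.
    The 0.9 survival probability under not-H is an illustrative assumption.
    """
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

belief = 0.5  # the turkey's initial (hypothetical) nonzero belief in H
for day in range(100):
    belief = update(belief)  # each survived day pushes P(H|e) above P(H)

print(round(belief, 4))  # after 100 days the belief is very close to 1
```

Since P(e|H) = 1 while P(e) < 1, each update strictly increases the belief, which converges toward 1 with every uneventful day — exactly the turkey’s predicament.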

For more on my own objections to Bayesianism:
Bayesianism — a patently absurd approach to science
Bayesianism — preposterous mumbo jumbo
One of the reasons I’m a Keynesian and not a Bayesian
Keynes and Bayes in paradise




  1. I do not think this is a problem. Witness one disconfirming instance of a belief and the belief should change. If you want to get serious about the problem, then you need to address awareness of the conditions that lead to confirming and disconfirming instances of a belief. The point of Bayes’ theorem is that people SHOULD update their beliefs on the basis of evidence and not predict the same old same old after multiple disconfirming instances!

    • “I do not think this is a problem.”

      Indeed it isn’t. This is the classic ‘Laplace’s Sunrises’ objection to Bayesian inference and it’s just as fallacious (and just as much of a zombie argument) as ‘Ravens’, ‘Water & Wine’ etc.

      Whenever someone puts the ‘wrong’ information into the Bayesian computer and then judges it broken for not giving the answer that their Bayesian brain, supplied with the ‘right’ information, did give, a kitten dies.
