Borel’s law and the infinite monkey theorem (wonkish)

27 Sep, 2014 at 10:36 | Posted in Statistics & Econometrics | 3 Comments

Back in 1943, eminent French mathematician Émile Borel published a book titled Les probabilités et la vie, in which he introduced what has been called Borel’s law: “Events with a sufficiently small probability never occur.”

Borel’s law has also been called the infinite monkey theorem since Borel illustrated his thinking using the classic example with monkeys randomly hitting the keys of a typewriter and by chance producing the complete works of Shakespeare:

Such is the sort of event which, though its impossibility may not be rationally demonstrable, is, however, so unlikely that no sensible person will hesitate to declare it actually impossible. If someone affirms having observed such an event we would be sure that he is deceiving us or has himself been the victim of fraud.


Wikipedia gives the historical background and a proof of the theorem:

Variants of the theorem include multiple and even infinitely many typists, and the target text varies between an entire library and a single sentence. The history of these statements can be traced back to Aristotle’s On Generation and Corruption and Cicero’s De natura deorum (On the Nature of the Gods), through Blaise Pascal and Jonathan Swift, and finally to modern statements with their iconic typewriters. In the early 20th century, Émile Borel and Arthur Eddington used the theorem to illustrate the timescales implicit in the foundations of statistical mechanics.

There is a straightforward proof of this theorem. As an introduction, recall that if two events are statistically independent, then the probability of both happening equals the product of the probabilities of each one happening independently. For example, if the chance of rain in Moscow on a particular day in the future is 0.4 and the chance of an earthquake in San Francisco on that same day is 0.00003, then the chance of both happening on that day is 0.4 × 0.00003 = 0.000012, assuming that they are indeed independent.

Suppose the typewriter has 50 keys, and the word to be typed is banana. If the keys are pressed randomly and independently, it means that each key has an equal chance of being pressed. Then, the chance that the first letter typed is ‘b’ is 1/50, and the chance that the second letter typed is ‘a’ is also 1/50, and so on. Therefore, the chance of the first six letters spelling banana is

(1/50) × (1/50) × (1/50) × (1/50) × (1/50) × (1/50) = (1/50)^6 = 1/15,625,000,000,

less than one in 15 billion, but not zero, hence a possible outcome.

From the above, the chance of not typing banana in a given block of 6 letters is 1 − (1/50)^6. Because each block is typed independently, the chance X_n of not typing banana in any of the first n blocks of 6 letters is

X_n = (1 − (1/50)^6)^n.

As n grows, X_n gets smaller. For an n of a million, X_n is roughly 0.9999, but for an n of 10 billion X_n is roughly 0.53 and for an n of 100 billion it is roughly 0.0017. As n approaches infinity, the probability X_n approaches zero; that is, by making n large enough, X_n can be made as small as is desired, and the chance of typing banana approaches 100%.

The same argument shows why at least one of infinitely many monkeys will produce a text as quickly as it would be produced by a perfectly accurate human typist copying it from the original. In this case X_n = (1 − (1/50)^6)^n, where X_n represents the probability that none of the first n monkeys types banana correctly on their first try. When we consider 100 billion monkeys, the probability falls to 0.17%, and as the number of monkeys n increases, the value of X_n – the probability of the monkeys failing to reproduce the given text – approaches zero arbitrarily closely. The limit, for n going to infinity, is zero.

However, for physically meaningful numbers of monkeys typing for physically meaningful lengths of time the results are reversed. If there are as many monkeys as there are particles in the observable universe (10^80), and each types 1,000 keystrokes per second for 100 times the life of the universe (10^20 seconds), the probability of the monkeys replicating even a short book is nearly zero.

Wikipedia
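
For the wonkishly inclined, the numbers in the quoted proof are easy to reproduce. Here is a minimal Python sketch (mine, not Wikipedia’s), under the same assumptions as the quote: a 50-key typewriter, every key equally likely, independent keystrokes.

    # Probability of one six-keystroke block spelling "banana" on a 50-key
    # typewriter: the keystrokes are independent, so the probabilities multiply.
    n_keys = 50
    word = "banana"
    total_blocks = n_keys ** len(word)   # 50^6 = 15,625,000,000 equally likely blocks
    p_block = 1 / total_blocks           # about 6.4e-11, less than one in 15 billion

    # X_n: probability that none of the first n blocks spells "banana".
    for n in (10**6, 10**10, 10**11):
        x_n = (1 - p_block) ** n
        print(f"n = {n:.0e}: X_n ≈ {x_n:.4f}")
    # n = 1e+06: X_n ≈ 0.9999
    # n = 1e+10: X_n ≈ 0.5273
    # n = 1e+11: X_n ≈ 0.0017

As n grows, X_n falls towards zero just as the quote says, although it does so very slowly at first.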
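
A rough back-of-the-envelope check of the quote’s final claim, again a sketch of my own, working in logarithms to avoid floating-point underflow. The 10^80 monkeys, 1,000 keystrokes per second and 10^20 seconds are the figures quoted above; the 100,000-character length of a ‘short book’ is an illustrative assumption of mine.

    import math

    n_monkeys = 10**80            # about the number of particles in the observable universe
    keystrokes_per_second = 1000
    seconds = 10**20              # roughly 100 times the life of the universe
    n_keys = 50
    book_length = 100_000         # assumed character count of a short book

    total_keystrokes = n_monkeys * keystrokes_per_second * seconds   # 10^103 keystrokes in all
    # log10 of the chance that one given block of book_length keystrokes is the book:
    log10_p_block = -book_length * math.log10(n_keys)                # about -169,897
    # With so few attempts relative to 1/p, the success probability is roughly
    # the expected number of hits:
    log10_p_success = math.log10(total_keystrokes) + log10_p_block
    print(log10_p_success)        # about -169,794: a chance of roughly 10^-169794, effectively zero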

For more on Borel’s law and the fact that — still — incredibly unlikely things keep happening, see David Hand’s The Improbability Principle (Bantam Press, 2014).

3 Comments

  1. It’s possible to attack the problem with a Cauchy sequence.
    A sequence

    x_1, x_2, x_3, …

    of real numbers is called a Cauchy sequence, if for every positive real number ε, there is a positive integer N such that for all natural numbers m, n > N

    |x_m − x_n| < ε,

    where the vertical bars denote the absolute value.

    To show that the probability is 0, take a sequence with limit 0 that at each step is greater than or equal to the probability of the rare event.
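
    In symbols, a minimal formalisation of this squeeze argument (my paraphrase, not the commenter’s notation): if E_n is the event that the rare outcome has not occurred after n trials and (a_n) is a sequence with

        0 ≤ P(E_n) ≤ a_n for all n, and a_n → 0 as n → ∞,

    then P(E_n) → 0 as well. For the monkeys one can take a_n = X_n = (1 − (1/50)^6)^n itself, which tends to zero as n grows.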

  2. I always figured the “monkeys typing Shakespeare” thing was a way to help people understand the concept of infinity. When any sensible person without hesitation declares it impossible, I say but the monkeys are not done typing yet.

    Unrelated, but Nicholas Werle’s More than a sum of its parts: A Keynesian epistemology of statistics might be relevant to your discussions of microfoundations.

    http://runningthezoo.com/Running_The_Zoo/A_Keynesian_Epistemology_of_Statistics_files/Werle%20-%20Keynes%27%20Epistemology%20of%20Statistics%20%28JPE%29.pdf

    “The major theoretical insight of Keynes’ General Theory is that aggregate quantities describing the state of an economy as a whole are irreducible to arithmetic summations of individual decisions.”

  3. […] random arrangements of letters could create any form including Shakespeare’s works. In contrast, Emile Borel in 1943 stated Borel’s law that “events with a sufficiently small probability never occur.” […]

