Borel’s law and the infinite monkey theorem (wonkish)

27 September, 2014 at 10:36 | Posted in Statistics & Econometrics | 2 Comments

Back in 1943, eminent French mathematician Émile Borel published a book titled Les probabilités et la vie, in which he introduced what has been called Borel’s law: “Events with a sufficiently small probability never occur.”

Borel’s law has also been called the infinite monkey theorem since Borel illustrated his thinking using the classic example with monkeys randomly hitting the keys of a typewriter and by chance producing the complete works of Shakespeare:

Such is the sort of event which, though its impossibility may not be rationally demonstrable, is, however, so unlikely that no sensible person will hesitate to declare it actually impossible. If someone affirms having observed such an event we would be sure that he is deceiving us or has himself been the victim of fraud.


Wikipedia gives the historical background and a proof of the theorem:

Variants of the theorem include multiple and even infinitely many typists, and the target text varies between an entire library and a single sentence. The history of these statements can be traced back to Aristotle’s On Generation and Corruption and Cicero’s De natura deorum (On the Nature of the Gods), through Blaise Pascal and Jonathan Swift, and finally to modern statements with their iconic typewriters. In the early 20th century, Émile Borel and Arthur Eddington used the theorem to illustrate the timescales implicit in the foundations of statistical mechanics.

There is a straightforward proof of this theorem. As an introduction, recall that if two events are statistically independent, then the probability of both happening equals the product of the probabilities of each one happening independently. For example, if the chance of rain in Moscow on a particular day in the future is 0.4 and the chance of an earthquake in San Francisco on that same day is 0.00003, then the chance of both happening on that day is 0.4 × 0.00003 = 0.000012, assuming that they are indeed independent.

Suppose the typewriter has 50 keys, and the word to be typed is banana. If the keys are pressed randomly and independently, it means that each key has an equal chance of being pressed. Then, the chance that the first letter typed is ‘b’ is 1/50, and the chance that the second letter typed is ‘a’ is also 1/50, and so on. Therefore, the chance of the first six letters spelling banana is

(1/50) × (1/50) × (1/50) × (1/50) × (1/50) × (1/50) = (1/50)⁶ = 1/15 625 000 000,

less than one in 15 billion, but not zero, hence a possible outcome.

From the above, the chance of not typing banana in a given block of 6 letters is 1 − (1/50)⁶. Because each block is typed independently, the chance Xₙ of not typing banana in any of the first n blocks of 6 letters is

Xₙ = (1 − (1/50)⁶)ⁿ.

As n grows, Xₙ gets smaller. For an n of a million, Xₙ is roughly 0.9999, but for an n of 10 billion Xₙ is roughly 0.53 and for an n of 100 billion it is roughly 0.0017. As n approaches infinity, the probability Xₙ approaches zero; that is, by making n large enough, Xₙ can be made as small as is desired, and the chance of typing banana approaches 100%.

The same argument shows why at least one of infinitely many monkeys will produce a text as quickly as it would be produced by a perfectly accurate human typist copying it from the original. In this case Xₙ = (1 − (1/50)⁶)ⁿ, where Xₙ represents the probability that none of the first n monkeys types banana correctly on their first try. When we consider 100 billion monkeys, the probability falls to 0.17%, and as the number of monkeys n increases, the value of Xₙ – the probability of the monkeys failing to reproduce the given text – approaches zero arbitrarily closely. The limit, for n going to infinity, is zero.
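The figures quoted above are easy to verify by evaluating Xₙ = (1 − (1/50)⁶)ⁿ directly. A minimal Python sketch (my own check, not part of the Wikipedia text; the names p_block and x_n are just illustrative):

```python
# Probability X_n of NOT typing "banana" in any of the first n independent 6-letter blocks,
# on a 50-key typewriter with keys pressed uniformly at random.
p_block = (1 / 50) ** 6          # chance that a single 6-letter block spells "banana"

def x_n(n: float) -> float:
    """Chance that none of the first n blocks spells 'banana'."""
    return (1 - p_block) ** n

for n in (1e6, 1e10, 1e11):
    print(f"n = {n:.0e}:  X_n ≈ {x_n(n):.4f}")
# n = 1e+06:  X_n ≈ 0.9999
# n = 1e+10:  X_n ≈ 0.5273
# n = 1e+11:  X_n ≈ 0.0017
```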

However, for physically meaningful numbers of monkeys typing for physically meaningful lengths of time the results are reversed. If there are as many monkeys as there are particles in the observable universe (10⁸⁰), and each types 1,000 keystrokes per second for 100 times the life of the universe (10²⁰ seconds), the probability of the monkeys replicating even a short book is nearly zero.

Wikipedia
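The last claim in the quoted passage can be made concrete with a back-of-the-envelope calculation in logarithms. The 100,000-character length assumed for a “short book” below is my own illustrative choice, not a figure from the quote:

```python
import math

KEYS = 50                  # keys on the hypothetical typewriter
BOOK_LEN = 100_000         # assumed length of a "short book" in characters (illustrative)
MONKEYS = 1e80             # roughly the number of particles in the observable universe
RATE = 1_000               # keystrokes per second per monkey
SECONDS = 1e20             # roughly 100 times the age of the universe

# log10 of the chance that one given block of BOOK_LEN keystrokes matches the book
log10_p = -BOOK_LEN * math.log10(KEYS)                                           # ≈ -169,897

# log10 of the total number of keystrokes typed (an upper bound on the attempts)
log10_attempts = math.log10(MONKEYS) + math.log10(RATE) + math.log10(SECONDS)    # = 103

# log10 of the expected number of successful reproductions
print(f"log10(expected successes) ≈ {log10_p + log10_attempts:,.0f}")            # ≈ -169,794
```

With an expected number of successes on the order of 10⁻¹⁶⁹⁷⁹⁴, “nearly zero” is, if anything, an understatement.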

For more on Borel’s law and the fact that — still — incredibly unlikely things keep happening, see David Hand’s The Improbability Principle (Bantam Press, 2014).

Via con me

26 September, 2014 at 20:35 | Posted in Varia | Leave a comment

 

Prayer

26 September, 2014 at 19:11 | Posted in Varia | Leave a comment

 

Eleni Karaindrou’s breathtakingly beautiful “Prayer” from Theo Angelopoulos’s masterpiece The Weeping Meadow. It breaks my heart every time.

Senza una donna

26 September, 2014 at 17:48 | Posted in Varia | Leave a comment

 

INET — unabated faith in mathematical modelling

26 September, 2014 at 11:15 | Posted in Economics | 1 Comment

In the end, very few INET participants engage in a methodological critique that challenges the emphasis on modelling. One exception comes from Tony Lawson, participating at the opening conference in 2010, who is well known for his critique of the dominant economic methodology … Lawson makes an explicit link between the failure of economists to offer insights into the crisis, on the one hand, and the dominant economic methodology, on the other. In particular he points to an excessive preoccupation with mathematical modelling. Lawson’s comments below capture the intellectual tendency characterising INET events so far:

“Very many economists attended the conference, all apparently concerned critically to reconsider the nature of academic economics. It is in such a forum if anywhere that we might hope to find mainstream economists challenging all but the most obviously acceptable aspects of their theories, approaches and activities.

Although George Soros, who sponsors the Institute, shows some awareness that the reliance upon mathematics may at least be something to question … for most of his close associates the idea that there might be something problematic about the emphasis on forms of mathematical technique does not appear even to cross their minds …”

Thus, we find, to round off this section, that although INET is quite explicit about its concern with the state of economics, as well as about its search for alternatives, its overall orientation in the end (or so far) is not on a reduction in the emphasis on mathematical modelling. As things currently stand, the forum continues to show faith in the dominant economic methodological paradigm. …

Overall, we find that despite appearances, many economists across the board have tended to reaffirm their position. They do so primarily by a methodological critique that consists in advocating the development of newer, better mathematical models that this time, allegedly, achieve greater realisticness (i.e. achieve a closer match to reality), promising a greater ability to successfully predict. Representatively, Krugman adopts such a position. …

The question of whether mathematical tools are appropriate is something that, in the circumstances, we might have expected to receive significant attention. But this is not what we have found. Our study suggests rather that, even when recognising their discipline is in crisis, economists continue to take existing methodology as an unquestionable (sacrosanct) given.

Vinca Bigo & Iona Negru

The missing link in Keynes’s General Theory

26 September, 2014 at 08:32 | Posted in Economics | Leave a comment

The cyclical succession of system states is not always clearly presented in The General Theory. In fact there are two distinct views of the business cycle, one a moderate cycle which can perhaps be identified with a dampened accelerator-multiplier cycle and the second a vigorous ‘boom and bust’ cycle … The business cycle in chapter 18 does not exhibit booms or crises …

In chapters 12 and 22, in the rebuttal to Viner, and in remarks throughout The General Theory, a vigorous cycle, which does have booms and crises, is described. However, nowhere in The General Theory or in Keynes’s few post-General Theory articles explicating his new theory are the boom and the crisis adequately defined or explained. The financial developments during a boom that make a crisis likely, if not inevitable, are hinted at but not thoroughly examined. This is the logical hole, the missing link, in The General Theory as it was left by Keynes in 1937 after his rebuttal to Viner … In order to appreciate the full potential of The General Theory as a guide to interpretation and understanding of modern capitalism, we must fill out what Keynes discussed in a fragmentary and casual manner.

Further reasons to reject NAIRU lock, stock, and barrel

25 September, 2014 at 15:19 | Posted in Economics | Leave a comment

This paper has reasserted the Post Keynesian view that unemployment is essentially driven by private investment behaviour. There is a feedback from the labour market via price and wage inflation to the goods market, but it is weak. Without government policy the goods market reactions may even be perverse and, as we are presently reminded, the scope of monetary policy is limited in times of financial crises and in times of deflation. Second, the labour market itself is more adaptive than commonly assumed. The NAIRU is endogenous due to the supply-side effects of capital accumulation and the importance of social norms in wage setting. Thus, there is a well defined NAIRU that determines wage and price inflation (in conjunction with actual unemployment) in the short term, but it is endogenous and changes along with actual unemployment in the medium term. …

While monetary policy exerts some impact on investment decisions, there may be other reasons for private investment to fall below the level necessary for full employment. Keynes himself had famously argued that it is mostly driven by animal spirits, which leaves the economic analyst in the dark as to what actually drives them. To some extent these animal spirits will depend on specific institutional structures and the degree of uncertainty regarding the future evolution of important macroeconomic variables … or corporate governance structures; but overall it is fair to say that investment expenditures cannot be easily reduced to underlying variables.

Our analysis has important policy implications. Rather than regarding the role of the state as having to provide conditions (in the labour market) as close as possible to perfect markets, our analysis highlights the role of the state as a mediator of social conflict and as a stabiliser of economic activity. If the private sector is prone to long-lasting swings in economic activity (due to changes in animal spirits or the aftermath of financial crises) and the NAIRU is endogenous, maintaining employment at a high level in the short run is crucial. To that end monetary policy will in general not be sufficient and an active (counter-cyclical) fiscal policy is needed. Finally, wage policy is crucial in terms of controlling inflation as well as in terms of stabilizing income distribution. Wage flexibility will not cure unemployment … Fiscal policy is the main tool of short run stabilization and wages policy aims at wages growth in line with labour productivity.

Engelbert Stockhammer

‘Natural rate of unemployment’ — a fatal fallacy

25 September, 2014 at 08:48 | Posted in Economics | Leave a comment

It is thought necessary to keep unemployment at a “non-inflation-accelerating” level (“NIARU”) in the range of 4% to 6% if inflation is to be kept from increasing unacceptably. …

The underlying assumption that there is an exogenous NIARU imposing an unavoidable constraint on macroeconomic possibilities is open to serious question on both historical and analytical grounds. Historically, the U.S. enjoyed an unemployment rate of 1.8% for 1926 as a whole with the price level falling, if anything. West Germany enjoyed an unemployment rate of around 0.6% over the several years around 1960, and most developed countries have enjoyed episodes of unemployment under 2% without serious inflation. Thus a NIARU, if it exists at all, must be regarded as highly variable over time and place. It is not clear that estimates of the NIARU have not been contaminated by failure to allow for a possible impact of inflation on employment as well as the impact of unemployment on inflation. A Marxist interpretation of the insistence on a NIARU might be as a stalking horse to enlist the fear of inflation to justify the maintenance of a “reserve army of the unemployed,” allegedly to keep wages from initiating a “wage-price spiral.” One never hears of a “rent-price spiral”, or an “interest-price spiral,” though these costs are also to be considered in the setting of prices. Indeed when the FRB raises interest rates in an attempt to ward off inflation, the increase in interest costs to merchants may well trigger a small price increase. …

Indeed, if we are to control three major macroeconomic dimensions of the economy, namely the inflation rate, the unemployment rate, and the growth rate, a third control is needed that will be reasonably non-collinear in its effects to those of a fiscal policy operating through disposable income generation on the one hand, and monetary policy operating through interest rates on the other.

What may be needed is a method of directly controlling inflation that does not interfere with free market adjustments in relative prices or rely on unemployment to keep inflation in check. Without such a control, unanticipated changes in the rate of inflation, either up or down, will continue to plague the economy and make planning for investment difficult. Trying to control an economy in three major macroeconomic dimensions with only two instruments is like trying to fly an airplane with elevator and rudder but no ailerons; in calm weather and with sufficient dihedral one can manage if turns are made very gingerly, but trying to land in a cross-wind is likely to produce a crash. …

It is important to keep in mind that divergences in the rate of inflation either up or down, from what was previously expected, produce merely an arbitrary redistribution of a given total product, equivalent at worst to legitimized embezzlement, unless indeed these unpredictable variations are so extreme and rapid as to destroy the usefulness of currency as a means of exchange. Unemployment, on the other hand, reduces the total product to be distributed; it is at best equivalent to vandalism, and when it contributes to crime it becomes the equivalent of homicidal arson. In the U.S. the widespread availability of automatic teller machines in supermarkets and elsewhere would make the “shoe-leather cost” of a high but predictable inflation rate quite negligible.

William Vickrey

[h/t Jan Milch]
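Vickrey’s airplane analogy is, in essence, Tinbergen’s counting rule: with linearised responses you need at least as many independent instruments as policy targets. A minimal sketch of that argument (my formalisation, not Vickrey’s):

\[
y = A x + b, \qquad y \in \mathbb{R}^{3}\ \text{(inflation, unemployment, growth)}, \qquad x \in \mathbb{R}^{2}\ \text{(fiscal, monetary)} .
\]

Since \(\operatorname{rank}(A) \le 2\), the attainable outcomes form at most a two-dimensional affine subspace of \(\mathbb{R}^{3}\), so a generic three-dimensional target cannot be reached. Only a third instrument whose column in \(A\) is not a linear combination of the other two (Vickrey’s “reasonably non-collinear” control) raises the rank to three and makes arbitrary targets attainable.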

Ditch the NAIRU!

24 September, 2014 at 21:41 | Posted in Economics | 5 Comments

The most important implication of [the conventional NAIRU equation], however, is that there is no role whatsoever for demand factors in determining equilibrium unemployment. Any attempts by fiscal or monetary policy to permanently move (actual) unemployment away from its equilibrium level u* are doomed to failure. Policy may succeed in temporarily lowering unemployment, thus causing inflation, which in turn will undermine demand and raise unemployment until the equilibrium or “natural” rate of unemployment is reached again.

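The bracketed equation is not reproduced in the excerpt. A standard textbook statement of the accelerationist relation it refers to runs roughly as follows (my notation, not necessarily the authors’ exact formulation):

\[
\Delta \pi_t = -\beta\,(u_t - u^{*}) + \varepsilon_t, \qquad \beta > 0, \qquad u^{*} = f(z_t),
\]

where \(z_t\) collects supply-side “wage-push” variables (benefits, employment protection, bargaining institutions) and aggregate demand appears nowhere in \(u^{*}\): inflation accelerates whenever actual unemployment is pushed below \(u^{*}\), which is why demand policy can have no lasting effect on equilibrium unemployment in this framework.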

Demand will adjust itself to the “natural” level of output, corresponding to the rate of equilibrium unemployment, either passively through the so-called real balance effect or, alternatively, more actively through a policy-administered rise in interest rates; in the latter case, actual unemployment is determined by how large the central bank thinks the NAIRU is. The implication of [the conventional NAIRU equation] is that employment policy should focus exclusively on the labor market (and not on aggregate demand and investment), and above all on the behavior of labor unions and (mostly welfare state-related) wage–push factors. The policy recommendations are straightforward: to reduce unemployment, labor markets have to be deregulated; employment protection, labor taxes, and unemployment benefits have to be reduced; wage bargaining has to be decentralized; and welfare states have to be scaled down … However, although the view that labor market regulation explains OECD unemployment has become widely accepted, particularly in policy circles, it is by no means universally accepted. Serious problems remain …

Even authors working within the orthodox NAIRU approach are unable to explain (changes in long-run) unemployment in terms of only “excessive” labor market regulation. To explain (changes in) u*, most empirical studies consider it necessary to include other, additional “factors which might explain short-run deviations of unemployment from its equilibrium level” … the most important of which are aggregate demand shocks (i.e., import price and real interest rate shocks) and productivity shocks. The inclusion of such “shocks” is not an innocent amendment, because it turns out that a significant part of the OECD unemployment increase during the past three decades must be attributed to these shocks … This is obviously a dissatisfactory state of affairs: in the theoretical analysis, the impact of demand factors on equilibrium unemployment is defined away, but in the empirical analysis it has to be brought back in, not as a structural determinant but rather as an exogenous shock. We argue that this incongruence points to a misspecification of the NAIRU model.

Servaas Storm & C. W. M. Naastepad

Econometrics — still lacking a valid ontological foundation

24 September, 2014 at 15:38 | Posted in Statistics & Econometrics | 1 Comment

Important and far-reaching problems still beset regression analysis and econometrics – many of which basically are a result of an unsustainable ontological view.

Most econometricians have a nominalist-positivist view of science and models, according to which science can only deal with observable regularity patterns of a more or less lawlike kind. Only data matters, and trying to (ontologically) go beyond observed data in search of underlying real factors and relations that generate the data is not admissible. Everything has to take place in the model of the econometric mind, since the real factors and relations, according to the econometric (epistemologically based) methodology, are beyond reach, being, allegedly, both unobservable and unmeasurable. This also means that instead of treating the model-based findings as interesting clues for digging deeper into real structures and mechanisms, they are treated as the end points of the investigation.

As mathematical statistician David Freedman writes in Statistical Models and Causal Inference (2010):

In my view, regression models are not a particularly good way of doing empirical work in the social sciences today, because the technique depends on knowledge that we do not have. Investigators who use the technique are not paying adequate attention to the connection – if any – between the models and the phenomena they are studying. Their conclusions may be valid for the computer code they have created, but the claims are hard to transfer from that microcosm to the larger world …

Given the limits to present knowledge, I doubt that models can be rescued by technical fixes. Arguments about the theoretical merit of regression or the asymptotic behavior of specification tests for picking one version of a model over another seem like the arguments about how to build desalination plants with cold fusion as the energy source. The concept may be admirable, the technical details may be fascinating, but thirsty people should look elsewhere …

Causal inference from observational data presents many difficulties, especially when underlying mechanisms are poorly understood. There is a natural desire to substitute intellectual capital for labor, and an equally natural preference for system and rigor over methods that seem more haphazard. These are possible explanations for the current popularity of statistical models.

Indeed, far-reaching claims have been made for the superiority of a quantitative template that depends on modeling – by those who manage to ignore the far-reaching assumptions behind the models. However, the assumptions often turn out to be unsupported by the data. If so, the rigor of advanced quantitative methods is a matter of appearance rather than substance.
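A toy simulation of the sort of problem Freedman points to: the regression code runs fine and returns a precise, stable coefficient, but the number is only valid for the simulated microcosm, not for the causal question one cares about. Purely illustrative, with an invented data-generating process (not Freedman’s example):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Invented data-generating process: an unobserved factor z drives both x and y,
# while the true causal effect of x on y is zero.
z = rng.normal(size=n)
x = z + rng.normal(size=n)
y = 2.0 * z + rng.normal(size=n)

# Regressing y on x (with an intercept) as if x were the causal driver:
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"estimated 'effect' of x on y: {beta[1]:.2f}")   # ≈ 1.0, not the true 0.0
```

The fitted slope is sharply estimated and entirely reproducible, yet it says nothing about what would happen to y if x were actually intervened on; that knowledge has to come from outside the regression.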

If econometrics is to progress it has to abandon its outdated nominalist-positivist view of science and the belief that science can only deal with observable regularity patterns of a more or less law-like kind. Scientific theories ought to do more than just describe event-regularities and patterns – they also have to analyze and describe the mechanisms, structures, and processes that give birth to these patterns and eventual regularities.
