Modern macroeconomics – like Hamlet without the Prince

31 May, 2013 at 19:13 | Posted in Economics | Comments Off on Modern macroeconomics – like Hamlet without the Prince

Something is rotten in the state of macroeconomics … 

And its miserable state should come as no surprise for those of you who regularly follow yours truly’s blog.

Forecasting is by its nature a hit-and-miss affair; economics is not—despite the apparent dogmatic certainty of some of its practitioners—an exact science. But the track record of the profession in recent years—and last year in particular—is dire. Few economists spotted the boom and most hopelessly underestimated the bust. And it’s not as if the profession’s troubles in 2012 were limited to longer-range forecasts; it was getting it wrong virtually in real time, with most forecasters forced to slash their projections every few months as each quarter turned out worse than expected …

What the dismal science’s dismal record suggests is that there is something profoundly wrong with the mainstream economics profession’s understanding of how modern economies work. The models on which its forecasts are built are clearly badly flawed …

But the most important contribution to the debate is an essay by Claudio Borio, deputy head of the monetary and economics department at the Bank for International Settlements, published last month and titled “The Financial Cycle and Macroeconomics: What have we learned?”

In Mr. Borio’s view, the “New Keynesian Dynamic Stochastic General Equilibrium” model used by most mainstream forecasters is flawed because it assumes the financial system is frictionless: Its role is simply to allocate resources and therefore can be ignored. Although many economists now accept these assumptions are wrong, efforts to modify their models amount to little more than tinkering. What is needed is a return to out-of-fashion insights influential before World War II and kept alive since by maverick economists such as Hyman Minsky and Charles Kindleberger that recognized the central importance of the financial cycle.

Mainstream economists have been so fixated on understanding ordinary business cycles that they ignored the role that years of rising asset prices and financial sector liberalization can play in fueling credit booms. They lost sight of the fact that the financial system does more than allocate resources: It creates money—and therefore purchasing power—every time it extends a loan.
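The loan-creates-deposit point can be made concrete with a toy balance sheet. This is a deliberately simplified illustration, not a model of any actual bank; the class and its names are made up for the sketch:

```python
# Toy double-entry sketch: extending a loan simultaneously creates a
# deposit, so credit growth adds to broad money - no prior saving is
# transferred from anyone. Purely illustrative.
class ToyBank:
    def __init__(self):
        self.loans = 0.0      # asset side
        self.deposits = 0.0   # liability side (counted in broad money)

    def extend_loan(self, amount):
        # One double entry: a new loan (asset) matched by a new deposit
        # (liability) credited to the borrower's account.
        self.loans += amount
        self.deposits += amount

bank = ToyBank()
bank.extend_loan(100.0)
print(bank.loans, bank.deposits)  # 100.0 100.0 - new purchasing power
```

The point of the sketch is only that the deposit springs into existence with the loan, which is exactly the purchasing-power creation the paragraph above describes.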

“Macroeconomics without the financial cycle is like Hamlet without the Prince,” according to Mr. Borio.

Simon Nixon


My new book is out

30 May, 2013 at 07:48 | Posted in Theory of Science & Methodology | Comments Off on My new book is out

Economics is a discipline with the avowed ambition to produce theory for the real world. But it fails in this ambition, Lars Pålsson Syll asserts in Chapter 12, at least as far as the dominant mainstream neoclassical economic theory is concerned. Overly confident in deductivistic Euclidian methodology, neoclassical economic theory lines up series of mathematical models that display elaborate internal consistency but lack clear counterparts in the real world. Such models are at best unhelpful, if not outright harmful, and it is time for economic theory to take a critical realist perspective and explain economic life in depth rather than merely modeling it axiomatically.

The state of economic theory is not as bad as Pålsson Syll describes, Fredrik Hansen retorts in Chapter 13. Looking outside the mainstream neoclassical tradition, one can find numerous economic perspectives that are open to other disciplines and manifest growing interest in methodological matters. He is confident that theoretical and methodological pluralism will be able to refresh the debate on economic theory, particularly concerning the nature of realism in economic theory, a matter about which Pålsson Syll and Hansen clearly disagree.

What is theory? consists of a multidisciplinary collection of essays that are tied together by a common effort to tell what theory is, and paired as dialogues between senior and junior researchers from the same or allied disciplines to add a trans-generational dimension to the book’s multidisciplinary approach.

The book has mainly been designed for master’s degree students and postgraduates in the social sciences and the humanities.

On the impossibility of predicting the future

28 May, 2013 at 20:02 | Posted in Statistics & Econometrics | 1 Comment


Capturing causality in economics (wonkish)

27 May, 2013 at 14:12 | Posted in Economics, Theory of Science & Methodology | Comments Off on Capturing causality in economics (wonkish)

A few years ago Armin Falk and James Heckman published an acclaimed article titled “Lab Experiments Are a Major Source of Knowledge in the Social Sciences” in the journal Science. The authors – both renowned economists – argued that field experiments and laboratory experiments face basically the same problems of generalizability and external validity, and that, a fortiori, neither can be said to be better than the other.

What strikes me when reading both Falk & Heckman and advocates of field experiments – such as John List and Steven Levitt – is that field studies and experiments are both very similar to theoretical models. They all share the same basic problem: they are built on rather artificial conditions and face a “trade-off” between internal and external validity. The more artificial the conditions, the greater the internal validity – but also the lower the external validity. The more we rig experiments/field studies/models to avoid “confounding factors”, the less the conditions are reminiscent of the real “target system”. To that extent I also believe Falk & Heckman are right in their comments on the field vs. lab discussion in terms of realism – the nodal issue is not realism as such, but how economists using different isolation strategies in different “nomological machines” attempt to learn about causal relationships. In contrast to Falk & Heckman and to advocates of field experiments such as List and Levitt, I doubt the generalizability of both research strategies, because the probability is high that causal mechanisms differ across contexts, and lack of homogeneity/stability/invariance does not give us warranted export licenses to the “real” societies or economies.

If you mainly conceive of experiments or field studies as heuristic tools, the dividing line between, say, Falk & Heckman and List or Levitt is probably difficult to perceive.

But if we see experiments or field studies as theory tests or models that ultimately aspire to say something about the real “target system”, then the problem of external validity is central (and was for a long time also a key reason why behavioural economists had trouble getting their research results published).

Assume that you have examined how the work performance of Chinese workers (A) is affected by a certain “treatment” (B). How can we extrapolate/generalize to new samples outside the original population (e.g. to the US)? How do we know that any replication attempt “succeeds”? How do we know when these replicated experimental results can be said to justify inferences made in samples from the original population? If, for example, P(A|B) is the conditional density function for the original sample, and we are interested in making an extrapolative prediction of E[P(A|B)], how can we know that the new sample’s density function is identical with the original? Unless we can give some really good argument for this being the case, inferences built on P(A|B) do not really say anything about the target system’s P′(A|B).

As I see it, this is the heart of the matter. External validity/extrapolation/generalization is founded on the assumption that we can make inferences based on P(A|B) that are exportable to other populations for which P′(A|B) applies. Sure, if one can convincingly show that P and P′ are similar enough, the problems are perhaps surmountable. But arbitrarily introducing functional specification restrictions of the type invariance/stability/homogeneity is, at least for an epistemological realist, far from satisfactory. And unfortunately this is often exactly what I see when I look at neoclassical economists’ models/experiments/field studies.
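The worry about exporting P(A|B) to a population with a different P′(A|B) can be put in a small simulation. Everything below is invented for the sketch; the “treatment effect” is simply assumed to differ between the two populations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: the conditional relation P(A|B) estimated in
# the studied population need not hold in the target population, because
# the causal mechanism (here just an assumed effect size) differs.
def draw(n, effect, rng):
    b = rng.integers(0, 2, n)                    # treatment indicator B
    a = b * effect + rng.normal(0.0, 1.0, n)     # outcome A
    return a, b

a_orig, b_orig = draw(10_000, effect=2.0, rng=rng)      # original sample
a_target, b_target = draw(10_000, effect=0.5, rng=rng)  # target system

# E[A|B=1] - E[A|B=0] estimated separately in each population
est_orig = a_orig[b_orig == 1].mean() - a_orig[b_orig == 0].mean()
est_target = a_target[b_target == 1].mean() - a_target[b_target == 0].mean()

print(f"effect in original sample: {est_orig:.2f}")
print(f"effect in target system:  {est_target:.2f}")
```

Both estimates are internally valid, yet exporting the first to the second population would be badly wrong – which is all the “export licence” point amounts to.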

By this I do not mean to say that empirical methods per se are so problematic that they can never be used. On the contrary, I am basically – though not without reservations – in favour of the increased use of experiments and field studies within economics. Not least as an alternative to completely barren “bridge-less” axiomatic-deductive theory models. My criticism is more about aspiration levels and what we believe that we can achieve with our mediational epistemological tools and methods in the social sciences.

Many ‘experimentalists’ claim that it is easy to replicate experiments under different conditions and therefore, a fortiori, easy to test the robustness of experimental results. But is it really that easy? If, in the example given above, we run a test and find that our predictions were not correct – what can we conclude? That B “works” in China but not in the US? That B “works” in a backward agrarian society, but not in a post-modern service society? That B “worked” in the field study conducted in 2008 but not in 2012? Population selection is almost never simple. Had the problem of external validity only been about inference from sample to population, this would be no critical problem. But the really interesting inferences are those we try to make from specific labs/experiments/fields to the specific real-world situations/institutions/structures that we are interested in understanding or (causally) explaining. And then the population problem is more difficult to tackle.

Evidence-based theories and policies are highly valued nowadays. Randomization is supposed to best control for bias from unknown confounders. The received opinion is that evidence based on randomized experiments therefore is the best.

More and more economists have also lately come to advocate randomization as the principal method for ensuring valid causal inferences.

Renowned econometrician Ed Leamer has responded to these claims, maintaining that randomization is not sufficient, and that the hopes of a better empirical and quantitative macroeconomics are to a large extent illusory. Randomization – just as econometrics – promises more than it can deliver, basically because it requires assumptions that in practice are not possible to maintain:

We economists trudge relentlessly toward Asymptopia, where data are unlimited and estimates are consistent, where the laws of large numbers apply perfectly and where the full intricacies of the economy are completely revealed. But it’s a frustrating journey, since, no matter how far we travel, Asymptopia remains infinitely far away. Worst of all, when we feel pumped up with our progress, a tectonic shift can occur, like the Panic of 2008, making it seem as though our long journey has left us disappointingly close to the State of Complete Ignorance whence we began.

The pointlessness of much of our daily activity makes us receptive when the Priests of our tribe ring the bells and announce a shortened path to Asymptopia … We may listen, but we don’t hear, when the Priests warn that the new direction is only for those with Faith, those with complete belief in the Assumptions of the Path. It often takes years down the Path, but sooner or later, someone articulates the concerns that gnaw away in each of us and asks if the Assumptions are valid … Small seeds of doubt in each of us inevitably turn to despair and we abandon that direction and seek another …

Ignorance is a formidable foe, and to have hope of even modest victories, we economists need to use every resource and every weapon we can muster, including thought experiments (theory), and the analysis of data from nonexperiments, accidental experiments, and designed experiments. We should be celebrating the small genuine victories of the economists who use their tools most effectively, and we should dial back our adoration of those who can carry the biggest and brightest and least-understood weapons. We would benefit from some serious humility, and from burning our “Mission Accomplished” banners. It’s never gonna happen.

Part of the problem is that we data analysts want it all automated. We want an answer at the push of a button on a keyboard … Faced with the choice between thinking long and hard versus pushing the button, the single button is winning by a very large margin.

Let’s not add a “randomization” button to our intellectual keyboards, to be pushed without hard reflection and thought.

Especially when it comes to questions of causality, randomization is nowadays considered some kind of “gold standard”. Everything has to be evidence-based, and the evidence has to come from randomized experiments.

But just as econometrics, randomization is basically a deductive method. Given the assumptions (such as manipulability, transitivity, Reichenbach probability principles, separability, additivity, linearity, etc.), these methods deliver deductive inferences. The problem, of course, is that we will never completely know when the assumptions are right. [And although randomization may contribute to controlling for confounding, it does not guarantee it, since genuine randomness presupposes infinite experimentation and we know all real experimentation is finite. And even if randomization may help to establish average causal effects, it says nothing of individual effects unless homogeneity is added to the list of assumptions.] Real target systems are seldom epistemically isomorphic to our axiomatic-deductive models/systems, and even if they were, we would still have to argue for the external validity of the conclusions reached from within these epistemically convenient models/systems. Causal evidence generated by randomization procedures may be valid in “closed” models, but what we usually are interested in is causal evidence in the real target system we happen to live in.
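The bracketed point – that randomization at best identifies the average effect, and says nothing about individual effects without a homogeneity assumption – can be illustrated with a simulation. The effect distribution and all numbers below are assumptions made up for the sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sketch: a perfectly randomized experiment recovers the
# average treatment effect, but when individual effects are heterogeneous
# that average hides the fact that many units are actually harmed.
n = 100_000
individual_effects = rng.normal(loc=1.0, scale=2.0, size=n)  # assumed heterogeneity
treated = rng.integers(0, 2, n).astype(bool)                 # randomized assignment

baseline = rng.normal(0.0, 1.0, n)
outcome = baseline + treated * individual_effects

# Difference in means: an unbiased estimate of the *average* effect
ate_estimate = outcome[treated].mean() - outcome[~treated].mean()
share_harmed = (individual_effects < 0).mean()

print(f"estimated average treatment effect: {ate_estimate:.2f}")
print(f"share of units with negative effect: {share_harmed:.2%}")
```

Here the average effect is clearly positive while roughly three in ten units are made worse off – a distinction the randomized design alone cannot reveal.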

When does a conclusion established in population X hold for target population Y? Only under very restrictive conditions!

Ideally controlled experiments (still the benchmark even for natural and quasi experiments) tell us with certainty what causes what effects – but only given the right “closures”. Making appropriate extrapolations from (ideal, accidental, natural or quasi) experiments to different settings, populations or target systems is not easy. “It works there” is no evidence for “it will work here”. Causes deduced in an experimental setting still have to show that they come with an export-warrant to the target population/system. The causal background assumptions made have to be justified, and without licenses to export, the value of “rigorous” and “precise” methods is despairingly small.

Here I think Leamer’s “button” metaphor is appropriate. Many advocates of randomization want to have deductively automated answers to fundamental causal questions. But to apply “thin” methods we have to have “thick” background knowledge of what’s going on in the real world, and not just in (ideally controlled) experiments. Conclusions can only be as certain as their premises – and that also goes for methods based on randomized experiments.

Harvard Statistics 110 – some classic probability problems

26 May, 2013 at 11:44 | Posted in Statistics & Econometrics | Comments Off on Harvard Statistics 110 – some classic probability problems


New Keynesianism and DSGE – intellectually bankrupt enterprises

26 May, 2013 at 09:16 | Posted in Economics | Comments Off on New Keynesianism and DSGE – intellectually bankrupt enterprises

In modern neoclassical macroeconomics – Dynamic Stochastic General Equilibrium (DSGE), New Synthesis, New Classical and “New Keynesian” – variables are treated as if drawn from a known “data-generating process” that unfolds over time and of which we therefore supposedly have access to heaps of historical time-series observations.

Modern macroeconomics obviously did not anticipate the enormity of the problems that unregulated “efficient” financial markets created. Why? Because it builds on the myth of us knowing the “data-generating process” and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

In the end this is what it all boils down to. We all know that many activities, relations, processes and events are genuinely uncertain. The data do not unequivocally single out one decision as the only “rational” one. Neither the economist nor the deciding individual can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

Some macroeconomists, however, still want to be able to use their hammer. So they decide to pretend that the world looks like a nail, and pretend that uncertainty can be reduced to risk. So they construct their mathematical models on that assumption.
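A minimal simulation shows what goes wrong when the “urn” is assumed known and stable. The AR(1) process, the two regimes and every parameter value below are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical sketch: an analyst assumes a known, stable data-generating
# process, estimates its mean on a long history, and then the "urn"
# changes - a structural break the assumed DGP rules out by construction.
def ar1(n, mu, phi, sigma, rng, x0=0.0):
    """Simulate an AR(1) process with mean mu and persistence phi."""
    x = np.empty(n)
    prev = x0
    for t in range(n):
        prev = mu + phi * (prev - mu) + rng.normal(0.0, sigma)
        x[t] = prev
    return x

pre = ar1(500, mu=2.0, phi=0.8, sigma=1.0, rng=rng)   # calm regime
post = ar1(50, mu=-3.0, phi=0.8, sigma=2.0, rng=rng)  # crisis regime

est_mean = pre.mean()                 # the "known" mean, estimated in-sample
forecast_error = post.mean() - est_mean

print(f"estimated mean pre-break: {est_mean:.2f}")
print(f"average forecast error after the break: {forecast_error:.2f}")
```

The in-sample estimate looks excellent right up to the break, after which the forecasts are systematically and badly wrong – the point being that no amount of historical data on the old regime warrants treating the process as a known urn.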

Fortunately – when you’ve grown tired of the kind of gobsmacking macroeconomics apologetics produced by so-called “New Keynesian” macroeconomists – there still are some real Keynesian macroeconomists to read!

One of them, Axel Leijonhufvud, I last met a couple of years ago in Copenhagen, where we were both invited keynote speakers at the conference “Keynes 125 Years – What Have We Learned?” Axel’s speech was later published as Keynes and the crisis and contains some very profound insights and antidotes to DSGE modeling and New “Keynesianism”:

So far I have argued that recent events should force us to re-examine recent monetary policy doctrine. Do we also need to reconsider modern macroeconomic theory in general? I should think so. Consider briefly a few of the issues.

The real interest rate … The problem is that the real interest rate does not exist in reality but is a constructed variable. What does exist is the money rate of interest from which one may construct a distribution of perceived real interest rates given some distribution of inflation expectations over agents. Intertemporal non-monetary general equilibrium (or finance) models deal in variables that have no real world counterparts. Central banks have considerable influence over money rates of interest as demonstrated, for example, by the Bank of Japan and now more recently by the Federal Reserve …

The representative agent. If all agents are supposed to have rational expectations, it becomes convenient to assume also that they all have the same expectation and thence tempting to jump to the conclusion that the collective of agents behaves as one. The usual objection to representative agent models has been that it fails to take into account well-documented systematic differences in behaviour between age groups, income classes, etc. In the financial crisis context, however, the objection is rather that these models are blind to the consequences of too many people doing the same thing at the same time, for example, trying to liquidate very similar positions at the same time. Representative agent models are peculiarly subject to fallacies of composition. The representative lemming is not a rational expectations intertemporal optimising creature. But he is responsible for the fat tail problem that macroeconomists have the most reason to care about …

For many years now, the main alternative to Real Business Cycle Theory has been a somewhat loose cluster of models given the label of New Keynesian theory. New Keynesians adhere on the whole to the same DSGE modeling technology as RBC macroeconomists but differ in the extent to which they emphasise inflexibilities of prices or other contract terms as sources of short-term adjustment problems in the economy. The “New Keynesian” label refers back to the “rigid wages” brand of Keynesian theory of 40 or 50 years ago. Except for this stress on inflexibilities this brand of contemporary macroeconomic theory has basically nothing Keynesian about it.

The obvious objection to this kind of return to an earlier way of thinking about macroeconomic problems is that the major problems that have had to be confronted in the last twenty or so years have originated in the financial markets – and prices in those markets are anything but “inflexible”. But there is also a general theoretical problem that has been festering for decades with very little in the way of attempts to tackle it. Economists talk freely about “inflexible” or “rigid” prices all the time, despite the fact that we do not have a shred of theory that could provide criteria for judging whether a particular price is more or less flexible than appropriate to the proper functioning of the larger system. More than seventy years ago, Keynes already knew that a high degree of downward price flexibility in a recession could entirely wreck the financial system and make the situation infinitely worse. But the point of his argument has never come fully to inform the way economists think about price inflexibilities …

I began by arguing that there are three things we should learn from Keynes … The third was to ask whether events proved that existing theory needed to be revised. On that issue, I conclude that dynamic stochastic general equilibrium theory has shown itself an intellectually bankrupt enterprise. But this does not mean that we should revert to the old Keynesian theory that preceded it (or adopt the New Keynesian theory that has tried to compete with it). What we need to learn from Keynes, instead, are these three lessons about how to view our responsibilities and how to approach our subject.

Are economists rational?

25 May, 2013 at 15:48 | Posted in Economics | 1 Comment

Now consider what happened in November 2007. It was just one month before the Great Recession officially began …

Economists in the Survey of Professional Forecasters, a quarterly poll put out by the Federal Reserve Bank of Philadelphia, nevertheless foresaw a recession as relatively unlikely. Instead, they expected the economy to grow at a just slightly below average rate of 2.4 percent in 2008 … This was a very bad forecast: GDP actually shrank by 3.3 percent once the financial crisis hit. What may be worse is that the economists were extremely confident in their prediction. They assigned only a 3 percent chance to the economy’s shrinking by any margin over the whole of 2008 …

Indeed, economists have for a long time been much too confident in their ability to predict the direction of the economy … Their predictions have not just been overconfident but also quite poor in a real-world sense … Economic forecasters get more feedback than people in most other professions, but they haven’t chosen to correct for their bias toward overconfidence.

Macroeconomic fallacies

24 May, 2013 at 10:10 | Posted in Economics | 2 Comments

Fallacy 8

If deficits continue, the debt service would eventually swamp the fisc.

Real prospect: While viewers with alarm are fond of horror-story projections in which per capita debt would become intolerably burdensome, debt service would absorb the entire income tax revenue, or confidence is lost in the ability or willingness of the government to levy the required taxes so that bonds cannot be marketed on reasonable terms, reasonable scenarios project a negligible or even favorable effect on the fisc … A fifteen trillion debt will be far easier to deal with out of a full employment economy with greatly reduced needs for unemployment benefits and welfare payments than a five trillion debt from an economy in the doldrums with its equipment in disrepair. There is simply no problem …

Fallacy 14

Government debt is thought of as a burden handed on from one generation to its children and grandchildren.

Reality: Quite the contrary, in generational terms, (as distinct from time slices) the debt is the means whereby the present working cohorts are enabled to earn more by fuller employment and invest in the increased supply of assets, of which the debt is a part, so as to provide for their own old age. In this way the children and grandchildren are relieved of the burden of providing for the retirement of the preceding generations, whether on a personal basis or through government programs.

This fallacy is another example of zero-sum thinking that ignores the possibility of increased employment and expanded output …


These fallacious notions, which seem to be widely held in various forms by those close to the seats of economic power, are leading to policies that are not only cruel but unnecessary and even self-defeating in terms of their professed objectives …

We will not get out of the economic doldrums as long as we continue to be governed by fallacious notions that are based on false analogies, one-sided analysis, and an implicit underlying counterfactual assumption of an inevitable level of unemployment …

If a budget balancing program should actually be carried through, the above analysis indicates that sooner or later a crash comparable to that of 1929 would almost certainly result … To assure against such a disaster and start on the road to real prosperity it is necessary to relinquish our unreasoned ideological obsession with reducing government deficits, recognize that it is the economy and not the government budget that needs balancing in terms of the demand for and supply of assets, and proceed to recycle attempted savings into the income stream at an adequate rate, so that they will not simply vanish in reduced income, sales, output and employment. There is too a free lunch out there, indeed a very substantial one. But it will require getting free from the dogmas of the apostles of austerity, most of whom would not share in the sacrifices they recommend for others. Failing this we will all be skating on very thin ice.

William Vickrey Fifteen Fatal Fallacies of Financial Fundamentalism

Sweden’s equality fades away

23 May, 2013 at 16:40 | Posted in Economics | 1 Comment

[Figure: Gini coefficient for Sweden, 1980–2011]

Sweden, which has long been the shining example for liberal economists of what we should be aiming for, seems to be losing its luster.

That’s because the growth in Swedish inequality between 1985 and the late 2000s was the largest among all OECD countries, increasing by one third:

Sweden has seen the steepest increase in inequality over 15 years amongst the 34 OECD nations, with disparities rising at four times the pace of the United States, the think tank said.

Once the darling of the political left, heavy state control and wealth distribution through high taxes and generous benefits gave the country’s have-nots an enviable standard of living at the expense of the wealthiest members of society.

Although still one of the most equal countries in the world, the last two decades have seen a marked change. Market reforms have helped the economy become one of Europe’s best performers but this has Swedes wondering if their love affair with state welfare was coming to an end.

The real tipping point came in 2006 when the centre-right government swept to power, bringing an end to a Social Democratic era which stretched for most of the 20th century.

Swedes had grown increasingly weary of their high taxes and with more jobs going overseas, the new government laid out a plan to fine-tune the old welfare system. It slashed income taxes, sold state assets and tried to make it pay to work.

Spending on welfare benefits such as pensions, unemployment and incapacity assistance has fallen by almost a third to 13 percent of GDP from the early nineties, putting Sweden only just above the 11 percent OECD average.

At the other end of the spectrum, tax changes and housing market reforms have made the rich richer.

Since the mid-80s, income from savings, private pensions or rentals, jumped 10 percent for the richest fifth of the population while falling one percent for the poorest 20 percent.

David Ruccio
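The Gini coefficient quoted in such comparisons is easy to compute directly for anyone who wants to check inequality numbers themselves. The income vectors below are made-up toy data, not Swedish statistics:

```python
import numpy as np

def gini(incomes):
    """Gini coefficient of a positive income vector, via the standard
    sorted-data formula: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n,
    where x is sorted ascending and i runs from 1 to n."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    return 2.0 * np.sum(i * x) / (n * x.sum()) - (n + 1) / n

# Illustrative toy distributions: a fairly equal one and a skewed one
equal_ish = [100, 110, 120, 130, 140]
unequal = [50, 60, 80, 150, 660]

print(round(gini(equal_ish), 3))  # low: incomes are close together
print(round(gini(unequal), 3))   # high: top income dominates
```

A Gini of 0 means perfect equality and 1 means one person has everything, so "increasing by one third" in the quote above refers to a proportional rise in exactly this statistic.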

My favourite drug

23 May, 2013 at 10:27 | Posted in Varia | Comments Off on My favourite drug


IS-LM basics in less than 16 minutes

22 May, 2013 at 15:53 | Posted in Economics | 5 Comments


Lördagmorgon i P2

20 May, 2013 at 14:38 | Posted in Varia | Comments Off on Lördagmorgon i P2

In these times – when the airwaves are drowned in commercial radio’s self-satisfied verbal diarrhoea and the utterly vacuous drivel of Melodifestivalen – one has almost given up.

But there is light in the darkness! Every Saturday morning, the radio channel P2 broadcasts three restorative hours of serious music in Lördagmorgon i P2.

So seize the opportunity to start the day with a musical ear-wash and clear the ear canals of lingering musical slag. Last Saturday, for example, one could listen to music by Vassilis Tsabropoulos, Max Richter and the Emerson String Quartet. Three hours of such music calms the mind and lets hope return. Thank you, public-service radio.

And thank you, Erik Schüldt. Three hours of wonderful music and a host who actually has something to say, instead of just letting his jaw flap the whole time – what balm for the soul!

Lars E O Svensson on monetary policy and housing bubbles

19 May, 2013 at 17:32 | Posted in Economics | 1 Comment

Tomorrow is Lars E O Svensson’s last working day at the Riksbank. In this interview, Svensson – Sweden’s internationally most respected economist – looks back on his six years as deputy governor. Interesting!

Further suggestions for Krugman’s IS-LM reading list

17 May, 2013 at 14:56 | Posted in Economics | 4 Comments

The determination of investment is a four-stage process in The General Theory. Money and debts determine an “interest rate”; long-term expectations determine the yield – or expected cash flows – from capital assets and current investment (i.e., the capital stock); the yield and the interest rate enter into the determination of the price of capital assets; and investment is carried to the point where the supply price of investment output equals the capitalized value of the yield. The simple IS-LM framework violates the complexity of the investment-determining process as envisaged by Keynes …

The Hicks-Hansen model, by making explicit the interdependence of the commodity and money markets in Keynes’s thought, is a more accurate representation of his views than the simple consumption-function models. Nevertheless, because it did not explicitly consider the significance of uncertainty in both portfolio decisions and investment behavior, and because it was an equilibrium rather than a process interpretation of the model, it was an unfair and naïve representation of Keynes’s subtle and sophisticated views …

The journey through various standard models that embody elements derived from The General Theory has led us to the position that such Keynesian models are either trivial (the consumption-function models), incomplete (the IS-LM models without a labor market), inconsistent (the IS-LM models with a labor market but no real-balance effect), or indistinguishable in their results from those of older quantity-theory models (the neoclassical synthesis).

As we all know Paul Krugman is very fond of referring to and defending the old and dear IS-LM model.

John Hicks, the man who invented it in his 1937 Econometrica review of Keynes’ General Theory – “Mr. Keynes and the ‘Classics’: A Suggested Interpretation” – returned to it in a 1980 article in the Journal of Post Keynesian Economics – “IS-LM: An Explanation”. Self-critically he wrote:

I accordingly conclude that the only way in which IS-LM analysis usefully survives — as anything more than a classroom gadget, to be superseded, later on, by something better – is in application to a particular kind of causal analysis, where the use of equilibrium methods, even a drastic use of equilibrium methods, is not inappropriate …

When one turns to questions of policy, looking toward the future instead of the past, the use of equilibrium methods is still more suspect. For one cannot prescribe policy without considering at least the possibility that policy may be changed. There can be no change of policy if everything is to go on as expected – if the economy is to remain in what (however approximately) may be regarded as its existing equilibrium. It may be hoped that, after the change in policy, the economy will somehow, at some time in the future, settle into what may be regarded, in the same sense, as a new equilibrium; but there must necessarily be a stage before that equilibrium is reached …

It is well known that in later developments of Keynesian theory, the long-term rate of interest (which does figure, excessively, in Keynes’ own presentation and is presumably represented by the r of the diagram) has been taken down a peg from the position it appeared to occupy in Keynes. We now know that it is not enough to think of the rate of interest as the single link between the financial and industrial sectors of the economy; for that really implies that a borrower can borrow as much as he likes at the rate of interest charged, no attention being paid to the security offered. As soon as one attends to questions of security, and to the financial intermediation that arises out of them, it becomes apparent that the dichotomy between the two curves of the IS-LM diagram must not be pressed too hard.

Back in 1937 John Hicks said that he was building a model of John Maynard Keynes’ General Theory. He wasn’t.

What Hicks acknowledges in 1980 – confirming Minsky’s critique – is basically that his original review ignored the very core of Keynes’ theory: uncertainty. By ignoring uncertainty, he in effect helped set the train of macroeconomics on the wrong track for decades.

It’s about time that neoclassical economists – Krugman, Mankiw, and the rest – set the record straight and stopped promoting something that its creator himself admits was a failure. Why not study the real thing – The General Theory – in full, without looking the other way when it comes to non-ergodicity and uncertainty?

Flawed macroeconomic models

17 May, 2013 at 13:39 | Posted in Economics, Theory of Science & Methodology | 3 Comments

If we had begun our reform efforts with a focus on how to make our economy more efficient and more stable, there are other questions we would have naturally asked; other questions we would have posed. Interestingly, there is some correspondence between these deficiencies in our reform efforts and the deficiencies in the models that we as economists often use in macroeconomics.

•First, the importance of credit
We would, for instance, have asked what the fundamental roles of the financial sector are, and how we can get it to perform those roles better. Clearly, one of the key roles is the allocation of capital and the provision of credit, especially to small and medium-sized enterprises, a function which it did not perform well before the crisis, and which arguably it is still not fulfilling well.

This might seem obvious. But a focus on the provision of credit has neither been at the centre of policy discourse nor of the standard macro-models. We have to shift our focus from money to credit. In any balance sheet, the two sides are usually going to be very highly correlated. But that is not always the case, particularly in the context of large economic perturbations. In these, we ought to be focusing on credit. I find it remarkable the extent to which there has been an inadequate examination in standard macro models of the nature of the credit mechanism. There is, of course, a large microeconomic literature on banking and credit, but for the most part, the insights of this literature have not been taken on board in standard macro-models …

•Second, stability
As I have already noted, in the conventional models (and in the conventional wisdom) market economies were stable. And so it was perhaps not a surprise that fundamental questions about how to design more stable economic systems were seldom asked. We have already touched on several aspects of this: how to design economic systems that are less exposed to risk or that generate less volatility on their own.

One of the necessary reforms, but one not emphasised enough, is the need for more automatic stabilisers and fewer automatic destabilisers – not only in the financial sector, but throughout the economy. For instance, the movement from defined benefit to defined contribution systems may have led to a less stable economy …

•Third, distribution
Distribution matters as well – distribution among individuals, between households and firms, among households, and among firms. Traditionally, macroeconomics focused on certain aggregates, such as the average ratio of leverage to GDP. But that and other average numbers often don’t give a picture of the vulnerability of the economy.

In the case of the financial crisis, such numbers didn’t give us warning signs. Yet it was the fact that a large number of people at the bottom couldn’t make their debt payments that should have tipped us off that something was wrong …

•Fourth, policy frameworks
Flawed models not only lead to flawed policies, but also to flawed policy frameworks.

Should monetary policy focus just on short-term interest rates? In monetary policy, there is a tendency to think that the central bank should only intervene in the setting of the short-term interest rate. They believe ‘one intervention’ is better than many. Since at least eighty years ago, with the work of Frank Ramsey, we know that focusing on a single instrument is not generally the best approach.

The advocates of the ‘single intervention’ approach argue that it is best, because it least distorts the economy. Of course, the reason we have monetary policy in the first place – the reason why government acts to intervene in the economy – is that we don’t believe that markets on their own will set the right short-term interest rate. If we did, we would just let free markets determine that interest rate. The odd thing is that while just about every central banker would agree we should intervene in the determination of that price, not everyone is so convinced that we should strategically intervene in others, even though we know from the general theory of taxation and the general theory of market intervention that intervening in just one price is not optimal.

Once we shift the focus of our analysis to credit, and explicitly introduce risk into the analysis, we become aware that we need to use multiple instruments. Indeed, in general, we want to use all the instruments at our disposal. Monetary economists often draw a division between macro-prudential, micro-prudential, and conventional monetary policy instruments. In our book Towards a New Paradigm in Monetary Economics, Bruce Greenwald and I argue that this distinction is artificial. The government needs to draw upon all of these instruments, in a coordinated way …

Of course, we cannot ‘correct’ every market failure. The very large ones, however – the macroeconomic failures – will always require our intervention. Bruce Greenwald and I have pointed out that markets are never Pareto efficient if information is imperfect, if there are asymmetries of information, or if risk markets are imperfect. And since these conditions are always satisfied, markets are never Pareto efficient. Recent research has highlighted the importance of these and other related constraints for macroeconomics – though again, the insights of this important work have yet to be adequately integrated either into mainstream macroeconomic models or into mainstream policy discussions.

•Fifth, price versus quantitative interventions
These theoretical insights also help us to understand why the old presumption among some economists that price interventions are preferable to quantity interventions is wrong. There are many circumstances in which quantity interventions lead to better economic performance.

A policy framework that has become popular in some circles argues that so long as there are as many instruments as there are objectives, the economic system is controllable, and the best way of managing the economy in such circumstances is to have an institution responsible for one target and one instrument. (In this view, central banks have one instrument – the interest rate – and one objective – inflation. We have already explained why limiting monetary policy to one instrument is wrong.)

Drawing such a division may have advantages from an agency or bureaucratic perspective, but from the point of view of managing macroeconomic policy – focusing on growth, stability and distribution, in a world of uncertainty – it makes no sense. There has to be coordination across all the issues and among all the instruments that are at our disposal. There needs to be close coordination between monetary and fiscal policy. The natural equilibrium that would arise out of having different people controlling different instruments and focusing on different objectives is, in general, not anywhere near what is optimal in achieving overall societal objectives. Better coordination – and the use of more instruments – can, for instance, enhance economic stability.

Joseph Stiglitz

K. in memoriam (private)

16 May, 2013 at 05:39 | Posted in Varia | Comments Off on K. in memoriam (private)

Today, exactly twenty years ago, the unimaginable happened.

Some people say time heals all wounds. I know that’s not true. Some wounds never heal. You just learn to live with the scars.

No blogging today.

Top Economics Blogs

15 May, 2013 at 19:13 | Posted in Varia | 1 Comment

Objectively, blogs are subjective, so coming up with a list of the top 10, top 15, or top 100 economics blogs is no easy undertaking. Economics bloggers vary widely from individual students and professors sharing their thoughts on current events, new research or the state of the profession to the blogging superstars like Greg Mankiw, Paul Krugman and Tyler Cowen.

Instead of trying to rank the blogs, we are simply going to list some of our favourites. These are the blogs to which we turn when looking for interesting, informative, and offbeat articles to share. All of these blogs provide some insight into the economics profession and we at INOMICS enjoy going through them and sharing the most interesting articles each day with our readers, especially on Twitter.

Aguanomics
Angry Bear
Askblog
Becker-Posner Blog
Cafe Hayek
Calculated Risk
Carpe Diem
Cheap Talk
Confessions of a Supply Side Liberal
Conversable Economist
Core Economics
Curious Cat
Don’t worry, I’m an economist
Econbrowser
EconLog
Econometrics Beat
Economic Incentives
Economic Logic
Economist’s View
Economists Do It With Models
Economix
Economonitor
Econospeak
Ed Dolan’s Econ Blog
Evolving Economics
Ezra Klein’s Wonkblog
Felix Salmon
Freakonomics
Greg Mankiw’s Blog
Lars P. Syll
Macro and Other Market Musings
Mainly Macro
Marginal Revolution
Market Design
Modeled Behavior
Naked Capitalism
NEP-HIS Blog
New Economic Perspectives
Noahpinion
Overcoming Bias
Paul Krugman
Real Time Economics
Real World Economics Review
Steve Keen’s Debtwatch
The Market Monetarist
Thoughts on Economics
Tim Harford
Vox EU
Worthwhile Canadian Initiative




The Grossman-Stiglitz paradox

15 May, 2013 at 13:59 | Posted in Economics, Theory of Science & Methodology | Comments Off on Grossman-Stiglitz-paradoxen

In the economics of information, Hayek’s “The Use of Knowledge in Society” (American Economic Review 1945) and Grossman & Stiglitz’s “On the Impossibility of Informationally Efficient Markets” (American Economic Review 1980) are two classics. But while Hayek’s article is often invoked by economists of a New Austrian persuasion, neoclassical economists rarely have anything to say about Grossman & Stiglitz’s article. I do not think that is a coincidence.

One of the most crucial assumptions made in orthodox economic theory is that economic agents possess complete information at no cost. This is an assumption heterodox economists have long questioned.

Neoclassical economists are of course aware that the assumption of perfect information is unrealistic in most contexts. They nevertheless defend its use in their formal models by arguing that real economies, where information is not quite so perfect, do not differ in any decisive way from their models. What the economics of information has shown, however, is that the results obtained in models built on perfect information are not robust. Even a small degree of informational imperfection has a decisive impact on the economy’s equilibrium. This had, admittedly, been demonstrated before, for example by transaction cost theory. Its representatives, however, rather drew the conclusion that if one only takes these costs into account in the analysis, the standard results remain largely intact. The economics of information shows convincingly that this is not the case. In area after area it has been shown that economic analysis becomes seriously misleading if one disregards asymmetries in information and the costs of acquiring it. The picture of markets (and the need for possible public intervention) becomes substantially different from that of models built on the standard assumption of complete information.

The Grossman-Stiglitz paradox shows that if markets were efficient – if prices fully reflected available information – no agent would have an incentive to acquire the information that prices are supposed to be based on. If, on the other hand, no agent is informed, it would pay for an agent to acquire information. Consequently, market prices cannot incorporate all relevant information about the goods traded on the market. This is, of course, deeply disturbing to (usually market-apologetic) neoclassical economists. Hence the “silence” surrounding the article and its paradox!

Grossman & Stiglitz – just as Frydman & Goldberg later do in Imperfect Knowledge Economics (Princeton University Press 2007) – take Lucas et consortes as their point of departure. As far as I can judge, their assessment of the information paradigm on which rational expectations rest also coincides completely with my own: from the standpoint of relevance and realism, it is nonsense on stilts. The Grossman-Stiglitz paradox strikes like an axe blow at the neoclassical root. That is why neoclassical economists so readily “forget” it.

Hayek argued – see e.g. chapters 1 and 3 of Kunskap, konkurrens och rättvisa (Ratio 2003) – that markets and their price mechanisms only have a decisive role to play when information is not costless. This was one of the main ingredients of his critique of the idea of central planning, since costless information would in principle make planning and markets equivalent (as so often, the New Austrian picture of markets is considerably more relevant and realistic than the assorted neoclassical Bourbaki constructions à la Debreu et consortes).

The crux of “efficient markets” is – as Grossman & Stiglitz show so brilliantly – that, strictly theoretically, they can only exist when information is costless. When information is not free, prices cannot perfectly reflect the available information (nota bene – this holds whether or not asymmetries are present).

To my mind, the most interesting question raised by Grossman & Stiglitz is what good theories built on “noise-free” markets are to us, when they are not only hopelessly unrealistic but also turn out to be theoretically inconsistent.
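The logic of the paradox fits in a few lines of code. The sketch below is a stylized illustration with made-up payoff numbers – it is not Grossman & Stiglitz’s 1980 model – but it captures the self-defeating character of fully revealing prices:

```python
# Stylized Grossman-Stiglitz logic; the payoff numbers are hypothetical,
# not the parameters of their 1980 model.

COST = 0.2  # cost of acquiring the information

def informed_gain(lam, full_gain=1.0):
    """Gross trading gain of an informed agent when a fraction `lam`
    of all agents is informed: at lam = 1 prices fully reveal the
    information and the gain is competed away."""
    return full_gain * (1.0 - lam)

def net_advantage(lam):
    """Informed payoff minus uninformed payoff (who pay nothing)."""
    return informed_gain(lam) - COST

# Fully revealing prices (lam = 1) cannot be an equilibrium:
print(net_advantage(1.0))   # negative: no one would pay for information
# Nobody informed (lam = 0) cannot be an equilibrium either:
print(net_advantage(0.0))   # positive: buying information pays
# An interior equilibrium, where both choices do equally well,
# requires prices that only partially reveal the information:
lam_star = 1.0 - COST
print(net_advantage(lam_star))  # ~0
```

Whatever the numbers, the two boundary cases pull in opposite directions, so “prices fully reflect all available information” cannot hold once information is costly – which is the paradox.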

Nonsense is nonsense, even when wrapped in pretty boxes.


Lacrimosa

12 May, 2013 at 19:54 | Posted in Varia | Comments Off on Lacrimosa


Van den Budenmayer

12 May, 2013 at 19:37 | Posted in Varia | Comments Off on Van den Budenmayer


Though I speak with the tongues of angels

11 May, 2013 at 23:15 | Posted in Varia | Comments Off on Though I speak with the tongues of angels


Though I speak with the tongues of angels,
If I have not love…
My words would resound with but a tinkling cymbal.
And though I have the gift of prophesy…
And understand all mysteries…
and all knowledge…
And though I have all faith
So that I could remove mountains,
If I have not love…
I am nothing.
Love is patient, full of goodness;
Love tolerates all things,
Aspires to all things,
Love never dies,
while the prophecies shall be done away,
tongues shall be silenced,
knowledge shall fade…
thus then shall linger only
faith, hope, and love…
but the greatest of these…
is love.

On tour

11 May, 2013 at 18:40 | Posted in Varia | Comments Off on On tour

Touring again. Conference in Stockholm and a guest appearance in the parliament. Regular blogging will be resumed next week.


100% Wrong on 90%

10 May, 2013 at 16:42 | Posted in Economics | 1 Comment


(h/t Simsalablunder)

Ronald Coase – still making sense at 102 (!)

10 May, 2013 at 13:49 | Posted in Economics | Comments Off on Ronald Coase – still making sense at 102 (!)

In the 20th century, economics consolidated as a profession; economists could afford to write exclusively for one another. At the same time, the field experienced a paradigm shift, gradually identifying itself as a theoretical approach of economization and giving up the real-world economy as its subject matter.

But because it is no longer firmly grounded in systematic empirical investigation of the working of the economy, it is hardly up to the task … Today, a modern market economy with its ever-finer division of labor depends on a constantly expanding network of trade. It requires an intricate web of social institutions to coordinate the working of markets and firms across various boundaries. At a time when the modern economy is becoming increasingly institutions-intensive, the reduction of economics to price theory is troubling enough. It is suicidal for the field to slide into a hard science of choice, ignoring the influences of society, history, culture, and politics on the working of the economy.

Ronald Coase, Saving Economics from the Economists

Reinhart-Rogoff and the EU’s norm-based austerity

10 May, 2013 at 10:44 | Posted in Economics, Politics & Society | 1 Comment

It is usually called norm-based policy – the notion that there are rules of thumb that help finance ministers and central-bank governors make the right decisions. Economics is, after all, too difficult and complicated for ordinary politicians, not to mention ignorant citizens. Better, then, to let norms automatically make the right, the necessary, decisions. Rules, moreover, are insensitive to public opinion, unlike politicians, who all too often try to honour the election promises they have made to their voters.

I am joking, but the subject is serious: over the past twenty years, norms and rules have come to dominate actual economic policy. One of the clearest signs is the EU’s Maastricht Treaty of 1992, which laid the foundation for the common currency, the euro. To join the euro, countries must meet so-called convergence criteria: inflation must be below 2 per cent, the budget deficit no larger than 3 per cent of GDP, and government debt must not exceed 60 per cent of GDP. Note that nothing is said about how high unemployment may be …

It is in this context that one can understand the attention paid to a short article by the economists Carmen Reinhart and Kenneth Rogoff – “Growth in a Time of Debt”, published in the American Economic Review in 2010. Reinhart and Rogoff claim that the limit of responsible policy lies at a historically established “debt threshold” of 90 per cent of GDP …

A few weeks ago, however, an examination by Thomas Herndon at the University of Massachusetts Amherst revealed that Reinhart and Rogoff’s “debt threshold” rests on two errors: first, when summarizing their statistics they omitted several countries that were in their database – a mistake the authors have admitted – and second, they gave every episode of growth or decline equal weight regardless of how long it lasted … This biases the entire result in a conservative direction, towards lower debt and more austerity …

The serious problem with Reinhart and Rogoff’s flawed article is not that it would lead to immediate, draconian austerity across the board.

No, what is truly worrying about their now-exposed calculations is that they legitimize norm-based policy, as if it were actually possible to establish once and for all – for all countries, for all times – what constitutes a reasonable level of debt or inflation. Reinhart and Rogoff thereby lend support to all those who want to shrink the room for manoeuvre of elected assemblies and elected politicians. The norms and rules of thumb thus turn out to be political in the highest degree – not objective, scientifically established truths.

That is something for the crisis-ridden euro countries to bear in mind.

Kenneth Hermele
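The second of the two errors – weighting every episode equally regardless of its length – is easy to illustrate. The following toy example uses invented growth numbers, not Reinhart and Rogoff’s actual dataset:

```python
# Toy illustration of the episode-weighting issue (invented numbers,
# not Reinhart and Rogoff's actual data).

episodes = {
    "country A": [-2.0],                     # one bad high-debt year
    "country B": [2.0, 2.0, 2.0, 2.0, 2.0],  # five steady high-debt years
}

# Episode-weighted mean: each country counts once, however long its episode.
country_means = [sum(g) / len(g) for g in episodes.values()]
episode_weighted = sum(country_means) / len(country_means)

# Year-weighted mean: each country-year counts once.
all_years = [g for years in episodes.values() for g in years]
year_weighted = sum(all_years) / len(all_years)

print(episode_weighted)  # 0.0: the single bad year carries half the weight
print(year_weighted)     # ~1.33: the five good years count in full
```

A single short slump can thus drag the episode-weighted average far below the year-weighted one – precisely the conservative bias towards lower debt and more austerity described above.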

Bob Pollin responds to Reinhart and Rogoff

9 May, 2013 at 09:27 | Posted in Economics | 1 Comment


(h/t Jan Milch)

Niall Ferguson – the peevish hole digger

8 May, 2013 at 22:27 | Posted in Varia | 2 Comments

You would think that a Harvard historian would know about the First Law of Holes: When in a hole, stop digging.

But Harvard historian Niall Ferguson dug his own hole of trouble a bit deeper, in “An Open Letter To The Harvard Community” posted at the Harvard Crimson’s website on Tuesday. In the letter, Ferguson apologizes profusely for recent dumb statements he made about the legendary economist John Maynard Keynes. In the process, Ferguson makes several more dumb statements.

In case you missed it, Ferguson last week declared that Keynes’ homosexuality had left him childless, making Keynes care nothing about the future and leading him to suggest that governments should spend their way out of economic downturns, which is why he is history’s greatest monster. Suck it, logic! At last conservatives had a Unified Theory Of Gay to explain all that has gone wrong with the world for the past 80 years or so.

Of course, most oxygen-breathing creatures immediately recoiled at the 100 or so varieties of stupid in Ferguson’s statement and reacted with fury and scorn. Like Ron Burgundy after he jumped into the Kodiak bear pit to save Veronica Corningstone, Ferguson immediately regretted his decision. In a statement on his website on Saturday, he offered an “Unqualified Apology,” admitting his comments were “doubly stupid” — not only do childless people care about the future, but Keynes’s wife had suffered a miscarriage, he pointed out. I would add that gay people can also have children, which makes Ferguson’s comments at least trebly stupid. But anyway, Ferguson’s apology was indeed appropriately unqualified.

But he just couldn’t shut up about it. He seems to have been baited into commenting further after Berkeley economist Brad DeLong and others noted that Ferguson had previously commented on Keynes’ sexuality, back in 1995. Ferguson’s “Open Letter” now addresses those claims. While purporting to be an apology, it is not unqualified at all. Instead, it turns into an exercise in peevishness and self-defensiveness.

Mark Gongloff

Niall Ferguson’s apology is an epic fail

8 May, 2013 at 13:21 | Posted in Varia | 17 Comments

Ferguson’s “unreserved” apology is nothing of the kind. He does not apologize for his past efforts to smear Keynes. He tries to make it appear that the latest smear was a one-off, unthinking quip. It was neither. He apologizes for being “insensitive.” What could that mean in this context where he is supposedly agreeing that what he said was false – not true but “insensitive?” Ferguson simply made up the part about Keynes and “poetry.” Ferguson’s spreading of homophobic tropes isn’t “insensitive” in this context – it’s false and it is nasty.

Ferguson apologizes for forgetting that Keynes’ wife suffered a miscarriage. But what is the relevance of that fact to Ferguson’s smear or apology? Is he saying that the pregnancy falsifies his implicit smear that Keynes wasn’t “man enough” to have sex with a woman? Did he think gay or bisexual males were sterile or impotent? Why did he emphasize his claim that Keynes married “a ballerina?”

Why didn’t Ferguson apologize for his substantive misstatements? As a historian who has read Keynes he knows that Keynes’ quip about “in the long run we are all dead” had absolutely nothing to do with claiming that the longer-term health of the economy was unimportant or a matter in which Keynes was uninterested.

William K. Black

Added 8/5: And as if this wasn’t enough, Ferguson now has an article in The Harvard Crimson where he accuses his critics of being “among the most insidious enemies of academic freedom.” Read it yourself, but it’s in my view even more pathetic than his original statements. Who can take this guy seriously anymore? I for one certainly can’t.

Modern econometrics – a critical realist critique (wonkish)

7 May, 2013 at 14:47 | Posted in Statistics & Econometrics | 2 Comments

Neoclassical economists often hold the view that criticisms of econometrics are the conclusions of sadly misinformed and misguided people who dislike and do not understand much of it. This is really a gross misapprehension. To be careful and cautious is not the same as to dislike. And as any perusal of the mathematical-statistical and philosophical works of people like for example Nancy Cartwright, Chris Chatfield, Hugo Keuzenkamp, John Maynard Keynes or Tony Lawson would show, the critique is put forward by respected authorities. I would argue, against “common knowledge”, that they do not misunderstand the crucial issues at stake in the development of econometrics. Quite the contrary. They know them all too well – and are not satisfied with the validity and philosophical underpinning of the assumptions made for applying its methods.

Let me try to do justice to the critical arguments on the logic of probabilistic induction and briefly elaborate – mostly from a philosophy of science vantage point – on some insights critical realism gives us on econometrics and its methodological foundations.

The methodological difference between an empiricist and a deductivist approach can also clearly be seen in econometrics. The ordinary deductivist “textbook approach” views the modeling process as foremost an estimation problem, since one (at least implicitly) assumes that the model provided by economic theory is a well-specified and “true” model. The more empiricist, general-to-specific methodology (often identified as “the LSE approach”) on the other hand views models as theoretically and empirically adequate representations (approximations) of a data generating process (DGP). Diagnostic tests (mostly some variant of the F-test) are used to ensure that the models are “true” – or at least “congruent” – representations of the DGP. The modeling process is here seen more as a specification problem, where poor diagnostics may indicate a misspecification requiring re-specification of the model. The standard objective is to identify models that are structurally stable and valid across a large time-space horizon. The DGP is not seen as something we already know, but rather something we discover in the process of modeling it. Considerable effort is put into testing to what extent the models are structurally stable and generalizable over space and time.

Although I have sympathy for this approach in general, there are still some unsolved “problematics” with its epistemological and ontological presuppositions. There is, e.g., an implicit assumption that the DGP fundamentally has an invariant property, and that models that are structurally unstable simply have not managed to get hold of that invariance. But, as Keynes already maintained, one cannot just presuppose or take for granted that kind of invariance. It has to be argued for and justified. Grounds have to be given for viewing reality as satisfying conditions of model-closure. It is as if the lack of closure that shows up in the form of structurally unstable models could somehow be solved by searching for more autonomous and invariable “atomic uniformity”. But whether reality is “congruent” with this analytical prerequisite has to be argued for, not simply taken for granted.
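The invariance problem can be made concrete with a small simulation. The sketch below (pure Python, with an assumed toy DGP) shows that when the data generating process shifts mid-sample, the seemingly “stable” parameter estimated on the pooled sample describes neither regime:

```python
# A minimal simulation of structural instability; the DGP and its
# parameters are hypothetical.
import random

random.seed(1)

def simulate(beta, n):
    """Generate n observations from y = beta * x + small noise."""
    data = []
    for _ in range(n):
        x = random.uniform(0, 1)
        data.append((x, beta * x + random.gauss(0, 0.01)))
    return data

def ols_slope(data):
    """Slope of a no-intercept OLS fit: sum(x*y) / sum(x*x)."""
    return sum(x * y for x, y in data) / sum(x * x for x, _ in data)

regime1 = simulate(beta=1.0, n=200)   # DGP before a structural break
regime2 = simulate(beta=3.0, n=200)   # DGP after the break

print(ols_slope(regime1))             # close to 1.0
print(ols_slope(regime2))             # close to 3.0
print(ols_slope(regime1 + regime2))   # close to 2.0: fits neither regime
```

The pooled estimate looks perfectly well-behaved, yet it corresponds to no mechanism that ever generated the data – which is the sense in which structural instability cannot be estimated away by searching for a deeper invariance.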

Even granted that closures come in degrees, we should not compromise on ontology. Some methods simply introduce improper closures, closures that make the disjuncture between models and real world target systems inappropriately large. “Garbage in, garbage out.”

Underlying the search for these immutable “fundamentals” lies the implicit view of the world as consisting of material entities with their own separate and invariable effects. These entities are thought of as separable and additive causes, thereby making it possible to infer complex interaction from knowledge of individual constituents with limited independent variety. But, again, whether this is a justified analytical procedure cannot be answered without confronting it with the nature of the objects the models are supposed to describe, explain or predict. Keynes himself thought it generally inappropriate to apply the “atomic hypothesis” to such an open and “organic entity” as the real world. As far as I can see, these are still appropriate strictures that all econometric approaches have to face. Grounds for believing otherwise have to be provided by the econometricians.

Trygve Haavelmo, the “father” of modern probabilistic econometrics, wrote that he and other econometricians could not “build a complete bridge between our models and reality” by logical operations alone, but finally had to make “a non-logical jump” [1943:15]. A part of that jump consisted in that econometricians “like to believe … that the various a priori possible sequences would somehow cluster around some typical time shapes, which if we knew them, could be used for prediction” [1943:16]. But since we do not know the true distribution, one has to look for the mechanisms (processes) that “might rule the data” and that hopefully persist so that predictions may be made. Of the possible hypotheses on different time sequences (“samples” in Haavelmo’s somewhat idiosyncratic vocabulary), most had to be ruled out a priori “by economic theory”, although “one shall always remain in doubt as to the possibility of some … outside hypothesis being the true one” [1943:18].

To Haavelmo and his modern followers, econometrics is not really in the truth business. The explanations we can give of economic relations and structures based on econometric models are “not hidden truths to be discovered” but rather our own “artificial inventions”. Models are consequently perceived not as true representations of DGP, but rather instrumentally conceived “as if”-constructs. Their “intrinsic closure” is realized by searching for parameters showing “a great degree of invariance” or relative autonomy and the “extrinsic closure” by hoping that the “practically decisive” explanatory variables are relatively few, so that one may proceed “as if … natural limitations of the number of relevant factors exist” [Haavelmo 1944:29].

Haavelmo seems to believe that persistence and autonomy can only be found at the level of the individual, since individual agents are seen as the ultimate determinants of the variables in the economic system.

But why the “logically conceivable” really should turn out to be the case is difficult to see – at least if we are not to be satisfied with sheer hope. As we have already noted, Keynes objected to unargued and unjustified assumptions that complex structures in an open system are reducible to those of individuals. In real economies it is unlikely that we find many “autonomous” relations and events. And one could of course, with Keynes and from a critical realist point of view, also raise the objection that invoking a probabilistic approach to econometrics presupposes, e.g., that we are able to describe the world in terms of risk rather than genuine uncertainty.

And that is exactly what Haavelmo [1944:48] does: “To make this a rational problem of statistical inference we have to start out by an axiom, postulating that every set of observable variables has associated with it one particular ‘true’, but unknown, probability law.”

But this “trick of our own” of just assigning “a certain probability law to a system of observable variables” can, no more than hoping, build a firm bridge between model and reality. Treating phenomena as if they essentially were stochastic processes is not the same as showing that they essentially are stochastic processes. As Hicks [1979:120-21] so neatly puts it:

Things become more difficult when we turn to time-series … The econometrist, who works in that field, may claim that he is not treading on very shaky ground. But if one asks him what he is really doing, he will not find it easy, even here, to give a convincing answer … [H]e must be treating the observations known to him as a sample of a larger “population”; but what population? … I am bold enough to conclude, from these considerations that the usefulness of “statistical” or “stochastic” methods in economics is a good deal less than is now conventionally supposed. We have no business to turn to them automatically; we should always ask ourselves, before we apply them, whether they are appropriate to the problem in hand.

And as if this wasn’t enough, one could also seriously wonder what kind of “populations” these statistical and econometric models ultimately are based on. Why should we as social scientists – and not as pure mathematicians working with formal-axiomatic systems without the urge to confront our models with real target systems – unquestioningly accept Haavelmo’s “infinite population”, Fisher’s “hypothetical infinite population”, von Mises’s “collective” or Gibbs’s ”ensemble”?

Of course one could treat our observational or experimental data as random samples from real populations. I have no problem with that. But modern (probabilistic) econometrics does not content itself with populations of that kind. Instead it creates imaginary populations of “parallel universes” and assumes that our data are random samples from them.

But this is actually nothing but hand-waving! And it is inadequate for real science. As David Freedman writes in Statistical Models and Causal Inference:

With this approach, the investigator does not explicitly define a population that could in principle be studied, with unlimited resources of time and money. The investigator merely assumes that such a population exists in some ill-defined sense. And there is a further assumption, that the data set being analyzed can be treated as if it were based on a random sample from the assumed population. These are convenient fictions … Nevertheless, reliance on imaginary populations is widespread. Indeed regression models are commonly used to analyze convenience samples … The rhetoric of imaginary populations is seductive because it seems to free the investigator from the necessity of understanding how data were generated.
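Freedman’s point can be made concrete with a small simulation. The sketch below is purely illustrative (the population, selection rule and all numbers are invented assumptions of mine): a regression on a genuine random sample from a finite, enumerable population recovers the population slope, while the same regression on a convenience sample, where units self-select on the outcome, does not.

```python
import numpy as np

rng = np.random.default_rng(0)

# A finite, fully enumerable "real" population in which y = 2*x + noise.
N = 100_000
x = rng.normal(size=N)
y = 2.0 * x + rng.normal(size=N)

# Genuine random sample: the textbook sampling assumption actually holds.
idx = rng.choice(N, size=2_000, replace=False)
slope_random = np.polyfit(x[idx], y[idx], 1)[0]

# Convenience sample: units with large outcomes are more likely to be
# observed, so selection depends on y itself (classic self-selection).
observed = (y + rng.normal(size=N)) > 1.0
slope_convenience = np.polyfit(x[observed], y[observed], 1)[0]

print(f"random sample slope:      {slope_random:.2f}")       # close to 2
print(f"convenience sample slope: {slope_convenience:.2f}")  # attenuated
```

The estimator itself is identical in both cases; only the fiction that the data “are based on a random sample from the assumed population” differs, and that fiction is doing all the work.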

Econometricians should know better than to treat random variables, probabilities and expected values as anything other than things that, strictly speaking, pertain only to statistical models. If they want us to take the leap of faith from mathematics into the empirical world in applying the theory, they have to really argue for and justify this leap by showing that those neat mathematical assumptions (which, to make things worse, are often left implicit, e.g. independence and additivity) do not collide with the ugly reality. The set of mathematical assumptions is no validation in itself of the adequacy of the application.

Rigour and elegance in the analysis do not make up for the gap between reality and model. It is the distribution of the phenomena in itself, and not its estimation, that ought to be at the centre of the stage. A crucial ingredient of any economic theory that wants to use probabilistic models should be a convincing argument for the view that “there can be no harm in considering economic variables as stochastic variables” [Haavelmo 1943:13]. In most cases no such arguments are given.

Of course you are entitled – like Haavelmo and his modern probabilistic followers – to express a hope “at a metaphysical level” that there are invariant features of reality to uncover and that also show up at the empirical level of observations as some kind of regularities.

But is it a justifiable hope? I have serious doubts. The kinds of regularities you may hope to find in society are not to be found in the domain of surface phenomena, but rather at the level of causal mechanisms, powers and capacities. Persistence and generality have to be sought at an underlying deep level. Most econometricians do not want to visit that playground. They are content with setting up theoretical models that give us correlations and at best “mimic” existing causal properties.

We have to accept that reality has no “correct” representation in an economic or econometric model. There is no such thing as a “true” model that can capture an open, complex and contextual system in a set of equations with parameters stable over space and time, and exhibiting invariant regularities. To just “believe”, “hope” or “assume” that such a model possibly could exist is not enough. It has to be justified in relation to the ontological conditions of social reality.

In contrast to those who want to give up on (fallible, transient and transformable) “truth” as a relation between theory and reality and content themselves with “truth” as a relation between a model and a probability distribution, I think it is better to really scrutinize if this latter attitude is feasible. To abandon the quest for truth and replace it with sheer positivism would indeed be a sad fate of econometrics. It is more rewarding to stick to truth as a regulatory ideal and keep on searching for theories and models that in relevant and adequate ways express those parts of reality we want to describe and explain.

Econometrics may be an informative tool for research. But if its practitioners do not investigate and make an effort to provide a justification for the credibility of the assumptions on which they erect their building, it will not fulfill its tasks. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics will continue to consider its ultimate argument a mixture of rather unhelpful metaphors and metaphysics. Maintaining that economics is a science in the “true knowledge” business, I remain a skeptic of the pretences and aspirations of econometrics. So far, I cannot really see that it has yielded very much in terms of relevant, interesting economic knowledge.

The marginal return on its ever higher technical sophistication in no way makes up for the lack of serious under-labouring of its deeper philosophical and methodological foundations that Keynes already complained about. The rather one-sided emphasis on usefulness, and its concomitant instrumentalist justification, cannot hide the fact that neither Haavelmo nor the legions of probabilistic econometricians following in his footsteps give supportive evidence for considering it “fruitful to believe” in the possibility of treating unique economic data as the observable results of random drawings from an imaginary sampling of an imaginary population. After having analyzed some of its ontological and epistemological foundations, I cannot but conclude that econometrics on the whole has not delivered “truth”. And I doubt if that has ever been the intention of its main protagonists.

Our admiration for technical virtuosity should not blind us to the fact that we have to have a more cautious attitude towards probabilistic inference of causality in economic contexts. Science should help us penetrate to “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts” [Keynes 1971-89 vol XVII:427]. We should look out for causal relations, but econometrics can never be more than a starting point in that endeavour, since econometric (statistical) explanations are not explanations in terms of mechanisms, powers, capacities or causes. Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that there are other variables – of vital importance, and although perhaps unobservable and non-additive not necessarily epistemologically inaccessible – that were not considered for the model. The variables that were included can hence never be guaranteed to be more than potential causes, not real causes.
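That measured variables may be no more than potential causes is easy to illustrate with a toy example (the confounding structure and all numbers are invented for the illustration): an unobserved variable z drives both x and y, and a regression of y on x alone reports a sizeable “effect” of x even though x has no causal influence on y at all.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Unobserved common cause z; x has NO causal effect on y.
z = rng.normal(size=n)
x = z + rng.normal(scale=0.5, size=n)
y = 2.0 * z + rng.normal(scale=0.5, size=n)

# Regressing y on the observable x alone yields a strong "effect" of x.
slope_naive = np.polyfit(x, y, 1)[0]

# Conditioning on the (here conveniently observable) confounder removes it.
X = np.column_stack([x, z, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
slope_adjusted = coef[0]
```

In the simulation we can check the answer because we built the mechanism ourselves; in an open social system the confounder may be unobservable, and no amount of fit statistics on the naive regression will reveal that its “cause” is spurious.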

A rigorous application of econometric methods in economics really presupposes that the phenomena of our real world economies are ruled by stable causal relations between variables. A perusal of the leading econom(etr)ic journals shows that most econometricians still concentrate on fixed parameter models and that parameter-values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption one, however, has to convincingly establish that the targeted acting causes are stable and invariant so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.
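A minimal sketch of what such “bridging” presupposes (all parameter values are invented for the illustration): a slope estimated in one context predicts well there, but once the underlying mechanism shifts, the exported fixed-parameter model fails badly.

```python
import numpy as np

rng = np.random.default_rng(2)

# Estimation context: the relation y = 2*x + noise looks perfectly stable.
x_a = rng.normal(size=5_000)
y_a = 2.0 * x_a + rng.normal(scale=0.5, size=5_000)
coef = np.polyfit(x_a, y_a, 1)  # fitted slope and intercept

# New context: the underlying mechanism has shifted (slope is now 0.5),
# but the fixed-parameter model is exported unchanged.
x_b = rng.normal(size=5_000)
y_b = 0.5 * x_b + rng.normal(scale=0.5, size=5_000)

rmse_in_context = np.sqrt(np.mean((y_a - np.polyval(coef, x_a)) ** 2))
rmse_exported = np.sqrt(np.mean((y_b - np.polyval(coef, x_b)) ** 2))
```

Nothing in the estimation-context data warns of the break: the in-sample fit is excellent right up to the moment the parameters stop being parametric.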

This is a more fundamental and radical problem than the celebrated “Lucas critique” has suggested. The question is not whether deep parameters, absent on the macro-level, exist in “tastes” and “technology” on the micro-level. It goes deeper. Real-world social systems are not governed by stable causal mechanisms or capacities. This is the criticism that Keynes [1951(1926): 232-33] launched against econometrics and inferential statistics as early as the 1920s:

The atomic hypothesis which has worked so splendidly in Physics breaks down in Psychics. We are faced at every turn with the problems of Organic Unity, of Discreteness, of Discontinuity – the whole is not equal to the sum of the parts, comparisons of quantity fails us, small changes produce large effects, the assumptions of a uniform and homogeneous continuum are not satisfied. Thus the results of Mathematical Psychics turn out to be derivative, not fundamental, indexes, not measurements, first approximations at the best; and fallible indexes, dubious approximations at that, with much doubt added as to what, if anything, they are indexes or approximations of.

The kinds of laws and relations that econom(etr)ics has established are laws and relations about entities in models that presuppose causal mechanisms to be atomistic and additive. When causal mechanisms operate in real-world social target systems, they do so only in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so (as a rule) only because we engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately that also makes most of the achievements of econometrics – like most of contemporary endeavours of economic theoretical modeling – rather useless.
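Keynes’s “the whole is not equal to the sum of the parts” has a simple statistical counterpart. In the invented sketch below, the outcome is a pure interaction of two factors: the best additive model, no matter how well estimated, explains essentially none of it, and each factor on its own looks irrelevant.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)

# A "whole" that is not the sum of its parts: a pure interaction effect.
y = x1 * x2

# The best possible additive (atomistic) model: y ≈ b1*x1 + b2*x2 + c.
X = np.column_stack([x1, x2, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
r2_additive = 1.0 - np.var(y - X @ coef) / np.var(y)

# Once the joint term is allowed for, everything is explained.
X_full = np.column_stack([x1, x2, x1 * x2, np.ones(n)])
coef_full, *_ = np.linalg.lstsq(X_full, y, rcond=None)
r2_full = 1.0 - np.var(y - X_full @ coef_full) / np.var(y)
```

Here the right interaction term can be written down because we invented the mechanism; in open social systems the combinations are unknown and shifting, which is exactly the point of the critique.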


Freedman, David (2010), Statistical Models and Causal Inference. Cambridge: Cambridge University Press.

Haavelmo, Trygve (1943), Statistical testing of business-cycle theories. The Review of Economics and Statistics 25:13-18.

– (1944), The probability approach in econometrics. Supplement to Econometrica 12:1-115.

Hicks, John (1979), Causality in Economics. London: Basil Blackwell.

Keynes, John Maynard (1951 (1926)), Essays in Biography. London: Rupert Hart-Davis.

– (1971-89) The Collected Writings of John Maynard Keynes, vol. I-XXX. D E Moggridge & E A G Robinson (eds), London: Macmillan.
