DSGE models are missing the point

20 November, 2017 at 13:41 | Posted in Economics | 2 Comments

In a recent attempt to defend DSGE modelling, Lawrence Christiano, Martin Eichenbaum and Mathias Trabandt have to admit that DSGE models have failed to predict financial crises. The reason they put forward for this is that the models did not “integrate the shadow banking system into their analysis.” That certainly is true — but the DSGE problems go much deeper than that:

A typical modern approach to writing a paper in DSGE macroeconomics is as follows:

o to establish “stylized facts” about the quantitative interrelationships of certain macroeconomic variables (e.g. moments of the data such as variances, autocorrelations, covariances, …) that have hitherto not been jointly explained;

o to write down a DSGE model of an economy subject to a defined set of shocks that aims to capture the described interrelationships; and

o to show that the model can “replicate” or “match” the chosen moments when it is fed with stochastic shocks generated by the assumed shock process …
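
In code, this moment-matching step boils down to comparing a handful of summary statistics. A minimal Python sketch — where the AR(1) ‘model economy’, its parameter values and the target moments are all invented for illustration, not taken from any actual DSGE paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(rho, sigma, n):
    """Simulate a toy 'model economy': x_t = rho * x_{t-1} + eps_t."""
    x = np.zeros(n)
    eps = rng.normal(0.0, sigma, n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + eps[t]
    return x

def moments(x):
    """The kind of 'stylized facts' a DSGE paper targets."""
    x = x - x.mean()
    variance = x.var()
    autocorr = np.corrcoef(x[:-1], x[1:])[0, 1]
    return variance, autocorr

# Pretend these moments were measured from actual macro data.
target_var, target_ac = 1.09, 0.85

sim = simulate_ar1(rho=0.85, sigma=0.55, n=200_000)
var_hat, ac_hat = moments(sim)
print(f"model variance {var_hat:.2f} vs target {target_var}")
print(f"model autocorr {ac_hat:.2f} vs target {target_ac}")
```

Korinek’s three objections apply directly to this little exercise: nothing in it dictates which moments to target, how close ‘close enough’ is, or what the match costs along the dimensions that are never reported.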

However, the test imposed by matching DSGE models to the data is problematic in at least three respects:

First, the set of moments chosen to evaluate the model is largely arbitrary …

Second, for a given set of moments, there is no well-defined statistic to measure the goodness of fit of a DSGE model or to establish what constitutes an improvement in such a framework …

Third, the evaluation is complicated by the fact that, at some level, all economic models are rejected by the data … In addition, DSGE models frequently impose a number of restrictions that are in direct conflict with micro evidence. If a model has been rejected along some dimensions, then a statistic that measures the goodness-of-fit along other dimensions is meaningless …

Focusing on the quantitative fit of models also creates powerful incentives for researchers (i) to introduce elements that bear little resemblance to reality for the sake of achieving a better fit (ii) to introduce opaque elements that provide the researcher with free (or almost free) parameters and (iii) to introduce elements that improve the fit for the reported moments but deteriorate the fit along other unreported dimensions.

Albert Einstein observed that “not everything that counts can be counted, and not everything that can be counted counts.” DSGE models make it easy to offer a wealth of numerical results by following a well-defined set of methods (that requires one or two years of investment in graduate school, but is relatively straightforward to apply thereafter). There is a risk for researchers to focus too much on numerical predictions of questionable reliability and relevance that absorb a lot of time and effort rather than focusing on deeper conceptual questions that are of higher relevance for society.

Anton Korinek

Great essay, showing that ‘rigorous’ and ‘precise’ DSGE models cannot be considered anything other than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence of that kind has ever been presented.

To reply to Korinek’s devastating critique — as Christiano et al. do — with pie-in-the-sky formulations such as ‘young cutting-edge researchers having promising extensions of the model in the pipeline,’ or by claiming that “there is no credible alternative,” cannot be the right scientific attitude. No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, DSGE models do not push economic science forward one single millimetre if they do not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they say nothing about real-world economies.

Proving things ‘rigorously’ in DSGE models is at most a starting point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.

Mainstream economists think there is a gain from the DSGE style of modelling in its capacity to offer some kind of structure around which to organise discussions. To me, that sounds more like a religious theoretical-methodological dogma, where one paradigm rules in divine hegemony. That’s not progress. That’s the death of economics as a science.

As Korinek argues, using DSGE models “creates a bias towards models that have a well-behaved ergodic steady state.” Since we know that most real-world processes do not follow an ergodic distribution, this is, to say the least, problematic. To understand real-world ‘non-routine’ decisions and unforeseeable changes in behaviour, stationary probability distributions are of no avail. In a world full of genuine uncertainty — where real historical time rules the roost — the probabilities that ruled the past are not those that will rule the future. Imposing invalid probabilistic assumptions on the data makes all DSGE models statistically misspecified.
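
The non-ergodicity point can be made concrete in a few lines. For a random walk — a toy stand-in for a non-stationary economic process, chosen purely for illustration — the ensemble variance keeps growing with time, and the statistics of any single realization’s past are no stable guide to its future:

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 2000, 2000

# One realization per row: x_t = x_{t-1} + eps_t (a random walk)
walk = rng.normal(size=(n_paths, n_steps)).cumsum(axis=1)

# Cross-sectional (ensemble) variance grows roughly linearly with t:
# the distribution that "ruled the past" is not the one ruling the future.
print(walk[:, 99].var(), walk[:, 1999].var())   # roughly 100 vs 2000

# And the time-series variance computed along a single path is itself a
# random quantity: different histories give wildly different answers.
single_path_vars = walk.var(axis=1)
print(single_path_vars.min(), single_path_vars.max())
```

For a stationary, ergodic process the time average along one path and the ensemble average across paths would agree; here they cannot, which is exactly why frequencies estimated from one historical record do not license probabilistic forecasts.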

Advocates of DSGE modelling want to have deductively automated answers to fundamental causal questions. But to apply ‘thin’ methods we have to have ‘thick’ background knowledge of what’s going on in the real world, and not in idealized models. Conclusions can only be as certain as their premises — and that also applies to the quest for causality and forecasting predictability in DSGE models.

If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized that have to match reality, not the other way around. The modelling convention used when constructing DSGE models makes it impossible to fully incorporate things that we know are of paramount importance for understanding modern economies — such as income and wealth inequality, asymmetries of power and information, and liquidity preference, just to mention a few.

Given all these fundamental problems for the use of these models and their underlying methodology, it is beyond understanding how the DSGE approach has come to be the standard approach in ‘modern’ macroeconomics. DSGE models are based on assumptions profoundly at odds with what we know about real-world economies. That also makes them little more than overconfident story-telling devoid of real scientific value. Macroeconomics would do much better with more substantive diversity and plurality.


Pickwickian economics

19 November, 2017 at 11:23 | Posted in Economics | 1 Comment

Mill provides a good illustration of the tension between fallibilism and anti-foundationalism. Mill’s first principles are supposed to be empirical and not necessary truths, but for economics to be an empirical subject at all, they have to be beyond genuine doubt, since they provide the only empirical element in an otherwise deductive system. The certainty that Mill claims for the results of scientific economics are purchased with deep uncertainty about the significance of those results – in particular, how important economic outcomes are relative to countervailing noneconomic outcomes. And the modern economist or philosopher surely would regard Mill’s economics as empirical only in a Pickwickian sense, as Mill does not leave open the possibility that anything could count as evidence against its first principles.

Kevin Hoover

Since modern mainstream economists do not even bother to argue that their foundational assumptions are either necessary or “beyond genuine doubt,” mainstream economics is arguably even more Pickwickian than John Stuart Mill’s methodological ruminations …

Why Krugman and Stiglitz are no real alternatives to mainstream economics

17 November, 2017 at 20:38 | Posted in Economics | 8 Comments

Little in the discipline has changed in the wake of the crisis. Mirowski thinks that this is at least in part a result of the impotence of the loyal opposition — those economists such as Joseph Stiglitz or Paul Krugman who attempt to oppose the more viciously neoliberal articulations of economic theory from within the camp of neoclassical economics. Though Krugman and Stiglitz have attacked concepts like the efficient markets hypothesis … Mirowski argues that their attempt to do so while retaining the basic theoretical architecture of neoclassicism has rendered them doubly ineffective.

First, their adoption of the battery of assumptions that accompany most neoclassical theorizing — about representative agents, treating information like any other commodity, and so on — makes it nearly impossible to conclusively rebut arguments like the efficient markets hypothesis. Instead, they end up tinkering with it, introducing a nuance here or a qualification there … Stiglitz’s and Krugman’s arguments, while receiving circulation through the popular press, utterly fail to transform the discipline.

Paul Heideman

Despite all their radical rhetoric, Krugman and Stiglitz are — where it really counts — nothing but die-hard mainstream neoclassical economists. Just like Milton Friedman, Robert Lucas or Greg Mankiw.

The only economic analysis that Krugman and Stiglitz — like other mainstream economists — accept is the one that takes place within the analytic-formalistic modelling strategy that makes up the core of mainstream economics. All models and theories that do not live up to the precepts of the mainstream methodological canon are pruned away. You’re free to take your models — not using (mathematical) models at all is considered totally unthinkable — and apply them to whatever you want, as long as you do it within the mainstream approach and its modelling strategy. If you do not follow this particular mathematical-deductive analytical formalism, you’re not even considered to be doing economics. ‘If it isn’t modelled, it isn’t economics.’

That isn’t pluralism.

That’s a methodological reductionist straitjacket.

So, even though we have seen a proliferation of models, it has almost exclusively taken place as a kind of axiomatic variation within the standard ‘urmodel’, which is always used as a self-evident benchmark.

Krugman and Stiglitz want to convey the impression that the proliferation of economic models during the last twenty to thirty years is a sign of great diversity and an abundance of new ideas.

But, again, it’s not, really, that simple.

Although mainstream economists like to portray mainstream economics as an open and pluralistic ‘let a hundred flowers bloom,’ in reality it is rather ‘plus ça change, plus c’est la même chose.’

Applying closed analytical-formalist-mathematical-deductivist-axiomatic models, built on atomistic-reductionist assumptions, to a world assumed to consist of atomistic, isolated entities is a sure recipe for failure when the real world is known to be an open system where complex and relational structures and agents interact. Validly deducing things in models of that kind doesn’t much help us understand or explain what is taking place in the real world we happen to live in. Validly deducing things from patently unreal assumptions — assumptions that we all know are purely fictional — makes most of the modelling exercises pursued by mainstream economists rather pointless. It’s simply not the stuff that real understanding and explanation in science is made of. Just telling us that the plethora of mathematical models that make up modern economics “expand the range of the discipline’s insights” is nothing short of hand-waving.

No matter how many thousands of technical working papers or models mainstream economists come up with, as long as they are just ‘wildly inconsistent’ axiomatic variations of the same old mathematical-deductive ilk, they will not take us one single inch closer to relevant and usable means for furthering our understanding and possible explanations of real economies.

Getting the rabbit into the neoclassical hat

15 November, 2017 at 21:30 | Posted in Economics | Comments Off on Getting the rabbit into the neoclassical hat

In public, including in the training of economists, Neoclassical economics usually reads its models backwards. This gives the illusion that they show the behaviour of individual economic units determining sets of equilibrium values for markets and for whole economies. It hides the fact that these models have been constructed not by investigating the behaviour of individual agents, but rather by analysing the requirements of achieving a certain macro state, that is, a market or general equilibrium. It is the behaviour found to be logically consistent with these hypothetical macro states that is prescribed for the individual agents, rather than the other way around.

This macro-led analysis, this derivation of the micro from a macro assumption, is and always has been the standard analytical procedure of theory construction for the Neoclassical narrative. Sometimes, for pedagogical reasons, authors call attention to how the “individualist” rabbit really gets into the Neoclassical hat. For example, consider the following passage from a once widely used introduction to economics.
“For the purpose of our theory, we want the preference ranking to have certain properties, which give it a particular, useful structure. We build these properties up by making a number of assumptions, first about the preference-indifference relation itself, and then about some aspects of the preference ranking to which it gives rise” (Gravelle and Rees 1981, p. 56, emphasis added).
In other words, it is not the behaviour of the individual agents that determines the model’s overall structure, nor even the structure of the preference ranking. Instead it is the macro requirement for a particular structure which dictates the behaviour attributed to the individual agents. The “purpose” of this “particular, useful structure” is to rationalize the macro “conclusion” assumed at the beginning of the exercise. The resulting model shows micro phenomena determining macro phenomena, whereas, in fact, it is the starting point of the macro structure that has determined the behaviour of the model’s micro elements. Likewise, “rationality” becomes something defined to meet the exigencies of a desired conclusion.

Yes, indeed, ‘rationality’ in mainstream neoclassical economics is defined and used in rather suspect ways. Take, for example, the rational expectations assumption. Rational expectations in the mainstream economists’ world imply that relevant distributions have to be time-independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality, it is straining one’s beliefs to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons.

In the world of the rational expectations hypothesis, we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that is also the reason why allowing for cases where agents make ‘predictable errors’ in mainstream macro models doesn’t take us any closer to a relevant and realist depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations than childish roulette and urn analogies.

‘Rigorous’ and ‘precise’ mainstream neoclassical models cannot be considered anything other than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence of that kind has ever been presented.

No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, these models do not push economic science forward one single millimetre if they do not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they say nothing about real-world economies.

On Econs and Humans

13 November, 2017 at 18:10 | Posted in Economics | 3 Comments

Many years ago, Thaler was hosting dinner for some guests (other then-young economists) and put out a large bowl of cashew nuts to nibble on with the first bottle of wine. Within a few minutes it became clear that the bowl of nuts was going to be consumed in its entirety, and that the guests might lack sufficient appetite to enjoy all the food that was to follow. Leaping into action, Thaler grabbed the bowl of nuts, and (while sneaking a few more nuts for himself) removed the bowl to the kitchen, where it was put out of sight.

When he returned, the guests thanked him for removing the nuts. The conversation immediately turned to the theoretical question of how they could possibly be happy about the fact that there was no longer a bowl of nuts in front of them … In economics (and in ordinary life), a basic principle is that you can never be made worse off by having more options, because you can always turn them down. Before Thaler removed the nuts the group had the choice of whether to eat the nuts or not – now they didn’t. In the land of Econs, it is against the law to be happy about this!

The atomic hypothesis and the limits of econometrics

11 November, 2017 at 16:36 | Posted in Economics | 1 Comment

Our admiration for technical virtuosity should never blind us to the fact that we have to have a cautious attitude towards probabilistic inferences in economic contexts. Science should help us disclose the causal forces behind apparent ‘facts.’ We should look out for causal relations, but econometrics can never be more than a starting point in that endeavour, since econometric (statistical) explanations are not explanations in terms of mechanisms, powers, capacities or causes.

Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that there are other variables – of vital importance, and although perhaps unobservable and non-additive, not necessarily epistemologically inaccessible – that were not considered for the model. The variables that were included can hence never be guaranteed to be more than potential causes, not real causes.

A rigorous application of econometric methods in economics really presupposes that the phenomena of our real-world economies are ruled by stable causal relations between variables. A perusal of the leading econom(etr)ic journals shows that most econometricians still concentrate on fixed-parameter models, and that parameter values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption, however, one has to convincingly establish that the targeted acting causes are stable and invariant, so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.
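
The exportability assumption is easy to put under stress in a toy simulation. Below, an invented data-generating process whose slope differs between two ‘spatio-temporal contexts’; a parameter estimated in the first context and treated as fixed mispredicts badly in the second (all numbers are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Context A: y = 2.0 x + noise.  Context B: the "same" relation, slope 0.5.
x_a = rng.uniform(0, 10, 500)
y_a = 2.0 * x_a + rng.normal(0, 1, 500)
x_b = rng.uniform(0, 10, 500)
y_b = 0.5 * x_b + rng.normal(0, 1, 500)

# Estimate the parameter in context A ...
slope_a = np.polyfit(x_a, y_a, 1)[0]

# ... and export it to context B, as fixed-parameter practice presupposes.
rmse_b = np.sqrt(np.mean((y_b - slope_a * x_b) ** 2))
print(f"slope estimated in A: {slope_a:.2f}")   # close to 2.0
print(f"prediction RMSE in B: {rmse_b:.2f}")    # far above the noise level of 1
```

The estimate is perfectly ‘rigorous’ within context A; nothing in the estimation itself warrants carrying it across the bridge to context B.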

Real world social systems are seldom governed by stable causal mechanisms or capacities. As Keynes wrote in My early beliefs:

The atomic hypothesis which has worked so splendidly in Physics breaks down in Psychics. We are faced at every turn with the problems of Organic Unity, of Discreteness, of Discontinuity – the whole is not equal to the sum of the parts, comparisons of quantity fail us, small changes produce large effects, the assumptions of a uniform and homogeneous continuum are not satisfied. Thus the results of Mathematical Psychics turn out to be derivative, not fundamental, indexes, not measurements, first approximations at the best; and fallible indexes, dubious approximations at that, with much doubt added as to what, if anything, they are indexes or approximations of.

The kinds of ‘laws’ and relations that econometrics has established are laws and relations between entities in models that presuppose causal mechanisms to be atomistic and additive. When causal mechanisms operate in real-world social target systems, they do so only in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so (as a rule) only because we engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately, that also makes most of the achievements of econometrics – like most of the contemporary endeavours of mainstream economic theoretical modelling – rather useless.

Macroeconomics — religion or science?

10 November, 2017 at 16:51 | Posted in Economics | Comments Off on Macroeconomics — religion or science?

Macroeconomists build theories codified by systems of equations. We use those equations to explain patterns in economic data. Unlike experimental sciences, chemistry and physics for example, macroeconomists cannot easily experiment. That does not mean that we cannot challenge existing theories, but it makes it much harder. Like astronomers waiting for the next supernova to explode, macroeconomists must wait for big recessions or large bouts of stagflation to help us sort one theory from another …

Macroeconomists can explain past data relatively well. But we are not very good at explaining new events and our theories are always evolving. In that sense, economics is a science. The way that our models are allowed to evolve is controlled by a group of high-priests who maintain doctrinal purity. In that sense, economics is a religion. The religious aspect is important during normal times, when we have not recently experienced a big event. At other times, after we observe an economic supernova, the grip of the high-priests becomes counterproductive and it is a fertile time to explore ideas that the priesthood considers heretical. Now is one of those times.

Roger Farmer

Big — and not so big — ideas in macroeconomics

9 November, 2017 at 22:44 | Posted in Economics | 1 Comment

In Athreya’s world, and that of a large part of the academic macroeconomics profession, macroeconomics does indeed begin with Walras, and the first modern development in the field was the formalization of Walras’ model by the economic theorists Arrow, Debreu and McKenzie in the 1950s. The big subsequent development is the integration of growth theory into the static ADM framework to generate the modern dynamic stochastic general equilibrium (DSGE) models. Keynes’ 1936 ‘essay’ is treated as a curiosity, too vague and wordy to permit any real analysis.

This has the odd effect that many of the leading Keynesians of the postwar era … are given respectful cites for their work on growth theory, even as … their macroeconomic work is dismissed as being too silly even to be refuted … Real macro (that is, Walrasian GE applied to issues like the business cycle) begins, in this analysis, with Robert Lucas in the late 1970s.

All this gives me a bit more insight into the apparent convergence in macroeconomics in the early years of this century, and its breakdown in 2008. The New Keynesians understood themselves as having met their New Classical colleagues halfway, with DSGE models which were Keynesian in character, at least in the short run, while meeting the demands for rigorous microeconomic foundations. Meanwhile, the New Classical school were quietly snickering whenever Keynes’ name was mentioned, but were prepared to concede the possible existence of largely unspecified market “imperfections”, whose only role in practice was to justify a policy of inflation targeting.

The crisis that erupted in 2008 destroyed this spurious consensus. On any kind of Keynesian view, New or Old, the combination of high unemployment and zero interest rates implied that the economy had been driven into a Keynesian liquidity trap, with a need for fiscal stimulus on a massive scale. By contrast, for the New Classicals, a disaster of this kind could only be the result of government failure (or, in places where they still mattered, the pernicious actions of trade unions). Since this was implausible, New Classical economists have generally preferred to reassert dogma without too much attention to facts.

Broadly speaking, as far as academic macroeconomics is concerned, DSGE has won the day, not so much by force of argument as by maintaining control of the criteria for publication of journal articles in the field: it’s OK to assume full employment, and ignore inflation, but not to omit rigorous microfoundations for your model …

The result is that there is almost zero intersection between Big Ideas in Macroeconomics and what I would think of as macroeconomics. It’s not so much that I think Athreya is wrong as that we are talking past each other. As Charles Goodhart said of DSGE, Athreya’s version of macro excludes everything in which I am interested.

John Quiggin

The ‘tiny little problem’ with Chicago economics

7 November, 2017 at 17:25 | Posted in Economics | 3 Comments

Every dollar of increased government spending must correspond to one less dollar of private spending. Jobs created by stimulus spending are offset by jobs lost from the decline in private spending. We can build roads instead of factories, but fiscal stimulus can’t help us to build more of both. This form of “crowding out” is just accounting, and doesn’t rest on any perceptions or behavioral assumptions.

John Cochrane

And the tiny little problem? It’s utterly and completely wrong!

What Cochrane is reiterating here is nothing but Say’s law, basically saying that savings equal investments and that if the state increases investments, private investments have to come down (‘crowding out’). As an accounting identity there is, of course, nothing to object to in the law, but as such it is also totally uninteresting from an economic point of view. As some of my Swedish forerunners — Gunnar Myrdal and Erik Lindahl — stressed more than 80 years ago, it is really a question of ex-ante and ex-post adjustments. And as further stressed by a famous English economist at about the same time, what happens when ex-ante savings and investments differ is that we basically get output adjustments: GDP changes and thereby makes savings and investments equal ex-post. And this, nota bene, says nothing at all about the success or failure of fiscal policies!
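
The ex-ante/ex-post mechanism is simple enough to iterate by hand. A toy income-expenditure loop in Python — the propensity to consume and the investment figure are invented for illustration, not estimates of anything:

```python
def output_adjustment(investment, mpc=0.8, y0=100.0, tol=1e-9):
    """Iterate Y = C(Y) + I until income settles at its ex-post level."""
    y = y0
    while True:
        y_next = mpc * y + investment   # consumption out of income, plus planned investment
        if abs(y_next - y) < tol:
            return y_next
        y = y_next

investment = 30.0
y_star = output_adjustment(investment)
ex_post_saving = (1 - 0.8) * y_star     # saving out of the adjusted income

print(round(y_star, 6))          # 150.0 — the multiplier 1/(1 - 0.8) times investment
print(round(ex_post_saving, 6))  # 30.0 — output, not thrift, made S equal I
```

Nothing in the identity S ≡ I decided this outcome; income did the adjusting, which is why the identity alone cannot settle whether fiscal stimulus crowds out private spending.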

Government borrowing is supposed to “crowd out” private investment.

The current reality is that on the contrary, the expenditure of the borrowed funds (unlike the expenditure of tax revenues) will generate added disposable income, enhance the demand for the products of private industry, and make private investment more profitable. As long as there are plenty of idle resources lying around, and monetary authorities behave sensibly, (instead of trying to counter the supposedly inflationary effect of the deficit) those with a prospect for profitable investment can be enabled to obtain financing. Under these circumstances, each additional dollar of deficit will in the medium long run induce two or more additional dollars of private investment. The capital created is an increment to someone’s wealth and ipso facto someone’s saving. “Supply creates its own demand” fails as soon as some of the income generated by the supply is saved, but investment does create its own saving, and more. Any crowding out that may occur is the result, not of underlying economic reality, but of inappropriate restrictive reactions on the part of a monetary authority in response to the deficit.

William Vickrey

In a lecture on the US recession, Robert Lucas gave an outline of what the new classical school of macroeconomics today thinks on the latest downturns in the US economy and its future prospects.

Lucas starts by showing that real US GDP has grown at an average yearly rate of 3 percent since 1870, with one big dip during the Depression of the 1930s and a big – but smaller – dip in the recent recession.

After stating his view that the US recession that started in 2008 was basically caused by a run for liquidity, Lucas then goes on to discuss the prospect of recovery from where the US economy is today, maintaining that past experience would suggest an “automatic” recovery, if the free market system is left to repair itself to equilibrium unimpeded by social welfare activities of the government.

As could be expected there is no room for any Keynesian type considerations on eventual shortages of aggregate demand discouraging the recovery of the economy. No, as usual in the new classical macroeconomic school’s explanations and prescriptions, the blame game points to the government and its lack of supply-side policies.

Lucas is convinced that what might arrest the recovery are higher taxes on the rich, greater government involvement in the medical sector and tougher regulation of the financial sector. But – if left to run its course unimpeded by European-type welfare state activities – the free market will fix it all.

In a rather cavalier manner – without a hint of argument or presentation of empirical facts – Lucas dismisses even the possibility of a shortfall of demand. For someone who already 30 years ago proclaimed Keynesianism dead – “people don’t take Keynesian theorizing seriously anymore; the audience starts to whisper and giggle to one another” – this is of course only what could be expected. Demand considerations are simply ruled out on whimsical theoretical-ideological grounds, much like we have seen other neo-liberal economists do over and over again in their attempts to explain away the fact that the latest economic crises show how the markets have failed to deliver. If there is a problem with the economy, the true cause has to be the government.

Chicago economics is a dangerous pseudo-scientific zombie ideology that ultimately relies on the poor having to pay for the mistakes of the rich. Trying to explain business cycles in terms of rational expectations has failed blatantly. Maybe it would be asking too much of freshwater economists like Lucas and Cochrane to concede that, but it’s still a fact that ought to be embarrassing. My rational expectation is that 30 years from now, no one in economics will know who Robert Lucas or John Cochrane was. John Maynard Keynes, on the other hand, will still be known as one of the masters of economics.


4 November, 2017 at 22:33 | Posted in Economics | 5 Comments

Rumour has it that yours truly is celebrating his 60th birthday visiting his favourite city. Regular blogging to be resumed next week.
