The Lucas critique comes back with a vengeance in DSGE models

25 Apr, 2018 at 10:08 | Posted in Economics, Statistics & Econometrics | 3 Comments

Both approaches to DSGE macroeconometrics (VAR and Bayesian) have evident vulnerabilities, which substantially derive from how parameters are handled in the technique. In brief, parameters from formally elegant models are calibrated in order to obtain simulated values that reproduce some stylized fact and/or some empirical data distribution, thus relating the underlying theoretical model and the observational data. But there are at least three main respects in which this practice fails.

First of all, DSGE models have substantial difficulties in taking account of many important mechanisms that actually govern real economies, for example, institutional constraints like the tax system, thereby reducing DSGE power in policy analysis … In the attempt to deal with this serious problem, various parameter constraints on the model policy block are provided. They derive from institutional analysis and reflect policymakers’ operational procedures. However, such model extensions, which are intended to reshape its predictions to reality and to deal with the underlying optimization problem, prove to be highly inflexible, turning DSGE into a “straitjacket tool” … In particular, the structure imposed on DSGE parameters entails various identification problems, such as observational equivalence, underidentification, and partial and weak identification.

These problems affect both empirical DSGE approaches. Fundamentally, they are ascribable to the likelihoods to estimate. In fact, the range of structural parameters that generate impulse response functions and data distributions fitting very closely to the true ones does include model specifications that show very different features and welfare properties. So which is the right model specification (i.e., parameter set) to choose? As a consequence, reasonable estimates do not derive from the informative contents of models and data, but rather from the ancillary restrictions that are necessary to make the likelihoods informative, which are often arbitrary. Thus, after Lucas’s super-exogeneity critique has been thrown out the door, it comes back through the window.

Roberto Marchionatti & Lisa Sella
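
The identification problem described in the passage above is easy to see in miniature. Here is a minimal illustrative sketch (a toy example, not anything estimated in the paper) in which two ‘structural’ parameters enter the reduced form only through their product, so that very different structural stories generate exactly the same impulse responses and the same likelihood: observational equivalence in its starkest form.

```python
# Purely illustrative toy model -- not an estimated DSGE.
import numpy as np

def impulse_response(a, b, horizon=20):
    """Toy 'structural' model whose reduced form is an AR(1):
    y_t = rho * y_{t-1} + e_t, with rho = a * b.
    Only the product a*b is identified by the data."""
    rho = a * b
    return rho ** np.arange(horizon)

irf_1 = impulse_response(a=0.90, b=1.00)  # one structural interpretation
irf_2 = impulse_response(a=0.45, b=2.00)  # a very different one

print(np.max(np.abs(irf_1 - irf_2)))      # prints 0.0: observationally equivalent
```

Only additional restrictions (fixing one of the parameters at some value, say) would make the likelihood informative about a and b separately, and such restrictions are, as the passage notes, often arbitrary.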

Our admiration for technical virtuosity should not blind us to the fact that we have to maintain a cautious attitude towards probabilistic inferences in economic contexts. We should look out for causal relations, but econometrics can never be more than a starting point in that endeavour, since econometric (statistical) explanations are not explanations in terms of mechanisms, powers, capacities or causes. Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that there are other variables – of vital importance and, although perhaps unobservable and non-additive, not necessarily epistemologically inaccessible – that were not considered for the model. The variables that were included can hence never be guaranteed to be more than potential, rather than real, causes. A rigorous application of econometric methods in economics really presupposes that the phenomena of our real-world economies are ruled by stable causal relations between variables. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.

This is a more fundamental and radical problem than the celebrated ‘Lucas critique’ has suggested. The question is not whether deep parameters, absent at the macro level, exist in ‘tastes’ and ‘technology’ at the micro level. It goes deeper. Real-world social systems are not governed by stable causal mechanisms or capacities.

The kinds of laws and relations that econom(etr)ics has established are laws and relations about entities in models that presuppose causal mechanisms being atomistic and additive. When causal mechanisms operate in real-world social systems, they mostly do so in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so (as a rule) only because we have engineered them for that purpose. Outside man-made ‘nomological machines’ they are rare, or even non-existent. Unfortunately, that also makes most of the achievements of econometrics rather useless.

Both the ‘Lucas critique’ and the ‘Keynes critique’ of econometrics argued that it was inadmissible to project history onto the future. Consequently, an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different ‘variables’ was not enough. If they could not get at the causal structure that generated the data, they were not really ‘identified’. Lucas himself drew the conclusion that the solution to the problem of unstable relations was to construct models with clear microfoundations, in which forward-looking, optimizing individuals and robust, deep, behavioural parameters remain stable even under changes in economic policy. As yours truly has argued in a couple of posts — e.g. here and here — this, however, is a dead end.

3 Comments

  1. Marchionatti’s and Sella’s paper in earlier working paper form can be found at: wp_21_2015%20%281%29.pdf

    I don’t get how commentators can continue to refer to DSGE models as “Walrasian” or even “Neo-Walrasian”. DSGE models, it seems to me, are a kind of quasi-Marshallian/partial-equilibrium construction in which an equilibrium is presumed to exist. The only part of the process that appears to have GE allusions is the optimization routine, but it seems to me to be more like an exercise in mathematics than economics. It is as if applying the GE moniker adds cachet and credibility to the analysis.

  2. As a mathematician who comments on economics, and who read Keynes’s Treatise before his economics, I think Kingsley’s remarks are right. I know it has been alleged that Keynes recanted his Treatise, but the evidence I have seen for this seems very thin, to say the least.

    I came to Keynes because, through analysis, simulations and engagement with practitioners in various fields, I had arrived at the notion that social systems (among others) do tend to display something like ‘punctuated equilibria’, only to find that Keynes had beaten me to it by quite some time.

    In my view it may in principle be quite possible, most of the time, to extrapolate much as people do (although I might quibble about some details). The problem is in the use of the term ‘prediction’. It may be considered a prediction based on the assumption that we have fully understood the situation and that nothing fundamental changes. But this is not to say that we might not be surprised from time to time, either by our ignorance or by innovation. Indeed, the ‘paradox of analysis’ is that if you tell a policy maker that something bad will happen, they change their policy; and if you reassure them that things are just fine, they often get complacent and things tend to go awry sooner or later. But while you can’t predict what will actually happen, policy makers often find it useful to know, within bounds and subject to various caveats, which direction things are headed in. Keynes obviously had a much finer appreciation of these things, and there is much nuance one might add, but I see his views as broadly compatible with mine.

    I see much economic theory as somewhat like mathematics: The problem is not that it is wrong or not potentially useful, but that people forget that while some of it might (being charitable) be the best possible theory, it is still only theory and needs to be applied with care and understanding. (Which in economics would seem to imply a radical change to economic teaching: I agree with that!)

  3. “economic regularities…are rare, or even non-existent”
    .
    Keynes disagreed with this. The cornerstone assumption of his “General Theory” was that the marginal propensity to consume is fairly stable, positive and less than unity.
    Chapter 8, Part I: “Psychological characteristics of human nature and social practices and institutions … are unlikely to undergo a material change over a short period of time except in abnormal or revolutionary circumstances.”
    Chapter 8, Part III: There is “a fundamental psychological law, upon which we are entitled to depend with great confidence both a priori from our knowledge of human nature and from the detailed facts of experience, that men are disposed, as a rule and on the average, to increase their consumption as their income increases, but not by as much as the increase in their income”.
    Chapter 18, Part III (i): “Experience shows that some such psychological law must actually hold…[otherwise] … there would be a violent instability in the price-level”.
    .
    Chapter 10, Part V: “It should not be difficult to compile a chart of the marginal propensity to consume at each stage of a trade cycle from the statistics (if they were available) of aggregate income and aggregate investment at successive dates.” Using “precarious” national accounts data for the USA in the 1930s, disregarding some contrary computations, and averaging over a few carefully selected years, Keynes estimated that the multiplier was “probably fairly stable in the neighbourhood of 2.5”. The adjective “probably” indicates that there is no fundamental uncertainty about this matter.
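
    The arithmetic behind that figure is the standard multiplier relation k = 1/(1 - c): a multiplier of about 2.5 corresponds to a marginal propensity to consume c of about 0.6, so the stability of the one stands or falls with the stability of the other.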

