The nodal point of the macroeconomics debate

16 Jul, 2012 at 14:28 | Posted in Economics, Theory of Science & Methodology | 3 Comments

This summer both Oxford professor Simon Wren-Lewis and Nobel laureate Paul Krugman have had interesting posts up discussing modern macroeconomics and its alleged need for microfoundations.

Most “modern” mainstream neoclassical macroeconomists more or less subscribe to the view that microfoundations have somehow led to better models, enabling us to make better predictions of future macroeconomic events.

Both Wren-Lewis and Krugman are somewhat more sceptical vis-à-vis these expectations.

Wren-Lewis writes:

[S]uppose there is in fact more than one valid microfoundation for a particular aggregate model. In other words, there is not just one, but perhaps a variety of particular worlds which would lead to this set of aggregate macro relationships. (We could use an analogy, and say that these microfoundations were observationally equivalent in aggregate terms.) Furthermore, suppose that more than one of these particular worlds was a reasonable representation of reality. (Among this set of worlds, we cannot claim that one particular model represents the real world and the others do not.) It would seem to me that in this case the aggregate model derived from these different worlds has some utility beyond just one of these microfounded models. It is robust to alternative microfoundations.

In these circumstances, it would seem sensible to go straight to the aggregate model, and ignore microfoundations.

Paul Krugman is also doubtful of the value of microfoundations:

[W]hat we call “microfoundations” are not like physical laws. Heck, they’re not even true. Maximizing consumers are just a metaphor, possibly useful in making sense of behavior, but possibly not. The metaphors we use for microfoundations have no claim to be regarded as representing a higher order of truth than the ad hoc aggregate metaphors we use in IS-LM or whatever; in fact, we have much more supportive evidence for Keynesian macro than we do for standard micro.

Yours truly basically sides with Wren-Lewis and Krugman on this issue, but I will try to explain why one might be even more critical and doubtful than they are regarding the microfoundations of macroeconomics.

Microfoundations today mean, more than anything else, that you try to build macroeconomic models assuming “rational expectations” and hyperrational “representative actors” optimizing over time. Both are highly questionable assumptions.

The concept of rational expectations was first developed by John Muth (1961) and later applied to macroeconomics by Robert Lucas (1972). The macroeconomic models building on rational expectations microfoundations that are used today by both New Classical and “New Keynesian” macroeconomists basically assume that people on average hold expectations that will be fulfilled. This simplifies the economist’s analysis enormously, since it means that the model used by the economist is the same model people use to make decisions and forecasts of the future.
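
In stylized form – a minimal textbook-style sketch, not any particular author’s model – the hypothesis says that agents’ subjective expectations coincide with the mathematical expectation generated by the “true” model, conditional on the information set $\Omega_t$:

$$x^{e}_{t+1} = \mathbb{E}[x_{t+1} \mid \Omega_t], \qquad x_{t+1} = x^{e}_{t+1} + \varepsilon_{t+1}, \qquad \mathbb{E}[\varepsilon_{t+1} \mid \Omega_t] = 0$$

Forecast errors $\varepsilon_{t+1}$ are thus, by construction, unsystematic: on average, expectations are fulfilled.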

Macroeconomic models building on rational expectations microfoundations assume that people, on average, have the same expectations. Someone like Keynes, on the other hand, would argue that people often have different expectations and information, which constitutes the basic rationale for macroeconomic coordination. All this is rather swept under the rug by the extreme simple-mindedness of assuming rational expectations in representative-actor models, so much in vogue in New Classical and “New Keynesian” macroeconomics. But if all actors are alike, why do they transact? Who do they transact with? The very reason for markets and exchange seems to slip away with the sister assumptions of representative actors and rational expectations.

Macroeconomic models building on rational expectations microfoundations impute beliefs to the agents that are not based on any real informational considerations, but are simply stipulated to make the models mathematically and statistically tractable. Of course you can make assumptions based on tractability, but then you also have to take into account the necessary trade-off in terms of the ability to make relevant and valid statements about the intended target system. Mathematical tractability cannot be the ultimate arbiter in science when it comes to modeling real-world target systems. One could perhaps accept macroeconomic models building on rational expectations microfoundations if they had produced lots of verified predictions and good explanations. But they have done nothing of the kind. Therefore the burden of proof is on those macroeconomists who still want to use models built on these particular unreal assumptions.

In macroeconomic models building on rational expectations microfoundations – where agents are assumed to have complete knowledge of all the relevant probability distribution functions – nothing really new can happen, since they take for granted that people’s decisions can be portrayed as based on an existing probability distribution, which by definition implies knowledge of every possible event that could conceivably take place (otherwise, in a strict mathematical-statistical sense, it is not really a probability distribution at all).
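
It is worth spelling out what a probability distribution formally presupposes – this is just the standard measure-theoretic definition:

$$(\Omega, \mathcal{F}, P), \qquad P(\Omega) = 1$$

The sample space $\Omega$ has to enumerate every possible state of the world in advance, and the measure $P$ assigns the whole unit mass to that list. Within such a framework nothing genuinely novel – no event outside $\Omega$ – can ever occur.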

But in the real world it is not possible to just assume that probability distributions are the right way to characterize, understand or explain acts and decisions made under uncertainty. When we simply do not know, when we have not got a clue, when genuine uncertainty prevails, macroeconomic models building on rational expectations microfoundations simply will not do. In those circumstances rational expectations is not a useful assumption. The reason is that under such circumstances the future is not like the past, and hence we cannot use the same probability distribution – if it exists at all – to describe both the past and the future.

The future is not reducible to a known set of prospects. It is not like sitting at the roulette table and calculating what the future outcomes of spinning the wheel will be. We have to move beyond macroeconomic models building on rational expectations microfoundations and instead try to build economics on a more realistic foundation – one that encompasses both risk and genuine uncertainty.
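
The difference can be illustrated with a deliberately crude simulation. The Python sketch below (all numbers chosen purely for illustration) contrasts a roulette-style process, where the distribution is known and stable and past frequencies are a reliable guide, with a process containing an unannounced structural break. A scripted break is of course only a pale stand-in for genuine uncertainty – which by definition cannot be written into any data-generating process in advance – but it shows how quickly “learning from the past” fails once the future stops resembling the past:

```python
import random

random.seed(42)

# Risk: a European roulette wheel. The distribution is known and stable,
# so past frequencies converge on the true expected value and remain
# a good guide to the future.
def roulette_spin():
    # 37 pockets; a 1-unit bet on red wins +1 on 18 of them, loses 1 otherwise
    return 1.0 if random.randrange(37) < 18 else -1.0

# A toy stand-in for a world where the future is not like the past:
# the data-generating process shifts at a date unknown to the forecaster,
# so frequencies estimated from history systematically mislead.
def shifting_process(t, break_point=5_000):
    p_win = 0.6 if t < break_point else 0.2  # unannounced structural break
    return 1.0 if random.random() < p_win else -1.0

n = 10_000
print(sum(roulette_spin() for _ in range(n)) / n)  # ~ -1/37 = -0.027, as theory says

past = sum(shifting_process(t) for t in range(5_000)) / 5_000        # ~ +0.2
future = sum(shifting_process(t) for t in range(5_000, n)) / 5_000   # ~ -0.6
print(past, future)  # the historical average is a badly biased forecast of the future
```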

Macroeconomic models building on rational expectations microfoundations emanate from the belief that, to be scientific, economics has to be able to model individuals and markets in a stochastic-deterministic way. It’s like treating individuals and markets as the celestial bodies studied by astronomers with the help of gravitational laws. Unfortunately, individuals, markets and entire economies are not planets moving in predetermined orbits in the sky.

To deliver macroeconomic models building on rational expectations microfoundations, economists have to constrain expectations on the individual and the aggregate level to be the same. If revisions of expectations take place, they typically have to take place in a known and prespecified way. This squares badly with what we know to be true in the real world, where fully specified trajectories of future expectations revisions are non-existent.

Further, most macroeconomic models building on rational expectations microfoundations are time-invariant and so leave no room for any changes in expectations or their revision. The only imperfection of knowledge they admit is included in the error terms – error terms that are assumed to be additive and to have a given and known frequency distribution, so that the models can still fully pre-specify the future even when incorporating these stochastic variables.
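
The generic setup looks something like this (a schematic, not any particular model):

$$y_t = f(x_t; \theta) + \varepsilon_t, \qquad \varepsilon_t \sim \text{i.i.d.}\; N(0, \sigma^2)$$

All ignorance is packed into the additive term $\varepsilon_t$, whose distribution is fixed and known in advance – so the model’s description of the future remains complete up to draws from a pre-specified urn.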

In the real world there are many different expectations, and these cannot be aggregated in macroeconomic models building on rational expectations microfoundations without giving rise to inconsistency. This is one of the main reasons these models are set up as representative-actor models. But this is far from a harmless approximation to reality. Even the smallest differences in expectations between agents would make these models inconsistent, so when such differences do show up, they have to be labelled “irrational”.

It is not possible to adequately represent individuals and markets as having one single overarching probability distribution. Accepting that does not imply that we have to end all theoretical endeavours and assume that all agents always act totally irrationally and are only analyzable within behavioural economics. Far from it. It means we acknowledge diversity and imperfection, and that macroeconomics has to be able to incorporate these empirical facts in its models.

Most models in science are representations of something else. Models “stand for” or “depict” specific parts of a “target system” (usually the real world). A model that has neither surface nor deep resemblance to important characteristics of real economies ought to be treated with prima facie suspicion. How could we possibly learn about the real world if there are no parts or aspects of the model that have relevant and important counterparts in the real-world target system? The burden of proof lies with the macroeconomists who think they have contributed anything of scientific relevance without even hinting at a bridge enabling us to traverse from model to reality. All theories and models have to use sign vehicles to convey some kind of content that may be used for saying something about the target system. But purpose-built assumptions, made solely to secure a way of reaching deductively validated results in mathematical models, are of little value if they cannot be validated outside of the model.

All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is (no longer) the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

Theories are difficult to directly confront with reality. Economists therefore build models of their theories. Those models are representations that are directly examined and manipulated to indirectly say something about the target systems.

But being able to model a world that somehow could be considered real or similar to the real world is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still possibly serve our pursuit of truth. But then they cannot be unrealistic or false in just any way: the falsehood or unrealisticness has to be qualified.

The microfounded macromodel is supposed to enable us to pose counterfactual questions about what would happen if some variable were to change in a specific way (hence the assumption of structural invariance, which purportedly enables the theoretical economist to do just that). But does it? Applying a “Lucas critique” to most microfounded macromodels, it is obvious that they fail. Changing “policy rules” cannot just be presumed not to influence investment and consumption behavior, and a fortiori technology, thereby contradicting the invariance assumption. Technology and tastes cannot live up to the status of an economy’s deep and structurally stable Holy Grail. They too are part and parcel of an ever-changing and open economy.
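
The standard schematic statement of the Lucas critique is itself instructive here. In a reduced-form relation whose coefficient depends on the policy rule,

$$y_t = \theta(\gamma)\, x_t + \varepsilon_t,$$

with $\gamma$ the policy-rule parameter, an estimate of $\theta$ obtained under one rule tells us nothing reliable about outcomes under another, since changing $\gamma$ changes $\theta$ itself. The argument above simply pushes the same logic one level down: if tastes and technology also shift with policy and circumstance, the allegedly deep parameters are no more invariant than the reduced-form ones.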

Without export certificates, models and theories should be considered unsold. Unfortunately this understanding has not informed modern neoclassical economics, as can be seen in the profuse use of so-called representative-agent models.

A common feature of modern neoclassical macroeconomics is the use of simple (dynamic stochastic) general equilibrium models, in which representative actors are supposed to have complete knowledge, zero transaction costs and complete markets.

In these models the actors are all identical. This, of course, has far-reaching analytical implications. Situations characterized by asymmetrical information – situations most of us consider innumerable – cannot arise in such models. If the aim is to build a macro-analysis from micro-foundations in this manner, the relevance of the procedure is highly questionable. Robert Solow has even considered the claims made by protagonists of rational-agent models “generally phony”.

One obvious critique is that representative-agent models do not incorporate distributional effects – effects that often play a decisive role in macroeconomic contexts. Investigations into the operations of markets and institutions usually find that there are overwhelming problems of coordination. These are difficult, not to say impossible, to analyze with the kind of Robinson Crusoe models that, e.g., real business cycle theorists employ, models which exclude precisely those differences between groups of actors that are the driving force in many non-neoclassical analyses.

The choices of different individuals have to be shown to be coordinated and consistent. This is obviously difficult if the macroeconomic models don’t leave room for heterogeneous individuals (this neglect of the importance of heterogeneity is perhaps especially problematic for the modeling of real business cycles in dynamic stochastic general equilibrium models). Representative-agent models are certainly more manageable; from a realist point of view, however, they are also less relevant and have lower explanatory potential. Or as Robert Gordon has it:

In the end, the problem with modern macro is that it contains too much micro and not enough macro. Individual representative agents assume complete and efficient markets and market clearing, while the models ignore the basic macro interactions implied by price stickiness, including macro externalities and coordination failures. In an economywide recession, most agents are not maximizing unconditional utility functions as in DSGE models but are maximizing, i.e., trying to make the best out of a bad situation, under biting income and liquidity constraints. Perceptive comments by others as cited above reject the relevance of modern macro to the current cycle of excess leveraging and subsequent deleveraging, because complete and efficient markets are assumed, and there is no room for default, bankruptcy, insolvency, and illiquidity.

Both the “Lucas critique” and Keynes’ critique of econometrics argued that it was inadmissible to project history onto the future. Consequently an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different “variables” was not enough. If they could not get at the causal structure that generated the data, they were not really “identified”. Lucas himself drew the conclusion that the problem of unstable relations was to be solved by constructing models with clear microfoundations, in which forward-looking optimizing individuals and robust, deep behavioural parameters would remain stable even in the face of changes in economic policy.

In microeconomics we know that aggregation really presupposes homothetic and identical preferences – something that almost never holds in real economies. The results delivered by these assumptions are therefore not robust and do not capture the underlying mechanisms at work in any real economy. And as if this were not enough, there are obvious problems also with the kind of microeconomic equilibrium that one tries to reduce macroeconomics to. Decisions of consumption and production are described as choices made by a single agent. But then who sets the prices on the market? And how do we justify the assumption of universal consistency between the choices?
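
How demanding the aggregation conditions are can be stated precisely. Gorman’s classic result – rendered here schematically – says that a well-defined representative consumer exists only if every individual $i$’s indirect utility takes the “Gorman polar form”

$$v_i(p, m_i) = a_i(p) + b(p)\, m_i,$$

with the income coefficient $b(p)$ identical across all individuals, so that aggregate demand depends only on total income $\sum_i m_i$ and not on its distribution – parallel linear Engel curves for everyone, rich or poor. No actual economy comes close to satisfying this.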

Models that critically depend on particular and odd assumptions – and are neither robust nor congruent with real-world economies – are of questionable value.

And is it really possible to describe and analyze all the deliberations and choices made by individuals in an economy? Doesn’t the choice of an individual presuppose knowledge and expectations about the choices of other individuals? It probably does, and this presumably helps to explain why representative-agent models have become so popular in modern macroeconomic theory: they make the analysis much more tractable.

One could justifiably argue that one might just as well accept that it is not possible to coherently reduce macro to micro, and accordingly that it is perhaps necessary to forswear microfoundations and the use of rational-agent models altogether. Microeconomic reasoning has to build on macroeconomic presuppositions. Real individuals do not base their choices on operational general equilibrium models, but rather use simpler models. If macroeconomics needs microfoundations, it is equally true that microeconomics needs macrofoundations.

The microeconomist Alan Kirman has maintained that the use of representative-agent models is unwarranted and leads to conclusions that are usually both misleading and false. The representative agent is a fiction basically used by some macroeconomists to justify the use of equilibrium analysis and a kind of pseudo-microfoundations. Microeconomists are well aware that the conditions necessary to make aggregation to representative actors possible are not met in actual economies. As economic models become increasingly complex, their use also becomes less credible.

Even if economies naturally presuppose individuals, it does not follow that we can infer or explain macroeconomic phenomena solely from knowledge of these individuals. Macroeconomics is to a large extent emergent and cannot be reduced to a simple summation of micro-phenomena. Moreover, as we have already argued, even these microfoundations are not immutable. The “deep parameters” of Lucas and the New Classical economists – “tastes” and “technology” – are not really the bedrock of constancy that they believe (or pretend) them to be.

For Alfred Marshall, economic theory was “an engine for the discovery of concrete truth”. But where Marshall tried to describe the behaviour of a typical business with the concept of the “representative firm”, his modern heirs do not at all try to describe how firms interact with other firms in an economy. The economy is instead described “as if” it consisted of one single giant firm – either by inflating the optimization problem of the individual to the scale of the whole economy, or by assuming that it is possible to aggregate different individuals’ actions by simple summation, since every type of actor is identical. But don’t we just have to face the fact that it is difficult to describe interaction and cooperation when there is essentially only one actor?

Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies and institutions are those based on rational expectations and representative actors. But there is really no support for this conviction at all. On the contrary: if we want to have anything of interest to say about real economies, financial crises and the decisions and choices real people make, it is high time to place macroeconomic models built on representative actors and rational expectations microfoundations where they belong – in the dustbin of history.

For if this microfounded macroeconomics has nothing to say about the real world and the economic problems out there, why should we care about it? The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging is de facto made, macroeconomic model-building is little more than hand-waving that gives us rather little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations used to analyze them that have to match reality, not the other way around.

So – really – this is what the debate basically is all about. It is not – as Paul Krugman seems to mean (and although I’m usually a big fan, it is honestly really difficult to take him seriously here; this is not even “brilliantly silly”, just silly) – a question of “gadgets”, “scratchpads” or any other “brilliantly silly” toy that he or any other neoclassical economist chooses to play around with. That would be to trivialize economics and reduce it to a Glasperlenspiel.

Given that, I would say that both Wren-Lewis and Krugman – especially if they really want to call themselves Keynesians of any kind – ought to be even more critical of the microfoundationists than they are. If macroeconomic models – of whatever ilk – build on microfoundational assumptions of representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, the warrant for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged over to the real world is obviously non-justifiable. The incompatibility between actual behaviour and the behaviour in macroeconomic models building on representative actors and rational expectations microfoundations shows the futility of trying to represent real-world economies with models flagrantly at odds with reality.

In the conclusion to his book Models of Business Cycles (1987), Robert Lucas (in)famously wrote (pp. 66 & 107-08):

It is a remarkable and, I think, instructive fact that in nearly 50 years the Keynesian tradition has produced not one useful model of the individual unemployed worker, and no rationale for unemployment insurance beyond the observation that, in common with countercyclical cash grants to corporations or to anyone else, it has the effects of increasing the total volume of spending at the right times. By dogmatically insisting that unemployment be classed as ‘involuntary’ this tradition simply cut itself off from serious thinking about the actual options unemployed people are faced with, and hence from learning anything about how the alternative social arrangements might improve these options.

The most interesting recent developments in macroeconomic theory seem to me describable as the reincorporation of aggregative problems such as inflation and the business cycle within the general framework of ‘microeconomic’ theory. If these developments succeed, the term ‘macroeconomics’ will simply disappear from use and the modifier ‘micro’ will become superfluous. We will simply speak, as did Smith, Ricardo, Marshall and Walras, of economic theory. If we are honest, we will have to face the fact that at any given time there will be phenomena that are well-understood from the point of view of the economic theory we have, and other phenomena that are not. We will be tempted, I am sure, to relieve the discomfort induced by discrepancies between theory and facts by saying the ill-understood facts are the province of some other, different kind of economic theory. Keynesian ‘macroeconomics’ was, I think, a surrender (under great duress) to this temptation. It led to the abandonment, for a class of problems of great importance, of the use of the only ‘engine for the discovery of truth’ that we have in economics.

Thanks to latter-day Lucasian New-Classical-New-Keynesian-Rational-Expectations-Representative-Agents-Microfoundations economists, we are supposed not to use – as our primitive ancestors did – that archaic term ‘macroeconomics’ anymore (with the possible exception of warning future economists not to give in to ‘discomfort’). Being intellectually heavily indebted to the man who invented macroeconomics – Keynes – yours truly firmly declines to concur.

Microfoundations – and a fortiori rational expectations and representative agents – serve a particular theoretical purpose. And as the history of macroeconomics during the last thirty years has shown, this Lakatosian microfoundations programme for macroeconomics is only methodologically consistent within the framework of a (deterministic or stochastic) general equilibrium analysis. In no other context has it been possible to incorporate this kind of microfoundations, with its “forward-looking optimizing individuals”, into macroeconomic models.

This is of course no accident. General equilibrium theory is basically nothing other than an endeavour to consistently generalize the microeconomics of individuals and firms to the macroeconomic level of aggregates.

But it obviously doesn’t work. The analogy between microeconomic behaviour and macroeconomic behaviour is misplaced. Empirically, science-theoretically and methodologically, neoclassical microfoundations for macroeconomics are defective. Tenable foundations for macroeconomics really have to be sought elsewhere.

In an early post on the subject, Simon Wren-Lewis rhetorically asked:

Microfoundations – is there an alternative?

Of course there is an alternative to neoclassical general equilibrium microfoundations! Behavioural economics and Frydman & Goldberg’s “imperfect knowledge” economics are two noteworthy examples that easily come to mind.

And those of us who have not forgotten the history of our discipline, and have not bought the sweet-water nursery tale of Lucas et consortes that Keynes was not “serious thinking”, can easily see that there exists a macroeconomic tradition inspired by Keynes – one that has absolutely nothing to do with any New Synthesis or “New Keynesianism”.

Its ultimate building-block is the perception of genuine uncertainty and of the fact that people often “simply do not know.” Real actors cannot know everything, and their acts and decisions cannot simply be summed or aggregated without the economist risking succumbing to “the fallacy of composition”.

Instead of basing macroeconomics on unreal and unwarranted generalizations of microeconomic behaviour and relations, it is far better to accept the ontological fact that the future is to a large extent uncertain, and to conduct macroeconomics on this fact of reality.

The real macroeconomic challenge is to accept uncertainty and still try to explain why economic transactions take place – instead of simply conjuring the problem away by assuming uncertainty to be reducible to stochastic risk. That is scientific cheating. And it has been going on for too long now.

The Keynes-inspired building-blocks are there, but it is admittedly a long way to go before the whole construction is in place. The sooner we are intellectually honest and ready to admit that “modern” neoclassical macroeconomics and its microfoundationalist programme have come to the end of the road, the sooner we can redirect our aspirations and knowledge to more fruitful endeavours.

3 Comments

  1. Truly appreciate this blog and wish I had discovered it sooner. I got an insight into the “how many angels can dance on the head of a pin” nature of autistic neoclassical “macro” when I read Roger Farmer’s guest post on Noah Smith’s blog and shook my head in disbelief at the ensuing discussion: http://noahpinionblog.blogspot.ca/2012/04/equilibria-unique-and-not-so-unique.html
    I find the Lucasian world-view neither scientific nor rigorous when measured against the applied sciences.

  2. It is interesting that Krugman continues to insist that there are some important “meta” methodological reasons to use micro-foundations – that it is a gadget, but a useful one. Robert Gordon, in his “Is Modern Macro or 1978-Macro More Relevant,” makes a persuasive case that it’s a silly gadget that has no utility at all!

    “The basic problem is that modern macro consists of too much micro and not enough macro. Focus on individual preferences and production functions misses the essence of macro fluctuations – the coordination failures and macro externalities that convert interactions among individual choices into constraints that prevent workers from optimizing hours of work and firms from optimizing sales, production, and utilization. Also modern business‐cycle macro has too narrow a view of the range of aggregate demand shocks that in the presence of sticky prices constrain the choices of workers and firms. Shocks that have little or nothing to do with technology, preferences, or monetary policy can interact and impose constraints on individual choices.”

    Moreover, Gordon addresses Wren-Lewis’s question as to whether there is an alternative:

    “The achievement of 1978‐era macro was to retain the best of Keynesian demand‐side economics while dropping the negatively sloped inflation‐ unemployment tradeoff with its neglect of supply shocks. 1978‐era macro recognizes that the correlation between inflation and unemployment can be either negative or positive, just as microeconomics has long predicted that the correlation between the price and quantity of wheat can be either negative or positive depending on the size of the shocks to demand and supply. To understand the domestic macroeconomic environment of the 2007‐09 worldwide crisis, we are best served by applying 1978‐era macro.”

    He concludes his essay:

    “In the end, the problem with modern macro is that it contains too much micro and not enough macro. Individual representative agents assume complete and efficient markets and market clearing, while the models ignore the basic macro interactions implied by price stickiness, including macro externalities and coordination failures. In an economywide recession, most agents are not maximizing unconditional utility functions as in DSGE models but are maximizing, i.e., trying to make the best out of a bad situation, under biting income and liquidity constraints. Perceptive comments by others as cited above reject the relevance of modern macro to the current cycle of excess leveraging and subsequent deleveraging, because complete and efficient markets are assumed, and there is no room for default, bankruptcy, insolvency, and illiquidity.”

    • Thanks Dwayne. I’ve just read Gordon’s article (thanks to your earlier recommendation) and it’s great!


