Macroeconomic challenges

20 Feb, 2014 at 10:15 | Posted in Economics | 8 Comments

In discussing macroeconomics’ Faustian bargain, Simon [Wren-Lewis] asks:

“By putting all our macroeconomic model building eggs in one microfounded basket, have we significantly slowed down the pace at which macroeconomists can say something helpful about the rapidly changing real world?”

Let me deepen this question by pointing to five newish facts about the “real world” which any good, useful macro theory should be compatible with.

1. The unemployed are significantly less happy than those in work. This doesn’t merely provide the justification for an interest in macroeconomics. It also casts grave doubt upon RBC-style theories in which unemployment is voluntary …

2. Price and wage stickiness is over-rated … Price stickiness isn’t universal …

3. The failure of a handful of organizations can have massive macroeconomic consequences … We need models in which micro failures generate macro ones …

4. Supply shocks do happen. It’s improbable that all productivity fluctuations are due merely to labour hoarding in the face of demand shocks …

5. Interactions between agents can magnify fluctuations. We know there are expenditure cascades, which occur because consumers copy other consumers …

These facts are a challenge to both RBC and New Keynesian models. But they have something in common. They stress the heterogeneity of agents … This, I fear, means that the problem with conventional macro isn’t so much its microfoundations per se as the assumption that these microfoundations must consist in representative agents.

Chris Dillow

Yes indeed, the assumption of representative agents is a critical one in modern macroeconomics — as is the insistence on microfoundations.

The purported strength of New Classical and “New Keynesian” macroeconomics is that they have firm anchorage in preference-based microeconomics, and especially in the decisions taken by intertemporal utility-maximizing “forward-looking” individuals.

To some of us, however, this has come at too high a price. The almost quasi-religious insistence that macroeconomics has to have microfoundations – without ever presenting either ontological or epistemological justifications for this claim – has turned a blind eye to the weakness of the whole enterprise of trying to depict a complex economy on the basis of an all-embracing representative actor equipped with superhuman knowledge, forecasting abilities and forward-looking rational expectations. It is as if – after having swallowed the sour grapes of the Sonnenschein-Mantel-Debreu theorem – these economists want to resurrect the omniscient Walrasian auctioneer in the form of all-knowing representative actors equipped with rational expectations and assumed somehow to know the true structure of our model of the world (how that could even be conceivable is beyond my imagination, given that the ongoing debate on microfoundations, if anything, shows that not even we, the economists, can agree on a common model).
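
For readers who want the formal content behind that remark, here is the standard textbook rendering of the Sonnenschein-Mantel-Debreu result (a rough sketch of the theorems, not their exact hypotheses). Write aggregate excess demand as

\[
Z(p) \;=\; \sum_{i=1}^{n} z_i(p) .
\]

Individual rationality then imposes on Z only (i) continuity, (ii) homogeneity of degree zero, Z(\lambda p) = Z(p) for \lambda > 0, and (iii) Walras’ law, p \cdot Z(p) = 0. Conversely, essentially any function with these three properties can be generated as the aggregate excess demand of some exchange economy with at least as many consumers as commodities – which is exactly why well-behaved individuals do not deliver a well-behaved aggregate, and why the representative agent has to be assumed rather than derived.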

Microfoundations are thought to give macroeconomists the means to fully predetermine their models and come up with definitive, robust, stable answers. In reality we know that the forecasts and expectations of individuals often differ systematically from what materializes in the aggregate, since knowledge is imperfect and uncertainty – rather than risk – rules the roost.

And microfoundations allegedly get around the Lucas critique by focusing on “deep” structural, invariant parameters of optimizing individuals’ preferences and tastes. This is an empty hope without solid empirical or methodological foundation.

The kind of microfoundations that “New Keynesian” and New Classical general equilibrium macroeconomists base their models on is not – at least from a realist point of view – plausible.

Without export certificates, models and theories should be considered unsold. Unfortunately this understanding has not informed modern neoclassical economics, as can be seen from the profuse use of so-called representative-agent models.

A common feature of modern neoclassical macroeconomics is the use of simple (dynamic stochastic) general equilibrium models in which representative actors are supposed to have complete knowledge, face zero transaction costs and operate in complete markets.

In these models, the actors are all identical. Of course, this has far-reaching analytical implications. Situations characterized by asymmetrical information – situations most of us consider innumerable – cannot arise in such models. If the aim is to build a macro-analysis from microfoundations in this manner, the relevance of the procedure is highly questionable (Robert Solow has even considered the claims made by protagonists of rational-agent models “generally phony”).

One obvious critique is that representative-agent models do not incorporate distributional effects – effects that often play a decisive role in macroeconomic contexts. Investigations into the operations of markets and institutions usually find that there are overwhelming problems of coordination. These are difficult, not to say impossible, to analyze with the kind of Robinson Crusoe models that, e.g., Real Business Cycle theorists employ and which exclude precisely those differences between groups of actors that are the driving force in many non-neoclassical analyses.
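
A minimal numerical sketch of this distributional point (the two-group split, the income figures and the propensities to consume are purely illustrative, not taken from any study): with heterogeneous marginal propensities to consume, aggregate spending changes when a fixed total income is redistributed, which is something a single representative consumer with one marginal propensity to consume cannot register.

    # Purely illustrative numbers: two household groups sharing a total income of 100.
    incomes_skewed = [20.0, 80.0]   # unequal distribution
    incomes_equal  = [50.0, 50.0]   # same total income, different distribution
    mpcs           = [0.9, 0.4]     # poorer households spend more of each extra unit

    def aggregate_consumption(incomes, mpcs):
        """Sum of each group's consumption out of its own income."""
        return sum(y * m for y, m in zip(incomes, mpcs))

    print(aggregate_consumption(incomes_skewed, mpcs))  # 50.0
    print(aggregate_consumption(incomes_equal, mpcs))   # 65.0

Aggregate consumption rises by 30 per cent although total income, preferences and prices are all unchanged; only the distribution differs, and that is precisely the kind of effect a one-agent model rules out by construction.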

The choices of different individuals have to be shown to be coordinated and consistent. This is obviously difficult if the macroeconomic models don’t give room for heterogeneous individuals (this lack of appreciation of the importance of heterogeneity is perhaps especially problematic for the modeling of real business cycles in dynamic stochastic general equilibrium models). Representative-agent models are certainly more manageable; from a realist point of view, however, they are also less relevant and have lower explanatory potential.

Both the “Lucas critique” and Keynes’ critique of econometrics showed that it was inadmissible to project history onto the future. Consequently an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different “variables” was not enough. If they could not get at the causal structure that generated the data, they were not really “identified.” Lucas himself drew the conclusion that the remedy for unstable relations was to construct models with clear microfoundations, in which forward-looking optimizing individuals and robust, deep behavioural parameters are taken to be stable even to changes in economic policy.
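
A minimal numerical sketch of the Lucas point (all parameter values are hypothetical and the surprise-only output equation is the standard textbook caricature, not any particular model): a reduced-form “Phillips curve” estimated under one policy regime predicts a boom when policy raises average inflation, but if output responds only to inflation surprises the boom never materialises, because the estimated coefficient embodied the old regime’s expectations.

    import numpy as np

    rng = np.random.default_rng(0)
    b = 2.0                          # output responds only to inflation *surprises*

    # Regime 1: mean inflation 2%, all variation in inflation is unanticipated
    pi1 = 0.02 + rng.normal(0, 0.01, 500)
    y1  = b * (pi1 - 0.02) + rng.normal(0, 0.005, 500)
    d, c = np.polyfit(pi1, y1, 1)    # reduced-form "Phillips curve" y = c + d*pi

    # Regime 2: policy raises mean inflation to 10% and agents anticipate it
    pi2 = 0.10 + rng.normal(0, 0.01, 500)
    y2  = b * (pi2 - 0.10) + rng.normal(0, 0.005, 500)

    print(round(d, 2))               # estimated slope, roughly 2
    print(round(c + d * 0.10, 3))    # reduced-form prediction of mean output, roughly 0.16
    print(round(y2.mean(), 3))       # actual mean output under regime 2, roughly 0.0

The estimated relation is a perfectly good description of the old data and a useless guide to the new regime – which is why Lucas insisted on parameters that are invariant to policy, and why the real question is whether “tastes” and “technology” actually are that invariant.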

In microeconomics we know that aggregation really presupposes homothetic and identical preferences, something that almost never exists in real economies. The results built on these assumptions are therefore not robust and do not capture the underlying mechanisms at work in any real economy. And as if this were not enough, there are obvious problems also with the kind of microeconomic equilibrium that one tries to reduce macroeconomics to. Decisions on consumption and production are described as choices made by a single agent. But then, who sets the prices on the market? And how do we justify the assumption of universal consistency between the choices?
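
The formal condition behind this aggregation claim is the Gorman form (stated here as a sketch): exact aggregation requires every consumer’s demand to be affine in income with a common income slope,

\[
x_i(p, m_i) \;=\; a_i(p) + b(p)\, m_i ,
\qquad\text{so that}\qquad
\sum_i x_i(p, m_i) \;=\; \sum_i a_i(p) + b(p) \sum_i m_i ,
\]

and aggregate demand then depends on the income distribution only through total income. Identical homothetic preferences are the special case a_i(p) = 0 with a common b(p). Outside this knife-edge case, “the” aggregate consumer simply does not exist.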

Models that are critically based on particular and odd assumptions – and are neither robust nor congruent with real-world economies – are of questionable value.

And is it really possible to describe and analyze all the deliberations and choices made by individuals in an economy? Does not the choice of an individual presuppose knowledge and expectations about choices of other individuals? It probably does, and this presumably helps to explain why representative-agent models have become so popular in modern macroeconomic theory. They help to make the analysis more tractable.

One could justifiably argue that one might just as well accept that it is not possible to coherently reduce macro to micro, and accordingly that it is perhaps necessary to forswear microfoundations and the use of rational-agent models altogether. Microeconomic reasoning has to build on macroeconomic presuppositions. Real individuals do not base their choices on operational general equilibrium models, but rather use simpler models. If macroeconomics needs microfoundations, microeconomics equally needs macrofoundations.

The microeconomist Alan Kirman has maintained that the use of representative-agent models is unwarranted and leads to conclusions that are usually both misleading and false. The representative agent is a fiction basically used by some macroeconomists to justify the use of equilibrium analysis and a kind of pseudo-microfoundations. Microeconomists are well aware that the conditions necessary to make aggregation to representative actors possible are not met in actual economies. As economic models become increasingly complex, their use also becomes less credible.

Even if economies naturally presuppose individuals, it does not follow that we can infer or explain macroeconomic phenomena solely from knowledge of these individuals. Macroeconomics is to a large extent emergent and cannot be reduced to a simple summation of micro-phenomena. Moreover, as we have already argued, even these microfoundations aren’t immutable. Lucas and the New Classical economists’ deep parameters – “tastes” and “technology” – are not really the bedrock of constancy that they believe (pretend) them to be.

In microfounded rational-expectations representative-agent macroeconomics the economy is described “as if” it consisted of one single agent – either by inflating the optimization problem of the individual to the scale of a whole economy, or by assuming that it is possible to aggregate different individuals’ actions by simple summation, since every type of actor is identical.
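
To see concretely what “inflating the optimization problem of the individual to the scale of a whole economy” amounts to, here is the canonical representative-household planner problem at the heart of RBC/DSGE models (standard textbook notation, not tied to any particular paper):

\[
\max_{\{c_t,\,k_{t+1}\}_{t\ge 0}} \; E_0 \sum_{t=0}^{\infty} \beta^{t} u(c_t)
\qquad \text{s.t.} \qquad
c_t + k_{t+1} = z_t f(k_t) + (1-\delta)k_t ,
\]

with one utility function, one resource constraint and one stochastic technology shock z_t: the entire “economy” is a single decision problem.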

It would be better to just face the truth — it is impossible to describe interaction and cooperation when there is essentially only one actor.

8 Comments

  1. It would be nice to see at least once that you understand the absolutely crucial distinction between a “representative agent” and one agent. The prisoners’ dilemma, for instance, is a representative-agent model — yet coordination problems are the issue at the heart of the model.

  2. “Without export certificates models and theories should be considered unsold.”

    And how exactly would such an export certificate look in the social sciences? Every theoretical model is unrealistic along some dimension, and no empirical study can claim 100% external validity. Of course, discussing the realism and external validity of specific research addressing concrete questions is a necessary and inseparable part of science. But nonspecific criticisms such as that above are simply vacuous and meaningless.

    • “Every theoretical model is unrealistic along some dimension, and no empirical study can claim 100% external validity.”

      And nonspecific blanket prophylactics like that are far more vacuous and meaningless.

  3. I took a derivative of (d)X
    Crossing Y to see what came next
    I knew my Representative agent would optimize
    since that is the way to economize
    however as a multiple of one that leaves me perplexed (he, he)

    • Simply superb 🙂

  4. Point (2) in Dillow’s post is quite misleading indeed. The report states that the main causes of changes in prices are raw material costs and wage costs — which is consistent with fixed-price theory.

    The point here is: although half of firms may change their prices in the face of changes in demand, it is by no means clear that they change those prices to clear markets — and that is the key to vindicating flexible-price arguments.

    Also, so far as I can see (I didn’t read the whole report), the report does not take into account the SIZE of the firms that respond to changes in demand quickly. If 50% of firms do respond quickly but these 50% are small firms — which logic tells us they would be — then they may make up maybe 10% of GDP.

    Again, I think what you said in point (2) is quite misleading.

  5. “A common feature of modern neoclassical macroeconomics is the use of simple (dynamic stochastic) general equilibrium models in which representative actors are supposed to have complete knowledge, face zero transaction costs and operate in complete markets.

    In these models, the actors are all identical.”

    Are you sure about this? Because there are such models with heterogeneous agents.

    More importantly, isn’t a representative agent an aggregation of heterogeneous agents?

    In other words, it seems that the main dispute here is whether the aggregate is a good representative of the set of heterogeneous agents. But in macro, aren’t we only interested in aggregates? Thus the issue is whether we are aggregating correctly, rather than the aggregation itself.

    So the failure is probably not related to the representative agent but to the simplicity of the models. Probably the models must become more complicated and uncover all the hidden modes the current ones cannot. Anyone who has studied non-linear dynamics knows that there are frequently hidden modes in nature whose discovery requires either a more complex model or a better numerical solution method.

    Just my 2/1000 bitcoins

  6. The distributional issues are a big problem, in particular the distribution between profits and wages. Once we realise that all business-sector income is ultimately returned to the household sector via either wages or dividends, we realise that there are no financial constraints on the household sector in these models. The true solution to the optimising problem is invariant to relative prices or wages. However, since only linearisations are computed rather than the true solutions, the fact that the models are not correctly characterising those solutions is never picked up.

