Oxford macroeconomist Simon Wren-Lewis has a post up on his blog on the use of labels in macroeconomics:
Labels are fun, and get attention. They can be a useful shorthand to capture an idea, or related set of ideas … Here are a couple of bold assertions, which I think I believe, and which I will try to justify. First, in academic research terms there is only one meaningful division, between mainstream and heterodox … Second, in macroeconomic policy terms I think there is only one meaningful significant division, between mainstream and anti-Keynesians …
So what do I mean by a meaningful division in academic research terms? I mean speaking a different language. Thanks to the microfoundations revolution in macro, mainstream macroeconomists speak the same language. I can go to a seminar that involves an RBC model with flexible prices and no involuntary unemployment and still contribute and possibly learn something.
Wren-Lewis seems to be überjoyed by the fact that, using the same language as real business cycle macroeconomists, he can “possibly learn something” from them.
Wonder what …
I’m not sure Wren-Lewis uses the same “language” as James Tobin, but Tobin is definitely worth listening to:
They try to explain business cycles solely as problems of information, such as asymmetries and imperfections in the information agents have. Those assumptions are just as arbitrary as the institutional rigidities and inertia they find objectionable in other theories of business fluctuations … I try to point out how incapable the new equilibrium business cycles models are of explaining the most obvious observed facts of cyclical fluctuations … I don’t think that models so far from realistic description should be taken seriously as a guide to policy … I don’t think that there is a way to write down any model which at one hand respects the possible diversity of agents in taste, circumstances, and so on, and at the other hand also grounds behavior rigorously in utility maximization and which has any substantive content to it.
Arjo Klamer, The New Classical Macroeconomics: Conversations with the New Classical Economists and their Opponents, Wheatsheaf Books, 1984
And contrary to Wren-Lewis I don’t think the fact that “thanks to the microfoundations revolution in macro, mainstream macroeconomists speak the same language” takes us very far. Far better than having a common “language” is to have a well-founded, realist and relevant theory:
Microfoundations for macroeconomics are fine in principle—not indispensable, but useful. The problem is that what passes for microfoundations in the universe of orthodox macro is crap …
It’s nothing more than robotic imitation of teaching exercises to improve math skills, without any consideration for such mundane matters as empirical verisimilitude. I will mention three crushing faults, each sufficient by itself to blow a wide hole in a supposedly useful model …
It is rife with anomalies (see “behavioral economics”), and, most important, it is oblivious to the last several decades of work in psychology, evolutionary biology, neuropsychology, organization theory—all the disciplines where people study behavior in a scientific way …
There are no interaction effects to generate multiple equilibria in the microfoundations macro theorists use. Every individual, firm and product is an isolated atom, floating uninterrupted through space until it bumps into another such atom in the marketplace. Social psychology, ecology, nonconvex production and consumption spaces? Forget about it …
Microfoundations means general equilibrium theory, but the flavor it uses is from the mid-1950s. The Sonnenschein-Debreu-Mantel demonstration (update to the 1970s) that initial conditions and out-of-equilibrium trades alter the equilibrium itself has turned GET upside down.
Notice that I haven’t mentioned the standard heterodox criticisms of representative agents and ergodicity. You can add those if you want …
Like I said, their microfoundations are crap.
Macroeconomists have to have bigger aspirations than speaking the same “language.” Rigorous models lacking relevance are not to be taken seriously. Truly great macroeconomists aspire to explain and understand the fundamentals of modern economies, as did, e.g., John Maynard Keynes and Michal Kalecki.
To many conservative and libertarian politicians and economists there seems to be a spectre haunting the United States and Europe today — Keynesian ideas on governments pursuing policies to raise effective demand and support employment. And some of the favourite arguments these Keynesophobics use to fight it are the confidence argument and the doctrine of ‘sound finance.’
Is this witless crusade against economic reason new? Not at all. In 1943 Michal Kalecki wrote in his classic essay ‘Political Aspects of Full Employment’:
It should be first stated that, although most economists are now agreed that full employment may be achieved by government spending, this was by no means the case even in the recent past. Among the opposers of this doctrine there were (and still are) prominent so-called ‘economic experts’ closely connected with banking and industry. This suggests that there is a political background in the opposition to the full employment doctrine, even though the arguments advanced are economic. That is not to say that people who advance them do not believe in their economics, poor though this is. But obstinate ignorance is usually a manifestation of underlying political motives …
Clearly, higher output and employment benefit not only workers but entrepreneurs as well, because the latter’s profits rise. And the policy of full employment outlined above does not encroach upon profits because it does not involve any additional taxation. The entrepreneurs in the slump are longing for a boom; why do they not gladly accept the synthetic boom which the government is able to offer them? It is this difficult and fascinating question with which we intend to deal in this article …
We shall deal first with the reluctance of the ‘captains of industry’ to accept government intervention in the matter of employment. Every widening of state activity is looked upon by business with suspicion, but the creation of employment by government spending has a special aspect which makes the opposition particularly intense. Under a laissez-faire system the level of employment depends to a great extent on the so-called state of confidence. If this deteriorates, private investment declines, which results in a fall of output and employment (both directly and through the secondary effect of the fall in incomes upon consumption and investment). This gives the capitalists a powerful indirect control over government policy: everything which may shake the state of confidence must be carefully avoided because it would cause an economic crisis. But once the government learns the trick of increasing employment by its own purchases, this powerful controlling device loses its effectiveness. Hence budget deficits necessary to carry out government intervention must be regarded as perilous. The social function of the doctrine of ‘sound finance’ is to make the level of employment dependent on the state of confidence.
Swedish-Eritrean journalist and writer Dawit Isaak has been held in an Eritrean prison for 13 years without trial. He is the only Swedish citizen held as a prisoner of conscience.
Today is his 50th birthday. Let us hope it is the last spent in prison.
Free Dawit Isaak!
The difficulty lies, not in the new ideas, but in escaping from the old ones, which ramify, for those brought up as most of us have been, into every corner of our minds.
John Maynard Keynes
Mark Blaug (1927-2011) did more than any other single person to establish the philosophy and methodology of economics as a respected subfield within economics. His path-breaking The Methodology of Economics (1980) is still a landmark — and the first textbook on economic methodology yours truly had to read as a student.
The rejection of specific theoretical arguments does not diminish the achievements of Piketty’s work. Capital is an outstanding work: it has brought issues of wealth and income distribution to the spotlight, where heterodox economists have failed to do so. It has also put together, and made readily available, an invaluable data set, and it allows future researchers to analyse macroeconomics with a much broader time horizon, covering much of the history of capitalism rather than the last few decades. But we do suggest that the analysis of the book would have been strengthened if Piketty had also considered a post-Keynesian instead of a neoclassical framework.
[h/t Mark Thoma]
When “Capital in the 21st Century” was published in English earlier this year, Thomas Piketty’s book was met with rapt attention and constant conversation. The book was lauded but also faced criticism, particularly from other economists who wanted to fit Piketty’s work into the models they knew well …
A particularly technical and effective critique of Piketty is from Matt Rognlie, a graduate student in economics at the Massachusetts Institute of Technology. Rognlie points out that for capital returns to be consistently higher than the overall growth of the economy—or “r > g” as framed by Piketty—an economy needs to be able to easily substitute capital such as machinery or robots for labor. In the terminology of economics this is called the elasticity of substitution between capital and labor, which needs to be greater than 1 for r to be consistently higher than g. Rognlie argues that most studies looking at this particular elasticity find that it is below 1, meaning a drop in economic growth would result in a larger drop in the rate of return and then g being larger than r. In turn, this means capital won’t earn an increasing share of income and the dynamics laid out by Piketty won’t arise …
Enter the new paper by economists Loukas Karabarbounis and Brent Neiman … Their new paper investigates how depreciation affects the measurement of the labor share and the elasticity between capital and labor. Using their data set of labor share income and a model, Karabarbounis and Neiman show that the gross labor share and the net labor share move in the same direction when the shift is caused by a technological shock—as has been the case, they argue, in recent decades. More importantly for this conversation, they point out that the gross and net elasticities are on the same side of 1 if that shock is technological. In the case of a declining labor share, this means they would both be above 1.
This means Rognlie’s point about these two elasticities being lower than 1 doesn’t hold up if capital is gaining due to a new technology that makes capital cheaper …
In short, this new paper gives credence to one of the key dynamics in Piketty’s “Capital in the 21st Century”—that the returns on capital can be higher than growth in the economy, or r > g.
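The elasticity mechanism at issue in this exchange can be sketched with a textbook CES production function. This is a standard illustration of why the threshold of 1 matters, not the actual model of Rognlie or of Karabarbounis and Neiman:

\[
Y = \Bigl(\alpha K^{\frac{\sigma-1}{\sigma}} + (1-\alpha)\,L^{\frac{\sigma-1}{\sigma}}\Bigr)^{\frac{\sigma}{\sigma-1}},
\qquad
r = \frac{\partial Y}{\partial K} = \alpha\left(\frac{Y}{K}\right)^{\frac{1}{\sigma}},
\]

so that with competitive factor pricing the (gross) capital share is

\[
\frac{rK}{Y} = \alpha\left(\frac{K}{Y}\right)^{\frac{\sigma-1}{\sigma}}.
\]

As the capital-output ratio \(K/Y\) rises, the capital share rises when \(\sigma > 1\) and falls when \(\sigma < 1\), which is why the whole exchange turns on which side of 1 the relevant (gross or net) elasticity lies.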
To me this is only a confirmation of what I wrote earlier this autumn on the issue:
Being able to show that you can get the Piketty results using one or another of the available standard neoclassical growth models is of course — from a realist point of view — of limited value. As usual, the really interesting thing is how well the assumptions you make and the numerical values you put into the model specification accord with reality.
[h/t Jan Milch]
Microfounded DSGE models standardly assume rational expectations, Walrasian market clearing, unique equilibria, time invariance, linear separability and homogeneity of both inputs/outputs and technology, infinitely lived intertemporally optimizing representative household/consumer/producer agents with homothetic and identical preferences, etc., etc. At the same time the models standardly ignore complexity, diversity, uncertainty, coordination problems, non-market-clearing prices, real aggregation problems, emergence, expectations formation, etc., etc.
Behavioural and experimental economics — not to speak of psychology — show beyond any doubt that “deep parameters” — people’s preferences, choices and forecasts — are regularly influenced by those of other participants in the economy. And how about the homogeneity assumption? If all actors are the same, why and with whom do they transact? And why does economics have to be exclusively teleological (concerned with the intentional states of individuals)? Where are the arguments for that ontological reductionism? And what about collective intentionality and constitutive background rules?
These are all justified questions – so, in what way can one maintain that these models give workable microfoundations for macroeconomics? Science philosopher Nancy Cartwright gives a good hint at how to answer that question:
Our assessment of the probability of effectiveness is only as secure as the weakest link in our reasoning to arrive at that probability. We may have to ignore some issues or make heroic assumptions about them. But that should dramatically weaken our degree of confidence in our final assessment. Rigor isn’t contagious from link to link. If you want a relatively secure conclusion coming out, you’d better be careful that each premise is secure going in.
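Cartwright’s weakest-link point is easy to make quantitative. The toy calculation below uses hypothetical premise probabilities, chosen only for illustration: when a conclusion rests on a chain of independent premises, confidence in it is at most the product of the confidences in the individual links.

```python
# Toy illustration of Cartwright's weakest-link point.
# The premise probabilities are hypothetical, chosen only for illustration.

def chain_confidence(premise_probs):
    """Upper bound on confidence in a conclusion resting on a chain of
    independent premises: the product of the individual probabilities."""
    result = 1.0
    for p in premise_probs:
        result *= p
    return result

# Five premises, each individually quite plausible (90% secure) ...
print(round(chain_confidence([0.9] * 5), 3))                  # -> 0.59

# ... and a single genuinely weak link drags the conclusion down further.
print(round(chain_confidence([0.9, 0.9, 0.9, 0.9, 0.5]), 3))  # -> 0.328
```

Even without any single glaring flaw, a model built on many individually “plausible” assumptions can end up with little claim on our confidence.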
In conclusion, one can say that the sympathy that some of the traditional and Post-Keynesian authors show towards DSGE models is rather hard to understand. Even before the recent financial and economic crisis put some weaknesses of the model – such as the impossibility of generating asset price bubbles or the lack of inclusion of financial sector issues – into the spotlight and brought them even to the attention of mainstream media, the models’ inner workings were highly questionable from the very beginning. While one can understand that some of the elements in DSGE models seem to appeal to Keynesians at first sight, after closer examination, these models are in fundamental contradiction to Post-Keynesian and even traditional Keynesian thinking. The DSGE model is a model in which output is determined in the labour market as in New Classical models and in which aggregate demand plays only a very secondary role, even in the short run.
In addition, given the fundamental philosophical problems presented for the use of DSGE models for policy simulation, namely the fact that a number of parameters used have completely implausible magnitudes and that the degree of freedom for different parameters is so large that DSGE models with fundamentally different parametrization (and therefore different policy conclusions) equally well produce time series which fit the real-world data, it is also very hard to understand why DSGE models have reached such a prominence in economic science in general.
Neither New Classical nor “New Keynesian” microfounded DSGE macro models have helped us foresee, understand or craft solutions to the problems of today’s economies. But still most young academic macroeconomists want to work with DSGE models. After reading Dullien’s article, that certainly should be a very worrying confirmation that economics — at least from the point of view of realism and relevance — is becoming more and more a waste of time. Why do these bright young guys waste their time and efforts? Besides aspirations of being published, I think maybe Frank Hahn gave the truest answer back in 2005 when, interviewed on the occasion of his 80th birthday, he confessed that some economic assumptions didn’t really say anything about “what happens in the world,” but still had to be considered very good “because it allows us to get on with this job.”