Paul Romer’s assault on ‘post-real’ economics

30 September, 2016 at 18:46 | Posted in Economics | Comments Off on Paul Romer’s assault on ‘post-real’ economics

In issue no. 76 of the real-world economics review — published today — yours truly has an article on Paul Romer’s frontal attack on the New Classical and ‘New Keynesian’ theories that have put macroeconomics on a path of intellectual regress for more than two decades now.

The fundamental flaw of econometrics

30 September, 2016 at 18:07 | Posted in Statistics & Econometrics | 8 Comments

It is often said that the error term in a regression equation represents the effect of the variables that were omitted from the equation. This is unsatisfactory …

There is no easy way out of the difficulty. The conventional interpretation for error terms needs to be reconsidered. At a minimum, something like this would need to be said:

The error term represents the combined effect of the omitted variables, assuming that
(i) the combined effect of the omitted variables is independent of each variable included in the equation,
(ii) the combined effect of the omitted variables is independent across subjects,
(iii) the combined effect of the omitted variables has expectation 0.

This is distinctly harder to swallow.

David Freedman

Yes, indeed, that is harder to swallow.

Those conditions on the error term actually mean that we are able to construct a model in which all relevant variables are included and the functional relationships between them are correctly specified.

But that is actually impossible to fully manage in reality!

The theories we work with when building our econometric regression models are insufficient. No matter what we study, there are always some variables missing, and we don’t know how to correctly specify the functional relationships between the variables (we usually just assume linearity).

Every regression model constructed is misspecified. There is always an endless list of possible variables to include, and endless possible ways to specify the relationships between them. So every applied econometrician comes up with his or her own specification and ‘parameter’ estimates. No wonder the econometric Holy Grail of consistent and stable parameter values is still nothing but a dream.
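Freedman’s condition (i) — that the combined effect of the omitted variables is independent of every included variable — is precisely what fails in practice. A minimal simulation (my own toy example with made-up numbers, not anything from the texts quoted here) shows what happens to a ‘parameter’ estimate when an omitted variable is correlated with an included one:

```python
import numpy as np

# Toy data-generating process: y depends on x1 AND x2,
# but the researcher's regression omits x2.
rng = np.random.default_rng(0)
n = 100_000

x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)          # omitted variable, correlated with x1
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# 'Short' regression of y on a constant and x1 only:
# the effect of x2 is swept into the error term.
X = np.column_stack([np.ones(n), x1])
beta_short, *_ = np.linalg.lstsq(X, y, rcond=None)

# Because the omitted x2 is not independent of x1, assumption (i) fails
# and the estimated x1 coefficient is biased: roughly 2 + 3*0.8 = 4.4,
# not the 'true' 2.0.
print(beta_short[1])
```

The estimate is off by more than a factor of two, and nothing in the data the researcher actually uses signals that anything is wrong.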

In order to draw inferences from data as described by econometric texts, it is necessary to make whimsical assumptions. The professional audience consequently and properly withholds belief until an inference is shown to be adequately insensitive to the choice of assumptions. The haphazard way we individually and collectively study the fragility of inferences leaves most of us unconvinced that any inference is believable. If we are to make effective use of our scarce data resource, it is therefore important that we study fragility in a much more systematic way. If it turns out that almost all inferences from economic data are fragile, I suppose we shall have to revert to our old methods …

Ed Leamer

A rigorous application of econometric methods in economics really presupposes that the phenomena of our real-world economies are ruled by stable causal relations between variables. Parameter values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption one, however, has to convincingly establish that the targeted acting causes are stable and invariant, so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.
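The non-exportability of parameter values can be illustrated with another toy simulation (again my own made-up example, not drawn from any real data set): let the ‘same’ causal relation shift halfway through the sample, and the estimated parameter depends entirely on which spatio-temporal context you happen to sample:

```python
import numpy as np

# Toy illustration: a structural break halfway through the sample.
rng = np.random.default_rng(1)
n = 50_000

x = rng.normal(size=n)
regime = np.arange(n) >= n // 2          # second half of the sample
slope = np.where(regime, 0.5, 2.0)       # the 'same' causal relation shifts
y = slope * x + rng.normal(size=n)

def ols_slope(x, y):
    """Slope of y on x (with intercept), via least squares."""
    X = np.column_stack([np.ones(len(x)), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

b_early = ols_slope(x[~regime], y[~regime])   # roughly 2.0
b_late = ols_slope(x[regime], y[regime])      # roughly 0.5
b_pooled = ols_slope(x, y)                    # roughly 1.25 — matches neither regime
print(b_early, b_late, b_pooled)
```

A ‘parameter’ estimated on the first half is useless in the second, and the pooled estimate describes no context at all — which is the whole point about treating locally estimated coefficients as fixed, exportable constants.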

Real-world social systems are not governed by stable causal mechanisms or capacities. As Keynes noted when he first launched his attack on econometrics and inferential statistics as early as the 1920s:

The atomic hypothesis which has worked so splendidly in Physics breaks down in Psychics. We are faced at every turn with the problems of Organic Unity, of Discreteness, of Discontinuity – the whole is not equal to the sum of the parts, comparisons of quantity fail us, small changes produce large effects, the assumptions of a uniform and homogeneous continuum are not satisfied. Thus the results of Mathematical Psychics turn out to be derivative, not fundamental, indexes, not measurements, first approximations at the best; and fallible indexes, dubious approximations at that, with much doubt added as to what, if anything, they are indexes or approximations of.

The kinds of laws and relations that econom(etr)ics has established are laws and relations about entities in models that presuppose causal mechanisms to be atomistic and additive. When causal mechanisms operate in real-world social target systems, they do so only in ever-changing and unstable combinations, where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so (as a rule) only because we engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately, that also makes most of the achievements of econometrics – like most of contemporary economic theoretical modeling – rather useless.

Regression models are widely used by social scientists to make causal inferences; such models are now almost a routine way of demonstrating counterfactuals. However, the “demonstrations” generally turn out to depend on a series of untested, even unarticulated, technical assumptions. Under the circumstances, reliance on model outputs may be quite unjustified. Making the ideas of validation somewhat more precise is a serious problem in the philosophy of science. That models should correspond to reality is, after all, a useful but not totally straightforward idea – with some history to it. Developing appropriate models is a serious problem in statistics; testing the connection to the phenomena is even more serious …

In our days, serious arguments have been made from data. Beautiful, delicate theorems have been proved, although the connection with data analysis often remains to be established. And an enormous amount of fiction has been produced, masquerading as rigorous science.

The theoretical conditions that would have to be fulfilled for regression analysis and econometrics really to work are nowhere even closely met in reality. Making outlandish statistical assumptions does not provide a solid ground for doing relevant social science and economics. Although regression analysis and econometrics have become the most used quantitative methods in social sciences and economics today, the fact remains that the inferences made from them are invalid.

Price stickiness is NOT the problem

29 September, 2016 at 17:43 | Posted in Economics | Comments Off on Price stickiness is NOT the problem

‘New Keynesian’ macroeconomists have for years been arguing (e.g. here) about the importance of the New Classical Counter Revolution in economics. ‘Helping’ to change the way macroeconomics is done today — with rational expectations, Euler equations, intertemporal optimization and microfoundations — their main critique of New Classical macroeconomics is that it didn’t incorporate price stickiness into the Real Business Cycle models developed by the New Classicals. So the ‘New Keynesians’ adopted the methodology suggested by the New Classicals and just added price stickiness!

But does putting a sticky-price DSGE lipstick on the RBC pig really help?

It sure doesn’t!

I have elaborated on why not in chapter three of my On the use and misuse of theories and models in mainstream economics, and David Glasner gives some further reasons why a pig with lipstick is still a pig:

In the General Theory, Keynes argued that if you believed in the standard story told by microeconomics about how prices constantly adjust to equate demand and supply and maintain equilibrium, then maybe you should be consistent and follow the Mises/Robbins story and just wait for the price mechanism to perform its magic, rather than support counter-cyclical monetary and fiscal policies. So Keynes then argued that there is actually something wrong with the standard microeconomic story; price adjustments can’t ensure that overall economic equilibrium is restored, because the level of employment depends on aggregate demand, and if aggregate demand is insufficient, wage cutting won’t increase – and, more likely, would reduce — aggregate demand, so that no amount of wage-cutting would succeed in reducing unemployment …

The real problem is not that prices are sticky but that trading takes place at disequilibrium prices and there is no mechanism by which to discover what the equilibrium prices are. Modern macroeconomics solves this problem, in its characteristic fashion, by assuming it away by insisting that expectations are “rational.”

Economists have allowed themselves to make this absurd assumption because they are in the habit of thinking that the simple rule of raising price when there is an excess demand and reducing the price when there is an excess supply inevitably causes convergence to equilibrium. This habitual way of thinking has been inculcated in economists by the intense, and largely beneficial, training they have been subjected to in Marshallian partial-equilibrium analysis, which is built on the assumption that every market can be analyzed in isolation from every other market. But that analytic approach can only be justified under a very restrictive set of assumptions. In particular it is assumed that any single market under consideration is small relative to the whole economy, so that its repercussions on other markets can be ignored, and that every other market is in equilibrium, so that there are no changes from other markets that are impinging on the equilibrium in the market under consideration …

I regard the term “sticky prices” and other similar terms as very unhelpful and misleading; they are a kind of mental crutch that economists are too ready to rely on as a substitute for thinking about what are the actual causes of economic breakdowns, crises, recessions, and depressions. Most of all, they represent an uncritical transfer of partial-equilibrium microeconomic thinking to a problem that requires a system-wide macroeconomic approach. That approach should not ignore microeconomic reasoning, but it has to transcend both partial-equilibrium supply-demand analysis and the mathematics of intertemporal optimisation.

David Glasner

Homecoming (personal)

29 September, 2016 at 13:26 | Posted in Economics | 3 Comments

p1110311

After almost forty years in Lund, yours truly has returned to the town where he was born and bred — Malmö. Living on the top floor of this grandiose building — next to The Magistrate’s Park, and with The Opera and The Municipal Art Gallery just across the street — makes it easy to convince myself that returning was a good decision …

My philosophy of economics

29 September, 2016 at 12:39 | Posted in Economics | Comments Off on My philosophy of economics

A critique yours truly sometimes encounters is that as long as I cannot come up with some own alternative to the failing mainstream theory, I shouldn’t expect people to pay attention.

This is however to totally and utterly misunderstand the role of philosophy and methodology of economics!

As John Locke wrote in An Essay Concerning Human Understanding:

The Commonwealth of Learning is not at this time without Master-Builders, whose mighty Designs, in advancing the Sciences, will leave lasting Monuments to the Admiration of Posterity; But every one must not hope to be a Boyle, or a Sydenham; and in an Age that produces such Masters, as the Great-Huygenius, and the incomparable Mr. Newton, with some other of that Strain; ’tis Ambition enough to be employed as an Under-Labourer in clearing Ground a little, and removing some of the Rubbish, that lies in the way to Knowledge.

That’s what philosophy and methodology can contribute to economics — clear obstacles to science.

Every now and then I also get some upset comments from people wondering why I’m not always ‘respectful’ of people like Eugene Fama, Robert Lucas, Greg Mankiw, and others of the same ilk.

But sometimes it might actually, from a Lockean perspective, be quite appropriate to be disrespectful.

New Classical and ‘New Keynesian’ macroeconomics is rubbish that ‘lies in the way to Knowledge.’

And when New Classical and ‘New Keynesian’ economists resurrect fallacious ideas and theories that were proven wrong as long ago as the 1930s, I think a less respectful and more colourful language is called for.

New Classical macroeconomics — elegant fantasies

28 September, 2016 at 12:46 | Posted in Economics | 1 Comment

The crucial issue of macroeconomic theory today is the same as it was sixty years ago when John Maynard Keynes revolted against what he called the “classical” orthodoxy of his day. It is a shame that there are still “schools” of economic doctrine, but perhaps controversies are inevitable when the issues involve policy, politics, and ideology and elude decisive controlled experiments. As a lifelong Keynesian, I am quite dismayed by the prevalence in my profession today, in a particularly virulent form, of the macroeconomic doctrines against which I as a student enlisted in the Keynesian revolution. Their high priests call themselves New Classicals and refer to their explanation of fluctuations in economic activity as Real Business Cycle Theory. I guess “Real” is intended to mean “not monetary” rather than “not false,” but maybe both.

I am going to discuss the issues of theory, Keynesian versus Classical, both then and now … The doctrinal differences stand out most clearly in opposing diagnoses of the fluctuations in output and employment to which democratic capitalist societies like our own are subject, and in what remedies, if any, are prescribed. Keynesian theory regards recessions as lapses from full-employment equilibrium, massive economy-wide market failures resulting from shortages of aggregate demand for goods and services and for the labor to produce them. Modern “real business cycle theory” interprets fluctuations as a moving equilibrium, individually and socially rational responses to unavoidable exogenous shocks. The Keynesian logic leads its adherents to advocate active fiscal and monetary policies to restore and maintain full employment. From real business cycle models, and other theories in the New Classical spirit, the logical implication is that no policy interventions are necessary or desirable …

Keynesians believe that the economy is sometimes in one regime, sometimes in the other. New Classicals model the economy as always supply-constrained and in supply-equals-demand equilibrium. In their real business cycle models, the shocks that move economic activity up and down are essentially supply shocks, changes in technology and productivity or in the bounty of nature or in the costs and supplies of imported products. Although external forces of those kinds, for example weather, harvests, natural catastrophes, have been the main sources of fluctuating fortunes for most of human history, and although events continually remind us that they still occur, Keynesians do not agree that they are the main source of fluctuations in business activity in modern capitalist societies …

Fancy econometrics is not needed to mobilize evidence against the real business cycle theory view that observed fluctuations in output and employment are movements in priced-cleared equilibrium. Here are a number of regularities of US business cycles which falsify that hypothesis:

1. Unemployment itself. If people are voluntarily choosing not to work at prevailing wages, why do they report themselves as unemployed, rather than as ‘not in the labour force’? Real business cycle theory explains fluctuations of unemployment as intertemporal choice between work and leisure. Workers drop out when the real wages, the opportunity costs of leisure, are temporarily low relative to what they expect later …

2. Unemployment and vacancies. New classicals ask us to believe that the labour market is in equilibrium at 9 per cent unemployment just as truly as it is at 5 per cent. If so, there would be no reason to expect the balance between unemployment and job vacancies to differ. Both unemployment and vacancies would be higher in recession. However, a strong negative association between unemployment and vacancies — as would be expected in Keynesian theory — is obvious in the U.S. and other market capitalist economies.

3. Quits and layoffs. If recessions and prosperities are both supply-equals-demand equilibria, there is no reason to expect the relative frequencies of voluntary quits of jobs and involuntary ‘separations’ from jobs to vary over the business cycle. But of course there are regularly many more layoffs, relative to quits, when unemployment is high and vacancies are scarce. There are many more ‘job losers’ relative to ‘job leavers’ in recessions.

4. Excess capacity. Utilization of plant and equipment varies cyclically, parallel to utilization of labour. Presumably machines do not choose leisure voluntarily.

5. Unfilled orders and delays. These move pro-cyclically, again suggesting strongly that demand is much higher relative to supply in prosperities than in recessions.

6. Monetary effects on output. According to the ‘classical dichotomy,’ monetary events and policies should affect only nominal prices. Real outcomes should be independent of them. The evidence that this is not true is overwhelming.

The list could go on. Why do so many talented economic theorists believe and teach elegant fantasies so obviously refutable by plainly evident facts?

James Tobin

The Nobel factor — the prize in economics that spearheaded the neoliberal revolution

27 September, 2016 at 19:34 | Posted in Economics | 3 Comments

The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel, usually — incorrectly — referred to as the Nobel Prize in Economics, is an award for outstanding contributions to the field of economics. The Prize in Economics was established and endowed by Sweden’s central bank Sveriges Riksbank in 1968 on the occasion of the bank’s 300th anniversary. The first award was given in 1969. The award this year is presented in Stockholm at a ceremony on Monday 10 October.

Avner Offer’s and Gabriel Söderberg’s new book — The Nobel factor: the prize in economics, social democracy, and the market turn (Princeton University Press 2016) — tells the story of how the prize emerged from a conflict between the Swedish central bank — Sveriges Riksbank — and social democracy. It is no mere coincidence that the ascendancy of market liberalism under Reagan and Thatcher to a large part coincides with the creation and establishment of the prize. Especially during the despotic Assar Lindbeck’s long chairmanship — 1980-1994 — the prize was used to take advantage of the connection with the true Nobel prizes and to spearhead a market-oriented neoliberal reshaping of the world. Although not all economists who have received the prize have enlisted in the market-liberal crusade, it is still an undeniable fact that neoliberal and conservative leaning male economists are highly over-represented among the laureates. Their often ideologically biased doctrines have to a large extent motivated the neoliberal turn in economic policies for more than forty years.

Out of the 76 laureates who have been awarded “The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel,” 28 have been affiliated with the University of Chicago — that is 37%. Of all laureates, 80% have been from the US (by birth or by naturalisation). Only 7% of the laureates have come from outside North America or Western Europe. Only one woman has received the prize. The world is really a small place when it comes to economics …

Looking at whom the prize is given to says quite a lot about what kind of prize this is. Offer and Söderberg do that, but looking at whom the prize is not given to says perhaps even more.

The great Romanian-American mathematical statistician and economist Nicholas Georgescu-Roegen (1906-1994) argued in his epochal The Entropy Law and the Economic Process (1971) that the economy was actually a giant thermodynamic system in which entropy increases inexorably and our material basis disappears. If we choose to continue to produce with the techniques we have developed, then our society and earth will disappear faster than if we introduce small-scale production, resource-saving technologies and limited consumption.

Following Georgescu-Roegen, ecological economists have argued that industrial society inevitably leads to increased environmental pollution, energy crisis and an unsustainable growth.

Georgescu-Roegen and ecological economics have turned against the neoclassical theory’s obsession with purely monetary factors. The monetary reductionism easily makes you ignore other factors having a bearing on human interaction with the environment.

I wonder if this isn’t the crux of the matter. To assert such a thing is to swear in the church of the neoclassical establishment, and it nullifies any chance of getting the prestigious prize.

Twenty years ago, after a radio debate with one of the members of the prize committee — Ingmar Ståhl — I asked why Georgescu-Roegen hadn’t got the prize. The answer was – mirabile dictu – that he “never founded a school.” I was surprised, to say the least, and wondered if he possibly had heard of the environmental movement. Well, he had — but it was “the wrong kind of school”! Can it be stated much clearer than this what it’s all about? If you haven’t worked within the mainstream neoclassical paradigm — then you are more or less excluded a priori from being eligible for The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel!

Three years ago — making an extraordinarily successful forecast — I told Swedish media the prize committee would show how in tune with the times it was and award the prize to Eugene Fama. Why? Well — I argued — he’s a Chicago economist and a champion of rational expectations and efficient markets. And nowadays freshwater economists seem to be virtually the only ones eligible for the prize. And, of course, an economist who has described the notion that finance theory was at fault as “a fantasy” and argued that “financial markets and financial institutions were casualties rather than causes of the recession” had to appeal to a prize committee with a history of awarding theories and economists totally lacking any real world relevance.

Well, my forecast turned out to be right — the Swedish Academy of Sciences awarded The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel for 2013 to Eugene Fama. The prize committee really did show how in tune with the times it was …

I love to be right of course, but otherwise this is only saddening and shows what a joke this prize is, when someone like Fama can get it.

The ‘Nobel prize’ in economics is — and has always been — a total disaster from both a scientific and social point of view, and after having read Offer’s and Söderberg’s book, there is, at least to me, only one conclusion to draw, and that is: Drop it!

Good advice to aspiring economists

26 September, 2016 at 16:42 | Posted in Economics | 1 Comment

Submission to observed or experimental data is the golden rule which dominates any scientific discipline. Any theory whatever, if it is not verified by empirical evidence, has no scientific value and should be rejected.

Maurice Allais

Formalistic deductive “Glasperlenspiel” can be very impressive and seductive. But in the realm of science it ought to be considered of little or no value to simply make claims about the model and lose sight of reality.

Mainstream — neoclassical — economics has long since given up on the real world and contents itself with proving things about made-up worlds. Empirical evidence plays only a minor role in economic theory, where models largely function as a substitute for empirical evidence. Hopefully, humbled by the manifest failure of its theoretical pretences, the one-sided, almost religious, insistence on axiomatic-deductivist modeling as the only scientific activity worthy of pursuing in economics will give way to methodological pluralism based on ontological considerations rather than formalistic tractability.

To have valid evidence is not enough. What economics needs is sound evidence. Why? Simply because the premises of a valid argument do not have to be true, whereas a sound argument is not only valid but also builds on premises that are true. Aiming only for validity, without soundness, sets the aspiration level of economics too low for developing a realist and relevant science.

DSGE macroeconomics — overconfident story-telling

25 September, 2016 at 12:53 | Posted in Economics | 6 Comments

We economists trudge relentlessly toward Asymptopia, where data are unlimited and estimates are consistent, where the laws of large numbers apply perfectly and where the full intricacies of the economy are completely revealed … Worst of all, when we feel pumped up with our progress, a tectonic shift can occur, like the Panic of 2008, making it seem as though our long journey has left us disappointingly close to the State of Complete Ignorance whence we began …

We may listen, but we don’t hear, when the Priests warn that the new direction is only for those with Faith, those with complete belief in the Assumptions of the Path. It often takes years down the Path, but sooner or later, someone articulates the concerns that gnaw away in each of us and asks if the Assumptions are valid …

It would be much healthier for all of us if we could accept our fate, recognize that perfect knowledge will be forever beyond our reach and find happiness with what we have …

Can we economists agree that it is extremely hard work to squeeze truths from our data sets and what we genuinely understand will remain uncomfortably limited? We need words in our methodological vocabulary to express the limits … Those who think otherwise should be required to wear a scarlet-letter O around their necks, for “overconfidence.”

Econometric theory promises more than it can deliver, because it requires a complete commitment to assumptions that are actually only half-heartedly maintained …

Our understanding of causal effects in macroeconomics is virtually nil, and will remain so. Don’t we know that? … The economists who coined the DSGE acronym combined in three terms the things economists least understand: “dynamic,” standing for forward-looking decision making; “stochastic,” standing for decisions under uncertainty and ambiguity; and “general equilibrium,” standing for the social process that coordinates and influences the actions of all the players. I have tried to make this point in the title of my recent book Macroeconomic Patterns and Stories. That’s what we do. We seek patterns and tell stories.

Ed Leamer

Solow on post-real Chicago economics

25 September, 2016 at 11:02 | Posted in Economics | Comments Off on Solow on post-real Chicago economics

As yours truly wrote last week, there has been much discussion going on in the economics academia on Paul Romer’s recent critique of ‘modern’ macroeconomics.

But the rhetorical swindle that New Classical and ‘New Keynesian’ macroeconomics have tried to impose upon us with their microfounded calibrations and DSGE models did not go unnoticed before Paul Romer came along:

I think that Professors Lucas and Sargent really seem to be serious in what they say, and in turn they have a proposal for constructive research that I find hard to talk about sympathetically. They call it equilibrium business cycle theory, and they say very firmly that it is based on two terribly important postulates — optimizing behavior and perpetual market clearing. When you read closely, they seem to regard the postulate of optimizing behavior as self-evident and the postulate of market-clearing behavior as essentially meaningless. I think they are too optimistic, since the one that they think is self-evident I regard as meaningless and the one that they think is meaningless, I regard as false. The assumption that everyone optimizes implies only weak and uninteresting consistency conditions on their behavior. Anything useful has to come from knowing what they optimize, and what constraints they perceive. Lucas and Sargent’s casual assumptions have no special claim to attention …

It is plain as the nose on my face that the labor market and many markets for produced goods do not clear in any meaningful sense. Professors Lucas and Sargent say after all there is no evidence that labor markets do not clear, just the unemployment survey. That seems to me to be evidence. Suppose an unemployed worker says to you “Yes, I would be glad to take a job like the one I have already proved I can do because I had it six months ago or three or four months ago. And I will be glad to work at exactly the same wage that is being paid to those exactly like myself who used to be working at that job and happen to be lucky enough still to be working at it.” Then I’m inclined to label that a case of excess supply of labor and I’m not inclined to make up an elaborate story of search or misinformation or anything of the sort. By the way I find the misinformation story another gross implausibility. I would like to see direct evidence that the unemployed are more misinformed than the employed, as I presume would have to be the case if everybody is on his or her supply curve of employment … Now you could ask, why do not prices and wages erode and crumble under those circumstances? Why doesn’t the unemployed worker who told me “Yes, I would like to work, at the going wage, at the old job that my brother-in-law or my brother-in-law’s brother-in-law is still holding”, why doesn’t that person offer to work at that job for less? Indeed why doesn’t the employer try to encourage wage reduction? That doesn’t happen either … Those are questions that I think an adult person might spend a lifetime studying. They are important and serious questions, but the notion that the excess supply is not there strikes me as utterly implausible.

Robert Solow

The eminently quotable Solow — as always — says it all.

The purported strength of New Classical and ‘New Keynesian’ macroeconomics is that they have firm anchorage in preference-based microeconomics, and especially in the decisions taken by intertemporal utility-maximizing ‘forward-looking’ individuals.

To some of us, however, this has come at too high a price. The almost quasi-religious insistence that macroeconomics has to have microfoundations — without ever presenting either ontological or epistemological justifications for this claim — has turned a blind eye to the weakness of the whole enterprise of trying to depict a complex economy on the basis of an all-embracing representative actor equipped with superhuman knowledge, forecasting abilities and forward-looking rational expectations.

That anyone should take that kind of ludicrous stuff seriously is totally and unbelievably ridiculous. Or as Solow has it:

Suppose someone sits down where you are sitting right now and announces to me that he is Napoleon Bonaparte. The last thing I want to do with him is to get involved in a technical discussion of cavalry tactics at the battle of Austerlitz. If I do that, I’m getting tacitly drawn into the game that he is Napoleon. Now, Bob Lucas and Tom Sargent like nothing better than to get drawn into technical discussions, because then you have tacitly gone along with their fundamental assumptions; your attention is attracted away from the basic weakness of the whole story. Since I find that fundamental framework ludicrous, I respond by treating it as ludicrous – that is, by laughing at it – so as not to fall into the trap of taking it seriously and passing on to matters of technique.

Robert Solow

Rethinking macroeconomic theory

24 September, 2016 at 19:07 | Posted in Economics | Comments Off on Rethinking macroeconomic theory

Several mainstream economists still believe that ‘any interesting model must be a dynamic stochastic general equilibrium model. From this perspective, there is no other game in town. If you have an interesting and coherent story to tell, you can tell it in a DSGE model. If you cannot, your story is incoherent’ (V. V. Chari). Similarly, not very long ago, Blanchard (2014, p. 31) was affirming that the solution to previous mistakes was that ‘DSGE models should be expanded to better recognize the role of the financial system.’ It is hard to see how conceiving of the financial system as frictions to the real economy can help in understanding macroeconomics.

Simon Wren-Lewis (2016, p. 33) is adamant that the methodology proposed by what he calls the New Classical Counter Revolution, enclosed into DSGE and New Keynesian models, is a worthy one, and that as a consequence ‘the microfoundations methodology is entrenched … so it is unlikely that its practitioners will down tools and start afresh’. Wren-Lewis further claims that ‘the methodology is progressive’, and like Blanchard he believes that ‘researchers are devoting a good deal of time to examining real/financial interactions’ (2016, p. 30), so that all is well as long as Keynesian results are not excluded by assumption and can be recovered from the simulations.

Marc Lavoie

Post-real macroeconomics — science as fraud

24 September, 2016 at 14:00 | Posted in Economics | Comments Off on Post-real macroeconomics — science as fraud

There are many kinds of useless economics held in high regard within the mainstream economics establishment today. Few — as Paul Romer has recently been arguing — are less deserving of that regard than the post-real macroeconomic theory mostly connected with Nobel laureates Finn Kydland, Robert Lucas, Edward Prescott and Thomas Sargent, and its signature method: calibration.

In an interview with Seppo Honkapohja and George Evans (Macroeconomic Dynamics 2005, vol. 9), Thomas Sargent says:

Calibration is less optimistic about what your theory can accomplish because you would only use it if you didn’t fully trust your entire model, meaning that you think your model is partly misspecified or incompletely specified, or if you trusted someone else’s model and data set more than your own. My recollection is that Bob Lucas and Ed Prescott were initially very enthusiastic about rational expectations econometrics. After all, it simply involved imposing on ourselves the same high standards we had criticized the Keynesians for failing to live up to. But after about five years of doing likelihood ratio tests on rational expectations models, I recall Bob Lucas and Ed Prescott both telling me that those tests were rejecting too many good models. The idea of calibration is to ignore some of the probabilistic implications of your model but to retain others. Somehow, calibration was intended as a balanced response to professing that your model, although not correct, is still worthy as a vehicle for quantitative policy analysis….

It is — sad to say — a fact that within mainstream economics internal validity is everything and external validity and truth nothing. Why anyone should be interested in those kinds of theories and models — as long as mainstream economists do not come up with any export licenses for them to the real world in which we live — is beyond comprehension. Stupid models are of little or no help in understanding the real world.

Chicago economics cultivates the view that scientific theories have nothing to do with truth. Constructing theories and building models is not even considered an activity with the intent of approximating truth. For Chicago economists like Lucas and Sargent it is only an endeavour to organize their thoughts in a ‘useful’ manner.

What a handy view of science!

What Sargent and other defenders of scientific storytelling ‘forget’ is that potential explanatory power achieved in thought-experimental models is not enough for attaining real explanations. Model explanations are at best conjectures, and whether they do or do not explain things in the real world is something we have to test. To just believe that you understand or explain things better with thought experiments is not enough. Without a warranted export certificate to the real world, model explanations are pretty worthless. Proving things in models is not enough. Truth is an important concept in real science.

Paul Romer on macroeconomics as a religious dogma

23 September, 2016 at 19:11 | Posted in Economics | Comments Off on Paul Romer on macroeconomics as a religious dogma

 

The dividing line between bad and good macroeconomics

23 September, 2016 at 18:44 | Posted in Economics | 2 Comments

If I am right that in recent decades the equilibrium in post-real macro has discouraged good science … there is some risk that a rear-guard of post-real macroeconomists will continue to defend their notion of methodological purity. At this point it is hard to know whether this group will fracture or dig in for a fight to death. If they dig in, I suspect that it will be in a few departments and that the variation between departments will be larger. Watch to see how this plays out and choose where you go with this in mind.

To learn about a department, visit and ask macroeconomists you meet “honestly, what do you think was the cause of the recessions of 1980 and 1982.” If they say anything other than “Paul Volcker caused them to bring inflation down,” treat this as at least a yellow caution flag. Then ask about the cause of the Great Depression. If they start telling you about the anti-market policies of the New Deal, listen politely and scratch this department off your list.

Paul Romer

James Tobin on post-real macroeconomics

23 September, 2016 at 17:36 | Posted in Economics | Comments Off on James Tobin on post-real macroeconomics

James Tobin explained why real business cycle theory and microfounded DSGE models are such a total waste of time. Thirty years before Paul Romer. Maybe one should start teaching some history of economic thought at economics departments again? Just a thought …

They try to explain business cycles solely as problems of information, such as asymmetries and imperfections in the information agents have. Those assumptions are just as arbitrary as the institutional rigidities and inertia they find objectionable in other theories of business fluctuations … I try to point out how incapable the new equilibrium business cycles models are of explaining the most obvious observed facts of cyclical fluctuations … I don’t think that models so far from realistic description should be taken seriously as a guide to policy … I don’t think that there is a way to write down any model which on the one hand respects the possible diversity of agents in taste, circumstances, and so on, and on the other hand also grounds behavior rigorously in utility maximization and which has any substantive content to it.

Arjo Klamer, The New Classical Macroeconomics: Conversations with the New Classical Economists and their Opponents, Wheatsheaf Books, 1984

Current macro debate

23 September, 2016 at 08:39 | Posted in Economics | 2 Comments

– The models are rubbish.
– Don’t be silly. There’s a paper from the 1980s on learning effects.

Jo Michell

‘Modern’ macroeconomics — a costly waste of time

22 September, 2016 at 16:58 | Posted in Economics | 2 Comments

Commenting on the state of standard modern macroeconomics, Willem Buiter argues that neither New Classical nor New Keynesian microfounded DSGE macro models have helped us foresee, understand or craft solutions to the problems of today’s economies:

The Monetary Policy Committee of the Bank of England I was privileged to be a ‘founder’ external member of during the years 1997-2000 contained, like its successor vintages of external and executive members, quite a strong representation of academic economists and other professional economists with serious technical training and backgrounds. This turned out to be a severe handicap when the central bank had to switch gears and change from being an inflation-targeting central bank under conditions of orderly financial markets to a financial stability-oriented central bank under conditions of widespread market illiquidity and funding illiquidity. Indeed, the typical graduate macroeconomics and monetary economics training received at Anglo-American universities during the past 30 years or so, may have set back by decades serious investigations of aggregate economic behaviour and economic policy-relevant understanding. It was a privately and socially costly waste of time and other resources.

Most mainstream macroeconomic theoretical innovations since the 1970s … have turned out to be self-referential, inward-looking distractions at best. Research tended to be motivated by the internal logic, intellectual sunk capital and aesthetic puzzles of established research programmes rather than by a powerful desire to understand how the economy works …

Both the New Classical and New Keynesian complete markets macroeconomic theories not only did not allow questions about insolvency and illiquidity to be answered. They did not allow such questions to be asked …

Charles Goodhart, who was fortunate enough not to encounter complete markets macroeconomics and monetary economics during his impressionable, formative years, but only after he had acquired some intellectual immunity, once said of the Dynamic Stochastic General Equilibrium approach which for a while was the staple of central banks’ internal modelling: “It excludes everything I am interested in”. He was right. It excludes everything relevant to the pursuit of financial stability.

The Bank of England in 2007 faced the onset of the credit crunch with too much Robert Lucas, Michael Woodford and Robert Merton in its intellectual cupboard. A drastic but chaotic re-education took place and is continuing.

I believe that the Bank has by now shed the conventional wisdom of the typical macroeconomics training of the past few decades. In its place is an intellectual potpourri of factoids, partial theories, empirical regularities without firm theoretical foundations, hunches, intuitions and half-developed insights. It is not much, but knowing that you know nothing is the beginning of wisdom.

Buiter’s article is certainly a very worrying confirmation of what Paul Romer wrote last week. Modern macroeconomics is becoming more and more a total waste of time.

But why are all these macro guys wasting their time and efforts on these models? Besides simply having the usual aspirations of being published, I think maybe Frank Hahn gave the truest answer back in 2005 when, interviewed on the occasion of his 80th birthday, he confessed that some economic assumptions didn’t really say anything about “what happens in the world,” but still had to be considered very good “because it allows us to get on with this job.”

Hahn’s suggestion reminds me of an episode twenty years ago, when Phil Mirowski was invited to give a speech on themes from his book More Heat than Light at my economics department in Lund, Sweden. All the mainstream neoclassical professors were there. Their theories were totally mangled and no one — absolutely no one — had anything to say even remotely reminiscent of a defence. Nonplussed, one of them finally asked in utter desperation: “But what shall we do then?”

Yes indeed — what shall they do when their emperor has turned out to be naked?

Stiglitz and the demise of marginal productivity theory

22 September, 2016 at 15:36 | Posted in Economics | 7 Comments

Today the trend to greater equality of incomes which characterised the postwar period has been reversed. Inequality is now rising rapidly. Contrary to the rising-tide hypothesis, the rising tide has only lifted the large yachts, and many of the smaller boats have been left dashed on the rocks. This is partly because the extraordinary growth in top incomes has coincided with an economic slowdown.

The trickle-down notion — along with its theoretical justification, marginal productivity theory — needs urgent rethinking. That theory attempts both to explain inequality — why it occurs — and to justify it — why it would be beneficial for the economy as a whole. This essay looks critically at both claims. It argues in favour of alternative explanations of inequality, with particular reference to the theory of rent-seeking and to the influence of institutional and political factors, which have shaped labour markets and patterns of remuneration. And it shows that, far from being either necessary or good for economic growth, excessive inequality tends to lead to weaker economic performance. In light of this, it argues for a range of policies that would increase both equity and economic well-being.

Joseph Stiglitz

Mainstream economics textbooks usually refer to the interrelationship between technological development and education as the main causal force behind increased inequality. If the educational system (supply) develops at the same pace as technology (demand), there should be no increase, ceteris paribus, in the ratio between high-income (highly educated) groups and low-income (low education) groups. In the race between technology and education, the proliferation of skill-biased technological change has, however, allegedly increased the premium for the highly educated group.

Another prominent explanation is that globalization — in accordance with Ricardo’s theory of comparative advantage and the Wicksell-Heckscher-Ohlin-Stolper-Samuelson factor price theory — has benefited capital in the advanced countries and labour in the developing countries. The problem with these theories is that they explicitly assume full employment and international immobility of the factors of production. Globalization means more than anything else that capital and labour have to a large extent become mobile over country borders. These mainstream trade theories are really not applicable in the world of today, and they are certainly not able to explain the international trade pattern that has developed in recent decades. Although it seems as though capital in the developed countries has benefited from globalization, it is difficult to detect a similar positive effect on workers in the developing countries.

There are, however, also some other quite obvious problems with these kinds of inequality explanations. The World Top Incomes Database shows that the increase in incomes has been concentrated especially in the top 1%. If education were the main reason behind the increasing income gap, one would expect a much broader group of people in the upper echelons of the distribution taking part in this increase. It is dubious, to say the least, to try to explain, for example, the high wages in the finance sector with a marginal productivity argument. High-end wages seem to be more a result of pure luck or membership of the same ‘club’ as those who decide on the wages and bonuses, than of ‘marginal productivity.’

Mainstream economics, with its technologically determined marginal productivity theory, seems to be difficult to reconcile with reality. Although card-carrying neoclassical apologists like Greg Mankiw want to recall John Bates Clark’s (1899) argument that marginal productivity results in an ethically just distribution, that is not something — even if it were true — we could confirm empirically, since it is impossible realiter to separate out the marginal contribution of any factor of production. The hypothetical ceteris paribus addition of only one factor in a production process is often heard of in textbooks, but never seen in reality.
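The point can be made with a toy simulation (my own hypothetical illustration, not from Stiglitz or any textbook): even when the true production process is one of strict complementarity, where no factor has a separable marginal contribution at all, a Cobb-Douglas regression will still happily deliver tidy-looking ‘output elasticities’:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
L = rng.uniform(1, 10, n)          # labour input
K = rng.uniform(1, 10, n)          # 'capital' (units left conveniently vague)

# True data-generating process: perfect complements plus noise.
# No factor has a separable marginal product here.
Q = np.minimum(L, K) * np.exp(rng.normal(0, 0.1, n))

# Fit log Q = c + a*log L + b*log K by OLS anyway.
X = np.column_stack([np.ones(n), np.log(L), np.log(K)])
coef, *_ = np.linalg.lstsq(X, np.log(Q), rcond=None)
c, a, b = coef
print(f"estimated 'output elasticities': labour={a:.2f}, capital={b:.2f}")
```

The regression reports positive ‘marginal products’ for both factors, although by construction neither factor has one. Fitting the functional form does nothing to validate it.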

When reading mainstream economists like Mankiw who argue for the ‘just deserts’ of the 0.1 %, one gets a strong feeling that they are ultimately trying to argue that a market economy is some kind of moral free zone where, if left undisturbed, people get what they ‘deserve.’ To most social scientists that probably smacks more of an evasive manoeuvre to explain away a very disturbing structural ‘regime shift’ that has taken place in our societies — a shift that has very little to do with ‘stochastic returns to education.’ Those returns were in place 30 or 40 years ago too. At that time they meant that a top corporate manager perhaps earned 10–20 times more than ‘ordinary’ people. Today it means that they earn 100–200 times more. A question of education? Hardly. It is probably more a question of greed and a lost sense of a common project of building a sustainable society.

Since the race between technology and education does not seem to explain the new growing income gap — and even if technological change has become more and more capital-augmenting, it is also quite clear that not only have the wages of low-skilled workers fallen, but so has the overall wage share — mainstream economists increasingly refer to ‘meritocratic extremism,’ ‘winner-take-all markets’ and ‘superstar theories’ for explanation. But these explanations are also highly questionable.

Fans may want to pay extra to watch top-ranked athletes or movie stars performing on television and film, but corporate managers are hardly the stuff that people’s dreams are made of – and they seldom appear on television and in the movie theaters.

Everyone may prefer to employ the best corporate manager there is, but a corporate manager, unlike a movie star, can only provide his services to a limited number of customers. From the perspective of ‘superstar theories,’ a good corporate manager should earn only marginally more than an average corporate manager. Yet the average earnings of the corporate managers of the 50 biggest Swedish companies today are equivalent to the wages of 46 blue-collar workers.

It is difficult to see the takeoff of the top executives as anything other than a reward for membership of the same illustrious club. That these earnings should correspond to indispensable and fair productive contributions — marginal products — strains credulity too far. That so many corporate managers and top executives make fantastic earnings today is strong evidence that the theory is patently wrong and basically functions as a device for legitimizing indefensible and growing inequalities.

No one ought to doubt that the idea that capitalism is an expression of impartial market forces of supply and demand bears but little resemblance to actual reality. Wealth and income distribution, both individual and functional, in a market society is to an overwhelmingly high degree influenced by institutionalized political and economic norms and power relations — things that have relatively little to do with marginal productivity in complete and profit-maximizing competitive market models. Not to mention how extremely difficult, if not outright impossible, it is to empirically disentangle and measure different individuals’ contributions in the typical teamwork production that characterizes modern societies — or, especially when it comes to ‘capital,’ what it is supposed to mean and how to measure it. Remunerations do not necessarily correspond to any marginal product of different factors of production — or to ‘compensating differentials’ due to non-monetary characteristics of different jobs, natural ability, effort or chance.

Put simply – highly paid workers and corporate managers are not always highly productive workers and corporate managers, and less highly paid workers and corporate managers are not always less productive. History has over and over again disconfirmed the close connection between productivity and remuneration postulated in mainstream income distribution theory.

Neoclassical marginal productivity theory is a collapsed theory from both a historical and a theoretical point of view, as shown already by Sraffa in the 1920s, and in the Cambridge capital controversy in the 1960s and 1970s. As Joan Robinson wrote in 1953:

The production function has been a powerful instrument of miseducation. The student of economic theory is taught to write Q = f (L, K) where L is a quantity of labor, K a quantity of capital and Q a rate of output of commodities. He is instructed to assume all workers alike, and to measure L in man-hours of labor; he is told something about the index-number problem in choosing a unit of output; and then he is hurried on to the next question, in the hope that he will forget to ask in what units K is measured. Before he ever does ask, he has become a professor, and so sloppy habits of thought are handed on from one generation to the next.

It’s great that Stiglitz has joined those of us who for decades have criticised marginal productivity theory. Institutional, political and social factors have an overwhelming influence on wages and the relative shares of labour and capital.

When a theory is impossible to reconcile with facts there is only one thing to do — scrap it!

Romer follows up his critique

21 September, 2016 at 22:14 | Posted in Economics | 2 Comments

The one reaction that puzzles me goes something like this: “Romer’s critique of RBC models is dated; we’ve known all along that those models make no sense.”

If we know that the RBC model makes no sense, why was it left as the core of the DSGE model? Those phlogiston shocks are still there. Now they are mixed together with a bunch of other made-up shocks.

Moreover, I see no reason to be confident about what we will learn if some econometrician adds sticky prices and then runs a horse race to see if the shocks are more or less important than the sticky prices. The essence of the identification problem is that the data do not tell you who wins this kind of race. The econometrician picks the winner.

Paul Romer
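Romer’s identification point can be made concrete with a deliberately simple sketch (mine, not his): if the only ‘data’ a calibrator matches is, say, the variance of output, then a strongly persistent model with small shocks and a weakly persistent model with large shocks ‘fit’ equally well. The data cannot pick the winner; the econometrician does.

```python
import numpy as np

target_var = 1.0                      # the single data moment we calibrate to
results = {}

for rho in (0.5, 0.95):               # two very different 'theories'
    # For y_t = rho*y_{t-1} + eps_t, Var(y) = sigma^2 / (1 - rho^2),
    # so each theory can back out a shock size that hits the target exactly.
    sigma = np.sqrt(target_var * (1 - rho**2))
    rng = np.random.default_rng(1)
    eps = rng.normal(0.0, sigma, 200_000)
    y = np.zeros_like(eps)
    for t in range(1, eps.size):
        y[t] = rho * y[t - 1] + eps[t]
    results[rho] = y.var()
    print(f"rho={rho:.2f}, sigma={sigma:.3f} -> simulated Var(y)={y.var():.3f}")
```

Both parametrizations reproduce the target moment to within sampling error, even though they tell completely different stories about persistence and shocks.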

Those of us in the economics community who have been impolite enough to dare question the preferred methods and models applied in macroeconomics are as a rule met with disapproval. Although people seem to get very agitated and upset by the critique, defenders of ‘received theory’ always say that the critique is ‘nothing new,’ that they have always been ‘well aware’ of the problems, and so on.

So, for the benefit of all macroeconomists who, like Simon Wren-Lewis, don’t want to be disturbed in their doings — eminent mathematical statistician David Freedman has put together a very practical list of vacuous responses to criticism that can be freely used to save your peace of mind:

We know all that. Nothing is perfect … The assumptions are reasonable. The assumptions don’t matter. The assumptions are conservative. You can’t prove the assumptions are wrong. The biases will cancel. We can model the biases. We’re only doing what everybody else does. Now we use more sophisticated techniques. If we don’t do it, someone else will. What would you do? The decision-maker has to be better off with us than without us … The models aren’t totally useless. You have to do the best you can with the data. You have to make assumptions in order to make progress. You have to give the models the benefit of the doubt. Where’s the harm?

Wren-Lewis trivializing Romer’s critique

20 September, 2016 at 22:11 | Posted in Economics | 1 Comment

As yours truly wrote last week, there has been much discussion in economics academia of Paul Romer’s recent critique of ‘modern’ macroeconomics.

Now Oxford professor Simon Wren-Lewis has a blog post up arguing that Romer’s critique is

unfair and wide of the mark in places … Paul’s discussion of real effects from monetary policy, and the insistence on productivity shocks as business cycle drivers, is pretty dated … Yet it took a long time for RBC models to be replaced by New Keynesian models, and you will still see RBC models around. Elements of the New Classical counter revolution of the 1980s still persist in some places … The impression Paul Romer’s article gives, might just have been true in a few years in the 1980s before New Keynesian theory arrived. Since the 1990s New Keynesian theory is now the orthodoxy, and is used by central banks around the world.

Now this rather unsuccessful attempt to disarm the real force of Romer’s critique should come as no surprise to anyone who has been following Wren-Lewis’s writings over the years.

In a recent paper — Unravelling the New Classical Counter Revolution — Wren-Lewis writes approvingly about all the ‘impressive’ theoretical insights New Classical economics has brought to macroeconomics:

The theoretical insights that New Classical economists brought to the table were impressive: besides rational expectations, there was a rationalisation of permanent income and the life-cycle models using intertemporal optimisation, time inconsistency and more …

A new revolution, that replaces current methods with older ways of doing macroeconomics, seems unlikely and I would argue is also undesirable. The discipline does not need to advance one revolution at a time …

To understand modern academic macroeconomics, it is no longer essential that you start with The General Theory. It is far more important that you read Lucas and Sargent (1979), which is a central text in what is generally known as the New Classical Counter Revolution (NCCR). That gave birth to DSGE models and the microfoundations programme, which are central to mainstream macroeconomics today …

There’s something that just does not sit very well with this picture of modern macroeconomics.

‘Read Lucas and Sargent (1979)’. Yes, why not. That is exactly what Romer did!

One who has also read it is Wren-Lewis’s ‘New Keynesian’ buddy Paul Krugman. And this is what he has to say on that reading experience:

Lucas and his school … went even further down the equilibrium rabbit hole, notably with real business cycle theory. And here is where the kind of willful obscurantism Romer is after became the norm. I wrote last year about the remarkable failure of RBC theorists ever to offer an intuitive explanation of how their models work, which I at least hinted was willful:

“But the RBC theorists never seem to go there; it’s right into calibration and statistical moments, with never a break for intuition. And because they never do the simple version, they don’t realize (or at any rate don’t admit to themselves) how fundamentally silly the whole thing sounds, how much it’s at odds with lived experience.”

Paul Krugman

And so has Truman F. Bewley:

Lucas and Rapping (1969) claim that cyclical increases in unemployment occur when workers quit their jobs because wages or salaries fall below expectations …

According to this explanation, when wages are unusually low, people become unemployed in order to enjoy free time, substituting leisure for income at a time when they lose the least income …

According to the theory, quits into unemployment increase during recessions, whereas historically quits decrease sharply and roughly half of unemployed workers become jobless because they are laid off … During the recession I studied, people were even afraid to change jobs because new ones might prove unstable and lead to unemployment …

If wages and salaries hardly ever fall, the intertemporal substitution theory is widely applicable only if the unemployed prefer jobless leisure to continued employment at their old pay. However, the attitude and circumstances of the unemployed are not consistent with their having made this choice …

In real business cycle theory, unemployment is interpreted as leisure optimally selected by workers, as in the Lucas-Rapping model. It has proved difficult to construct business cycle models consistent with this assumption and with real wage fluctuations as small as they are in reality, relative to fluctuations in employment.

This is, of course, only what you would expect of New Classical Chicago economists.

So, what’s the problem?

The problem is that sadly enough this extraterrestrial view of unemployment is actually shared by Wren-Lewis and other so-called ‘New Keynesians’ — a school whose microfounded dynamic stochastic general equilibrium models cannot even incorporate such a basic fact of reality as involuntary unemployment!

Of course, working with microfounded representative agent models, this should come as no surprise. If one representative agent is employed, all representative agents are. The kind of unemployment that occurs is voluntary, since it is only adjustments of the hours of work that these optimizing agents make to maximize their utility.

In the basic DSGE models used by most ‘New Keynesians’, the labour market is always cleared — responding to a changing interest rate, expected lifetime incomes, or real wages, the representative agent maximizes the utility function by varying her labour supply, money holdings and consumption over time. Most importantly, if the real wage somehow deviates from its ‘equilibrium value,’ the representative agent adjusts her labour supply: when the real wage is higher than its ‘equilibrium value,’ labour supply is increased, and when it is below that value, labour supply is decreased.

In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence unemployment is totally voluntary: to be unemployed is something one optimally chooses to be.
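A minimal sketch shows how this ‘voluntary unemployment’ arithmetic works. This is my own stylized example, not taken from any particular DSGE paper, and the parameters chi and phi are hypothetical: with period utility u = c − chi·n^(1+phi)/(1+phi) and budget constraint c = w·n, the first-order condition gives optimal hours n* = (w/chi)^(1/phi), so a fall in the real wage is ‘explained’ as an optimal cut in hours — chosen leisure, never involuntary unemployment.

```python
# Hypothetical disutility-of-work parameters (not from any quoted source).
chi, phi = 1.0, 2.0

def optimal_hours(w):
    # First-order condition: w = chi * n**phi  =>  n* = (w/chi)**(1/phi)
    return (w / chi) ** (1 / phi)

for w in (1.0, 0.8, 0.5):
    print(f"real wage {w:.1f} -> chosen hours {optimal_hours(w):.3f}")
```

Every wage maps to a freely ‘chosen’ level of hours, which is exactly why such a model cannot represent a worker who wants to work at the going wage but finds no job.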

To Wren-Lewis it seems as though the ‘New Keynesian’ acceptance of rational expectations, representative agents and microfounded DSGE models is something more or less self-evidently good. Not all economists (yours truly included) share that view:

While one can understand that some of the elements in DSGE models seem to appeal to Keynesians at first sight, after closer examination, these models are in fundamental contradiction to Post-Keynesian and even traditional Keynesian thinking. The DSGE model is a model in which output is determined in the labour market as in New Classical models and in which aggregate demand plays only a very secondary role, even in the short run.

In addition, given the fundamental philosophical problems presented for the use of DSGE models for policy simulation, namely the fact that a number of parameters used have completely implausible magnitudes and that the degree of freedom for different parameters is so large that DSGE models with fundamentally different parametrization (and therefore different policy conclusions) equally well produce time series which fit the real-world data, it is also very hard to understand why DSGE models have reached such a prominence in economic science in general.

Sebastian Dullien

Neither New Classical nor ‘New Keynesian’ microfounded DSGE macro models have helped us foresee, understand or craft solutions to the problems of today’s economies.

Wren-Lewis ultimately falls back on the same kind of models that he criticizes, and it would surely be interesting to hear him explain, for once, how silly assumptions like ‘hyperrationality’ and ‘representative agents’ help him work out the fundamentals of a truly relevant macroeconomic analysis.

In a recent paper on modern macroeconomics, another of Wren-Lewis’s ‘New Keynesian’ buddies, macroeconomist Greg Mankiw, wrote:

The real world of macroeconomic policymaking can be disheartening for those of us who have spent most of our careers in academia. The sad truth is that the macroeconomic research of the past three decades has had only minor impact on the practical analysis of monetary or fiscal policy. The explanation is not that economists in the policy arena are ignorant of recent developments. Quite the contrary: The staff of the Federal Reserve includes some of the best young Ph.D.’s, and the Council of Economic Advisers under both Democratic and Republican administrations draws talent from the nation’s top research universities. The fact that modern macroeconomic research is not widely used in practical policymaking is prima facie evidence that it is of little use for this purpose. The research may have been successful as a matter of science, but it has not contributed significantly to macroeconomic engineering.

So, then what is the raison d’être of macroeconomics, if it has nothing to say about the real world and the economic problems out there?

If macroeconomic models – no matter what ilk – assume representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, the warrants for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged to real-world economies are obviously non-justifiable. Macroeconomic theorists – regardless of being ‘New Monetarist’, ‘New Classical’ or ‘New Keynesian’ – ought to do some ontological reflection and heed Keynes’s warnings on using thought-models in economics:

The object of our analysis is, not to provide a machine, or method of blind manipulation, which will furnish an infallible answer, but to provide ourselves with an organized and orderly method of thinking out particular problems; and, after we have reached a provisional conclusion by isolating the complicating factors one by one, we then have to go back on ourselves and allow, as well as we can, for the probable interactions of the factors amongst themselves. This is the nature of economic thinking. Any other way of applying our formal principles of thought (without which, however, we shall be lost in the wood) will lead us into error.

So, these are some of my arguments for why I think that Simon Wren-Lewis ought to be even more critical of the present state of macroeconomics — including ‘New Keynesian’ macroeconomics — than he is. When macroeconomic models build on microfoundational assumptions that real people and markets cannot be expected to obey, trying to represent real-world target systems with models flagrantly at odds with reality is futile. And whether those models are New Classical or ‘New Keynesian’ makes very little difference.

Fortunately — when you have tired of the kind of macroeconomic apologetics produced by ‘New Keynesian’ macroeconomists like Wren-Lewis, Mankiw, and Krugman — there are still some real Keynesian macroeconomists to read. One of them — Axel Leijonhufvud — writes:

For many years now, the main alternative to Real Business Cycle Theory has been a somewhat loose cluster of models given the label of New Keynesian theory. New Keynesians adhere on the whole to the same DSGE modeling technology as RBC macroeconomists but differ in the extent to which they emphasise inflexibilities of prices or other contract terms as sources of short-term adjustment problems in the economy. The “New Keynesian” label refers back to the “rigid wages” brand of Keynesian theory of 40 or 50 years ago. Except for this stress on inflexibilities this brand of contemporary macroeconomic theory has basically nothing Keynesian about it …

I conclude that dynamic stochastic general equilibrium theory has shown itself an intellectually bankrupt enterprise. But this does not mean that we should revert to the old Keynesian theory that preceded it (or adopt the New Keynesian theory that has tried to compete with it). What we need to learn from Keynes … are about how to view our responsibilities and how to approach our subject.

No matter how brilliantly silly the ‘New Keynesian’ DSGE models that central banks, Wren-Lewis, and his buddies come up with, they do not help us work on the fundamental issues of modern economies. Using that kind of model only confirms Robert Gordon’s dictum that today

rigor competes with relevance in macroeconomic and monetary theory, and in some lines of development macro and monetary theorists, like many of their colleagues in micro theory, seem to consider relevance to be more or less irrelevant.
