Please say after me – Sonnenschein-Mantel-Debreu

21 July 2012, 10:53 | Published in Economics | 23 comments

Can you say Sonnenschein-Mantel-Debreu?

Good!

Because that probably also means that you can understand why New Classical, Real Business Cycles, Dynamic Stochastic General Equilibrium (DSGE) and ”New Keynesian” microfounded macromodels are such bad substitutes for real macroeconomic analysis.

These models try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality. And – worse still – something that is not even amenable to the kind of general equilibrium analysis that they are thought to give a foundation for, since Hugo Sonnenschein (1972), Rolf Mantel (1976) and Gerard Debreu (1974) unequivocally showed that there are no conditions on individuals that would guarantee either stability or uniqueness of the equilibrium solution.
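
For readers who want the result in its bare form, Debreu’s 1974 version can be paraphrased roughly as follows (a paraphrase of the standard textbook statement, not a quotation), with P_ε denoting normalized price vectors whose components are all bounded away from zero:

```latex
% normalized prices bounded away from zero
P_{\varepsilon} = \{\, p \in \mathbb{R}^{n}_{++} : p_{i} / \lVert p \rVert \ge \varepsilon \,\}

% Debreu (1974), informally: essentially any candidate aggregate excess demand is attainable
\forall\, z \in C(P_{\varepsilon}, \mathbb{R}^{n}) \text{ with } z(\lambda p) = z(p) \text{ and } p \cdot z(p) = 0 :\;
\exists \text{ an exchange economy with } n \text{ well-behaved consumers whose excess demands satisfy }
\sum_{i=1}^{n} z_{i}(p) = z(p) \text{ on } P_{\varepsilon}
```

Individual rationality, in other words, puts next to no restrictions on aggregate excess demand, so nothing at the level of individuals guarantees a unique or stable economy-wide equilibrium.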

Opting for cloned representative agents that are all identical is of course not a real solution to the fallacy of composition that the Sonnenschein-Mantel-Debreu theorem points to. After all – as Nobel laureate Robert Solow noted in “The State of Macroeconomics” (Journal of Economic Perspectives 2008:243-249) – ”a modern economy is populated by consumers, workers, pensioners, owners, managers, investors, entrepreneurs, bankers, and others, with different and sometimes conflicting desires, information, expectations, capacities, beliefs, and rules of behavior.” So, representative agent models are rather an evasion whereby issues of distribution, coordination, heterogeneity – everything that really defines macroeconomics – are swept under the rug.

Conclusion – don’t believe a single thing these microfounders tell you until they have told you how they have coped with – not evaded – Sonnenschein-Mantel-Debreu!

Of course, most macroeconomists know that to use a representative agent is a flagrantly illegitimate method of ignoring real aggregation issues. They keep on with their business nevertheless, just because it significantly simplifies what they are doing. It is reminiscent – more than a little – of the drunkard who has lost his keys in some dark place and deliberately chooses to look for them under a neighbouring street light just because it is easier to see there!

Fooled by randomness

20 July 2012, 18:02 | Published in Statistics & Econometrics | Comments disabled for Fooled by randomness

A non-trivial part of teaching statistics to social science students consists of teaching them to perform significance testing. A problem I have noticed repeatedly over the years, however, is that no matter how careful you try to be in explicating what the probabilities generated by these statistical tests – p-values – really are, most students still misinterpret them.

A couple of years ago I gave a statistics course for the Swedish National Research School in History, and at the exam I asked the students to explain how one should correctly interpret p-values. Although the correct definition is p(data|null hypothesis), a majority of the students either misinterpreted the p-value as the likelihood of a sampling error (which of course is wrong, since the very computation of the p-value is based on the assumption that sampling error is what causes the sample statistic not to coincide with the null hypothesis) or took it to be the probability of the null hypothesis being true, given the data (which of course is also wrong, since that would be p(null hypothesis|data) rather than the correct p(data|null hypothesis)).

This is not to be blamed on the students’ ignorance, but rather on significance testing not being particularly transparent (conditional probability inference is difficult even for those of us who teach and practice it). A lot of researchers fall prey to the same mistakes. So – given that it is in any case very unlikely that any population parameter is exactly zero, and that, contrary to assumption, most samples in social science and economics are not random or do not have the right distributional shape – why continue to press students and researchers to do null hypothesis significance testing, testing that relies on a weird backward logic that students and researchers usually don’t understand?
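
To see how large the gap between p(data|null hypothesis) and p(null hypothesis|data) can be, here is a minimal simulation sketch – the share of true nulls, the effect size and the sample size are all hypothetical choices made purely for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_studies, n_obs = 20_000, 30
effect_if_real = 0.2                         # hypothetical effect size when the null is false

null_is_true = rng.random(n_studies) < 0.5   # assume half of all tested nulls are true
true_effects = np.where(null_is_true, 0.0, effect_if_real)

p_values = np.empty(n_studies)
for i in range(n_studies):
    sample = rng.normal(loc=true_effects[i], scale=1.0, size=n_obs)
    p_values[i] = stats.ttest_1samp(sample, popmean=0.0).pvalue

rejected = p_values < 0.05
print(f"P(null is true | p < 0.05) is roughly {null_is_true[rejected].mean():.2f}")
# In this setup the share comes out far above 0.05 -- the p-value is p(data|null),
# not the probability that the null hypothesis is true.
```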

Let me just give a simple example to illustrate how slippery it is to deal with p-values – and how easy it is to impute causality to things that really are nothing but chance occurrences.

Say you have collected cross-country data on austerity policies and growth (and let’s assume that you have been able to ”control” for possible confounders). You find that countries that have implemented austerity policies have on average increased their growth by, say, 2% more than the other countries. To really feel sure about the efficacy of the austerity policies you run a significance test – thereby actually assuming without argument that all the values you have come from the same probability distribution – and you get a p-value of less than 0.05. Eureka! You’ve got a statistically significant value. The probability is less than 1/20 that you got this value out of pure stochastic randomness.

But wait a minute. There is – as you may have guessed – a snag. If you test austerity policies in enough countries you will get a statistically significant result out of pure chance 5% of the time. So, really, there is nothing to get so excited about!
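
The point is easy to check with a minimal simulation sketch – the numbers below (1 000 ”countries”, 50 observations each) are made up purely for illustration, not taken from any austerity data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_countries, n_obs, alpha = 1_000, 50, 0.05

significant = 0
for _ in range(n_countries):
    # growth differences drawn from a world where austerity has no effect at all
    sample = rng.normal(loc=0.0, scale=1.0, size=n_obs)
    if stats.ttest_1samp(sample, popmean=0.0).pvalue < alpha:
        significant += 1

print(f"Share of 'significant' findings under pure chance: {significant / n_countries:.3f}")
# comes out close to 0.05 -- one test in twenty is 'statistically significant'
# by randomness alone
```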

Statistical significance doesn’t say that something is important or true. And since there are already far better and more relevant tests that can be done (see e. g. here and here), it is high time to give up on this statistical fetish and not continue to be fooled by randomness.

Macroeconomic calibration – or why it is difficult for economists to take their own subject seriously (wonkish)

20 July 2012, 15:27 | Published in Economics, Theory of Science & Methodology | 7 comments

There are many kinds of useless economics held in high regard within the mainstream economics establishment today. Few – if any – deserve that esteem less than the macroeconomic approach – mostly connected with Nobel laureates Finn Kydland, Robert Lucas, Edward Prescott and Thomas Sargent – called calibration.

In an interview with George Evans and Seppo Honkapohja (Macroeconomic Dynamics 2005, vol. 9) Thomas Sargent says:

Evans and Honkapohja: What were the profession’s most important responses to the Lucas Critique?

Sargent: There were two. The first and most optimistic response was complete rational expectations econometrics. A rational expectations equilibrium is a likelihood function. Maximize it.

Evans and Honkapohja: Why optimistic?

Sargent: You have to believe in your model to use the likelihood function. It provides a coherent way to estimate objects of interest (preferences, technologies, information sets, measurement processes) within the context of a trusted model.

Evans and Honkapohja: What was the second response?

Sargent: Various types of calibration. Calibration is less optimistic about what your theory can accomplish because you would only use it if you didn’t fully trust your entire model, meaning that you think your model is partly misspecified or incompletely specified, or if you trusted someone else’s model and data set more than your own. My recollection is that Bob Lucas and Ed Prescott were initially very enthusiastic about rational expectations econometrics. After all, it simply involved imposing on ourselves the same high standards we had criticized the Keynesians for failing to live up to. But after about five years of doing likelihood ratio tests on rational expectations models, I recall Bob Lucas and Ed Prescott both telling me that those tests were rejecting too many good models. The idea of calibration is to ignore some of the probabilistic implications of your model but to retain others. Somehow, calibration was intended as a balanced response to professing that your model, although not correct, is still worthy as a vehicle for quantitative policy analysis….

Evans and Honkapohja: Do you think calibration in macroeconomics was an advance?

Sargent: In many ways, yes. I view it as a constructive response to Bob’s remark that ”your likelihood ratio tests are rejecting too many good models”. In those days… there was a danger that skeptics and opponents would misread those likelihood ratio tests as rejections of an entire class of models, which of course they were not…. The unstated case for calibration was that it was a way to continue the process of acquiring experience in matching rational expectations models to data by lowering our standards relative to maximum likelihood, and emphasizing those features of the data that our models could capture. Instead of trumpeting their failures in terms of dismal likelihood ratio statistics, celebrate the features that they could capture and focus attention on the next unexplained feature that ought to be explained. One can argue that this was a sensible response… a sequential plan of attack: let’s first devote resources to learning how to create a range of compelling equilibrium models to incorporate interesting mechanisms. We’ll be careful about the estimation in later years when we have mastered the modelling technology…
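
To make concrete what ”ignoring some of the probabilistic implications of your model but retaining others” amounts to, here is a deliberately toy sketch – the simulated series, the AR(1) ”model” and the single matched moment are my own illustrative choices, not the Kydland-Prescott procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# 'Data': an observed (here simulated) detrended macro series with persistence 0.5
data = np.zeros(200)
shocks = rng.normal(size=200)
for t in range(1, 200):
    data[t] = 0.5 * data[t - 1] + shocks[t]

def autocorr(x, lag=1):
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

# Calibration: pick rho in the model x_t = rho * x_{t-1} + e_t so that ONE selected
# moment (the first-order autocorrelation) matches the same moment in the data,
# ignoring the rest of the model's probabilistic implications -- no likelihood,
# no formal test that could reject the model.
rho_calibrated = autocorr(data)
print(f"calibrated persistence parameter rho = {rho_calibrated:.2f}")
```

A full-information (maximum likelihood) approach would instead ask how probable the whole observed series is under the model, and would come with formal tests that can reject it – precisely the tests that, in Sargent’s telling, were ”rejecting too many good models”.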

But is the Lucas-Kydland-Prescott-Sargent calibration really an advance?

Let’s see what two eminent econometricians have to say. In the Journal of Economic Perspectives (1996, vol. 10) Lars Peter Hansen and James J. Heckman write:

It is only under very special circumstances that a micro parameter such as the intertemporal elasticity of substitution or even a marginal propensity to consume out of income can be ‘plugged into’ a representative consumer model to produce an empirically concordant aggregate model … What credibility should we attach to numbers produced from their ‘computational experiments’, and why should we use their ‘calibrated models’ as a basis for serious quantitative policy evaluation? … There is no filing cabinet full of robust micro estimates ready to use in calibrating dynamic stochastic equilibrium models … The justification for what is called ‘calibration’ is vague and confusing.

This is the view of econometric methodologist Kevin Hoover:

The calibration methodology, to date, lacks any discipline as stern as that imposed by econometric methods.

And this is the verdict of Nobel laureate Paul Krugman:

The point is that if you have a conceptual model of some aspect of the world, which you know is at best an approximation, it’s OK to see what that model would say if you tried to make it numerically realistic in some dimensions.

But doing this gives you very little help in deciding whether you are more or less on the right analytical track. I was going to say no help, but it is true that a calibration exercise is informative when it fails: if there’s no way to squeeze the relevant data into your model, or the calibrated model makes predictions that you know on other grounds are ludicrous, something was gained. But no way is calibration a substitute for actual econometrics that tests your view about how the world works.

In physics it may possibly not be straining credulity too much to model processes as ergodic – where time and history do not really matter – but in social and historical sciences it is obviously ridiculous. If societies and economies were ergodic worlds, why do econometricians fervently discuss things such as structural breaks and regime shifts? That they do is an indication of how unrealistic it is to treat open systems as analyzable with ergodic concepts.

The future is not reducible to a known set of prospects. It is not like sitting at the roulette table and calculating what the future outcomes of spinning the wheel will be. Reading Sargent and other calibrationists one comes to think of Robert Clower’s apt remark that

much economics is so far removed from anything that remotely resembles the real world that it’s often difficult for economists to take their own subject seriously.

Instead of assuming calibration and rational expectations to be right, one ought to confront the hypothesis with the available evidence. It is not enough to construct models. Anyone can construct models. To be seriously interesting, models have to come with an aim. They have to have an intended use. If the intention of calibration and rational expectations is to help us explain real economies, it has to be evaluated from that perspective. A model or hypothesis without a specific applicability does not really deserve our interest.

To say, as Edward Prescott does, that

one can only test if some theory, whether it incorporates rational expectations or, for that matter, irrational expectations, is or is not consistent with observations

is not enough. Without strong evidence all kinds of absurd claims and nonsense may pretend to be science. We have to demand more of a justification than this rather watered-down version of “anything goes” when it comes to rationality postulates. If one proposes rational expectations one also has to support its underlying assumptions. No such support is given, which makes it rather puzzling how rational expectations has become the standard modeling assumption made in much of modern macroeconomics. Perhaps the reason is, as Paul Krugman has it, that economists often mistake

beauty, clad in impressive looking mathematics, for truth.

But I think Prescott’s view is also the reason why calibration economists are not particularly interested in empirical examinations of how real choices and decisions are made in real economies. In the hands of Lucas, Prescott and Sargent rational expectations has been transformed from an – in principle – testable hypothesis into an irrefutable proposition. Irrefutable propositions may be comfortable – like religious convictions or ideological dogmas – but they are not science.

On useless economics and academic incentives (wonkish)

20 July 2012, 10:24 | Published in Economics, Theory of Science & Methodology | 7 comments

Paul Krugman has a great post up on his blog today, where he gets – almost – everything right on the state of mainstream economics academia today:

To put it bluntly: faced with a severe economic crisis — the very kind of crisis macroeconomics was created to deal with — it often seems as if the profession is determined to make itself useless.

Wren-Lewis’s first post concerns the obsession with microfoundations. As he says, this obsession is at this point deeply embedded in the academic incentive structure:

”If you think that only ‘modelling what you can microfound’ is so obviously wrong that it cannot possibly be defended, you obviously have never had a referee’s report which rejected your paper because one of your modelling choices had ‘no clear microfoundations’. One of the most depressing conversations I have is with bright young macroeconomists who say they would love to explore some interesting real world phenomenon, but will not do so because its microfoundations are unclear.”

So where does this come from? The “Lucas critique” has been a big deal in the profession for more than a generation. This says that even if you observe a particular relationship in the real world — say, a relationship between inflation and unemployment — this relationship may change when policy changes. So you really want to have a deeper understanding of where the relationship comes from — “microfoundations” — so that you won’t be caught off guard if it does change in response to policy.

And this is fair enough. But what if you have an observed fact about the world — say, downward wage rigidity — that you can’t easily derive from first principles, but seems to be robust in practice? You might think that the right response is to operate on the provisional assumption that this relationship will continue to hold, rather than simply assume it away because it isn’t properly microfounded — and you’d be right, in my view. But the profession, at least in its academic wing, has largely chosen to take the opposite tack, insisting that if it isn’t microfounded — and with all i’s dotted and t’s crossed, no less — then it’s not publishable or, in the end, thinkable.

Now we’re having a crisis that makes perfect sense if you’re willing to accept some real-world behavior that doesn’t arise from intertemporal maximization, but none at all if you aren’t — and to a large extent the academic macroeconomics profession has absented itself from useful discussion.

In the second post Wren-Lewis responds to another tired attack on fiscal stimulus, based on basically nothing. As he says, it’s hard to imagine a clearer case for action than what we’re seeing: overwhelming evidence that fiscal policy does in fact work, zero real interest rates. Yet a substantial number of economists seem determined to find reasons not to act. Some of this is ideology, but I suspect that part of this also represents a carryover from academic careerism, where differentiating your product — claiming that the big guys are wrong at something — is part of what you do to get noticed. This kind of petty stuff doesn’t matter when it’s just academic games, but when it clouds the discussion in the face of mass unemployment, it becomes very bad indeed.

My bottom line is that we as a profession faced the crucial test of our lives — and by and large we failed and continue to fail. It’s not a happy story.

Although I think this is great – and brave, since what Krugman (and Wren-Lewis) is admitting is something we all know is a fact, but few are willing to air publicly – I would like to make two comments.

First: The microfoundational program that sprang out of the ”Lucas critique” doesn’t deliver a ”deeper understanding” of stable, fundamental relationships in the economy, and so cannot, at least to me, be characterized as ”fair enough.”

Let me elaborate a little.

Neoclassical economic theory today is in the story-telling business whereby economic theorists create make-believe analogue models of the target system – usually conceived as the real economic system. This modeling activity is considered useful and essential. To understand and explain relations between different entities in the real economy the predominant strategy is to build models and make things happen in these “analogue-economy models”.

But how do we bridge the gulf between model and reality? According to Lucas we have to be willing to argue by ”analogy from what we know” to what we would like to know. Progress lies in the pursuit of the ambition to “tell better and better stories.”

If the goal of theory is to be able to make accurate forecasts, the ability of a model to imitate actual behavior does not give much leverage. What is required, according to the ”Lucas critique”, is some kind of invariance of the model’s structure under policy variations. Parametric invariance in an economic model cannot be taken for granted, but to Lucas it seems reasonable to hope that neither tastes nor technology ”vary systematically.”
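
To see what such invariance amounts to, consider a stylized two-equation illustration (my own toy example, not Lucas’s): behaviour depends on a ”deep” parameter θ and on expectations of a policy instrument that is set by a rule with parameter γ.

```latex
% behavioural relation (deep parameter theta) and policy rule (parameter gamma)
x_t = \theta\, \mathbb{E}_{t-1}[p_t] + \varepsilon_t , \qquad p_t = \gamma\, x_{t-1} + u_t
% under rational expectations E_{t-1}[p_t] = \gamma x_{t-1}, so the estimated reduced form is
x_t = \underbrace{\theta \gamma}_{\beta}\, x_{t-1} + \varepsilon_t
```

The reduced-form coefficient β = θγ shifts whenever the policy rule γ changes, even though the ”deep” parameter θ stays fixed – which is exactly why the Lucas critique demands that analysis be anchored in parameters that do not move with policy. The question is whether any such parameters exist.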

The model should enable us to pose counterfactual questions about what would happen if some variable were to change in a specific way. Hence the assumption of structural invariance, which purportedly enables the theoretical economist to do just that. But does it? Lucas appeals to ”reasonable hope”, a rather weak justification for a modeler to apply such a far-reaching assumption. To warrant it one would expect an argument that this assumption – whether we conceive of it as part of a strategy of isolation, idealization or successive approximation – really establishes a useful relation that we can export or bridge to the target system, the actual economy.

The basic assumption of this “precise and rigorous” model therefore cannot be considered anything other than an unsubstantiated conjecture as long as it is not supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence has been presented. This is all the more tantalizing since Lucas himself stresses that the presumption must be defended on empirical grounds.

And applying a “Lucas critique” to Lucas’s own model, it is obvious that it too fails. For example, changing “policy rules” cannot just be presumed not to influence investment and consumption behavior and, a fortiori, technology, thereby contradicting the invariance assumption. Technology and tastes cannot live up to the status of an economy’s deep and structurally stable Holy Grail. They too are part and parcel of an ever-changing and open economy. Lucas’s hope of being able to model the economy as “a FORTRAN program” therefore seems – from an ontological point of view – totally misdirected. The failure of the attempt to anchor the analysis in the allegedly stable deep parameters “tastes” and “technology” shows that if you neglect ontological considerations pertaining to the target system, reality ultimately kicks back when questions of bridging and exporting model exercises are at last laid on the table. No matter how precise and rigorous the analysis is, and no matter how hard one tries to cast the argument in modern mathematical form, it does not push science forward one millimetre if it does not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not per se say anything about external validity.

Formalistic deductive “Glasperlenspiel” can be very impressive and seductive. But in the realm of science it ought to be considered of little or no value to simply make claims about the model and lose sight of reality.

Second: It is rather typical – sad to say, but true – that Krugman once again deliberately fails to mention that heterodox economists – many of whom, like yours truly, are Post Keynesians – haven’t succumbed to the microfoundational plague, and that they have been enormously successful in both predicting and explaining the financial crisis that haunts us today. And, from my own experience, I can assure you that we have almost insurmountable problems getting things published in the major economics journals – journals run by the hegemonic mainstream neoclassical establishment.

Listen to Larry, Greg!

19 July 2012, 19:04 | Published in Economics, Politics & Society | Comments disabled for Listen to Larry, Greg!


Lawrence Summers listening to Greg Mankiw’s explications on inequality?

Even though the interest may not be reciprocated, it would obviously be a good idea for Greg Mankiw to listen to his Harvard colleague Lawrence Summers, instead of trivializing the problems created by increasing inequality! Summers has some interesting thoughts on why income inequality is on the rise and what to do about it:

Why has the top 1 per cent of the population done so well relative to the rest? The answer probably lies substantially in changing technology and globalisation. When George Eastman revolutionised photography, he did very well and, because he needed a large number of Americans to carry out his vision, the city of Rochester had a thriving middle class for two generations. By contrast, when Steve Jobs revolutionised personal computing, he and the shareholders in Apple (who are spread all over the world) did very well but a much smaller benefit flowed to middle-class American workers both because production was outsourced and because the production of computers and software was not terribly labour intensive …

What then is the right response to rising inequality? There are too few good ideas in current political discourse and the development of better ones is crucial. Here are three.

First, government must be careful that it does not facilitate increases in inequality by rewarding the wealthy with special concessions. Where governments dispose of assets or allocate licences, there is a compelling case for more use of auctions to which all have access. Where government provides insurance implicitly or explicitly, premiums must be set as much as possible on a market basis rather than in consultation with the affected industry. A general posture for government of standing up for capitalism rather than particular well-connected capitalists would also serve to mitigate inequality.

Second, there is scope for pro-fairness, pro-growth tax reform. When there are more and more great fortunes being created and the government is in larger and larger deficit, it is hardly a time for the estate tax to be eviscerated. With smaller families and ever more bifurcation in the investment opportunities open to those with wealth, there is a real risk that the old notion of “shirtsleeves to shirtsleeves in three generations” will become obsolete, and those with wealth will endow dynasties.

Third, the public sector must insure that there is greater equity in areas of the most fundamental importance. It will always be the case in a market economy that some will have mansions, art and the ability to travel in lavish fashion. What is more troubling is that the ability of the children of middle-class families to attend college has been seriously compromised by increasing tuition fees and sharp cutbacks at public universities and colleges.

At the same time, in many parts of the country a gap has opened between the quality of the private school education offered to the children of the rich and the public school educations enjoyed by everyone else. Most alarming is the near doubling over the last generation in the gap between the life expectancy of the affluent and the ordinary.

Neither the politics of polarisation nor those of noblesse oblige will serve to protect the interests of the middle class in the post-industrial economy. We will have to find ways to do better.

Greg Mankiw and Richard Epstein – libertarian mumbo jumbo

19 July 2012, 15:51 | Published in Economics, Politics & Society | 7 comments

As yours truly has commented earlier, walked-out Harvard economist and George Bush advisor Greg Mankiw is having problems explaining the rising inequality we have seen for the last 30 years, both in the US and elsewhere in Western societies. Mankiw writes:

Even if the income gains are in the top 1 percent, why does that imply that the right story is not about education?

I then realized that Paul is making an implicit assumption–that the return to education is deterministic. If indeed a year of schooling guaranteed you precisely a 10 percent increase in earnings, then there is no way increasing education by a few years could move you from the middle class to the top 1 percent.

But it may be better to think of the return to education as stochastic. Education not only increases the average income a person will earn, but it also changes the entire distribution of possible life outcomes. It does not guarantee that a person will end up in the top 1 percent, but it increases the likelihood. I have not seen any data on this, but I am willing to bet that the top 1 percent are more educated than the average American; while their education did not ensure their economic success, it played a role.

This is, of course, really nothing but one big evasive action, trying to explain away a very disturbing structural shift that has taken place in our societies – a shift that has very little to do with stochastic returns to education. Those were in place 30 or 40 years ago too. At that time they meant that a CEO perhaps earned 10-12 times what ”ordinary” people earn. Today it means that they perhaps earn 100-200 times what ”ordinary” people earn. A question of education? No way! It is a question of greed and a lost sense of a common project of building a sustainable society. A result of stochastic returns to education? No, this has to do with income and wealth increasingly being concentrated in the hands of a very small and privileged elite.

Mankiw has stubbornly refused to budge from his libertarian stance on this issue. So, rather consistently, he links on his blog to a PBS interview with the libertarian professor of law Richard Epstein:

RICHARD EPSTEIN: What’s good about inequality is if, in fact, it turns out that inequality creates an incentive for people to produce and to create wealth, it’s a wonderful force for innovation.

PAUL SOLMAN: Aren’t many of the top 1 percent or 0.1 percent in this country rich because they’re in finance?

RICHARD EPSTEIN: Yes. Many of the very richest people in the United States are rich because they are in finance.

And one of the things you have to ask is, why is anyone prepared to pay them huge sums of money if in fact they perform nothing of social value? And the answer is that when you try to knock out the financiers, what you do is you destroy the liquidity of capital markets. And when you destroy the liquidity of those markets, you make it impossible for businesses to invest, you make it impossible for people to buy home mortgages and so forth, and all sorts of other breakdowns.

So they should be rich. It doesn’t bother me.

PAUL SOLMAN: Are you worried that a small number of people controlling a disproportionate share of the wealth can control a democratic system?

RICHARD EPSTEIN: Oh, my God no.

Mankiw does not in any way comment on Epstein’s amazing stupidities or give us a hint of why he has chosen to link to the interview. But sometimes silence perhaps says more than a thousand words …

Now, compare that mumbo jumbo with what a true liberal has to say on the issue:

The outstanding faults of the economic society in which we live are its failure to provide for full employment and its arbitrary and inequitable distribution of wealth and incomes … I believe that there is social and psychological justification for significant inequalities of income and wealth, but not for such large disparities as exist to-day.

John Maynard Keynes wrote this in The General Theory (1936). Seventy-five years later it looks like this in the UK, the US and Sweden:

Source: The Top Incomes Database and own calculations

Nobel laureate Joseph Stiglitz has some very interesting thoughts – in Vanity Fair – on what this increasing economic inequality does to our societies:

Some people look at income inequality and shrug their shoulders. So what if this person gains and that person loses? What matters, they argue, is not how the pie is divided but the size of the pie. That argument is fundamentally wrong. An economy in which most citizens are doing worse year after year—an economy like America’s—is not likely to do well over the long haul … Perhaps most important, a modern economy requires “collective action”—it needs government to invest in infrastructure, education, and technology … The more divided a society becomes in terms of wealth, the more reluctant the wealthy become to spend money on common needs … America’s inequality distorts our society in every conceivable way. There is, for one thing, a well-documented lifestyle effect—people outside the top 1 percent increasingly live beyond their means. Trickle-down economics may be a chimera, but trickle-down behaviorism is very real … Of all the costs imposed on our society by the top 1 percent, perhaps the greatest is this: the erosion of our sense of identity, in which fair play, equality of opportunity, and a sense of community are so important.

A society where we allow the inequality of incomes and wealth to increase without bounds sooner or later implodes. The cement that keeps us together erodes, and in the end we are left only with people dipped in the ice-cold water of egoism and greed. It’s high time to put an end to this, the worst Juggernaut of our time!

Models and successive approximations in economics

19 July 2012, 13:01 | Published in Economics, Theory of Science & Methodology | 9 comments

The ongoing debate on ”modern” microfounded macroeconomics raises some very interesting philosophical and methodological issues.

Macroeconomist Simon Wren-Lewis writes on his blog (emphasis added):

Stuff like we cannot possibly take microfounded macro seriously, because it is based on an all-embracing representative actor equipped with superhuman knowledge and forecasting abilities. To which I feel like shouting – where else do you start? I always say to PhD students, start simple, understand the simple model, and then complicate. So we start with a representative agent. What else could we do?

And in another post:

As an intellectual exercise, the ‘model what you can microfound’ approach can be informative. Hopefully it is also a stepping stone on the way to being able to explain what you see.

What these quotations illustrate well is the idea of science advancing through the use of successive approximations. Is this really a feasible methodology? Let me elaborate a little on why I think not.

Most models in science are representations of something else. Models “stand for” or “depict” specific parts of a “target system” (usually the real world). All theories and models have to use sign vehicles to convey some kind of content that may be used for saying something about the target system. But purpose-built assumptions – like ”rational expectations” or ”representative actors” – made solely to secure a way of reaching deductively validated results in mathematical models, are of little value if they cannot be validated outside of the model.

All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

Theories are difficult to directly confront with reality. Economists therefore build models of their theories. Those models are representations that are directly examined and manipulated to indirectly say something about the target systems.

But models do not only face theory. They also have to look to the world. Being able to model a ”credible world,” a world that somehow could be considered real or similar to the real world, is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still possibly serve our pursuit of truth. But then they cannot be unrealistic or false in just any way. The falsehood or unrealism has to be qualified.

One could of course also ask for robustness [in a response to a piece on his blog, Wren-Lewis writes: ”The paradox of thrift was not based on a microfounded model. But the fact that you can also get it from a microfounded model makes me much more confident that it’s a robust result!”], but the ”credible world,” even after having been tested for robustness, can still be a long way from reality – and unfortunately often in ways we know are important. Robustness of claims in a model does not per se give a warrant for exporting the claims to real-world target systems.

Anyway, robust theorems are exceedingly rare or non-existent in macroeconomics. Explanation, understanding and prediction of real world phenomena, relations and mechanisms therefore cannot be grounded (solely) on robustness analysis. Some of the standard assumptions made in neoclassical economic theory – on rationality, information handling and types of uncertainty – are not possible to make more realistic by de-idealization or successive approximations without altering the theory and its models fundamentally.

If we cannot show that the mechanisms or causes we isolate and handle in our models are stable – in the sense that they do not change from one situation to another when we export them from our models to our target systems – then they only hold under ceteris paribus conditions and are a fortiori of limited value for our understanding, explanation and prediction of our real-world target system.

The obvious ontological shortcoming of a basically epistemic – rather than ontological – approach such as ”successive approximations” is that “similarity” or “resemblance” tout court does not guarantee that the correspondence between model and target is interesting, relevant, revealing or somehow adequate in terms of mechanisms, causal powers, capacities or tendencies. No matter how many convoluted refinements of concepts are made in the model, if the ”successive approximations” do not result in models similar to reality in the appropriate respects (such as structure, isomorphism, etc.), the surrogate system becomes a substitute system that does not bridge to the world but rather misses its target.

So, I have to conclude that constructing “minimal macroeconomic models” or using microfounded macroeconomic models as “stylized facts” or “stylized pictures” that somehow “successively approximate” macroeconomic reality is a rather unimpressive attempt at legitimizing the use of fictitious idealizations for reasons that have more to do with model tractability than with a genuine interest in understanding and explaining features of real economies. Many of the model assumptions standardly made in neoclassical macroeconomics are restrictive rather than harmless, and cannot therefore in any sensible sense be considered approximations at all.

On rational expectations and the communism of macroeconomic models

19 July 2012, 10:59 | Published in Economics, Theory of Science & Methodology | Comments disabled for On rational expectations and the communism of macroeconomic models

Professor John Kay has a marvelous article in the Financial Times on why ”modern” macroeconomics – based on ”rational expectations” and ”representative actors” – fails. Kay writes:

Prof Sargent and colleagues appropriated the term “rational expectations” for their answer. Suppose the economic world evolves according to some predetermined model, in which uncertainties are “known unknowns” that can be described by probability distributions. Then economists could gradually deduce the properties of this model, and businesses and individuals would naturally form expectations in that light. If they did not, they would be missing obvious opportunities for advantage.

This approach, which postulates a universal explanation into which economists have privileged insight, was as influential as it was superficially attractive. But a scientific idea is not seminal because it influences the research agenda of PhD students. An important scientific advance yields conclusions that differ from those derived from other theories, and establishes that these divergent conclusions are supported by observation. Yet as Prof Sargent disarmingly observed, “such empirical tests were rejecting too many good models” in the programme he had established with fellow Nobel laureates Bob Lucas and Ed Prescott. In their world, the validity of a theory is demonstrated if, after the event, and often with torturing of data and ad hoc adjustments that are usually called “imperfections”, it can be reconciled with already known facts – “calibrated”. Since almost everything can be “explained” in this way, the theory is indeed universal; no other approach is necessary, or even admissible. Asked “do you think that differences among people’s models are important aspects of macroeconomic policy debates”, Prof Sargent replied: “The fact is you simply cannot talk about their differences within the typical rational expectations model. There is a communism of models. All agents within the model, the econometricians, and God share the same model.”

Rational expectations consequently fail for the same reason communism failed – the arrogance and ignorance of the monopolist. In their critique of rational expectations, Roman Frydman and Michael Goldberg employ Hayek’s critique of planning; the market economy, unlike communism, can mediate different perceptions of the world, bringing together knowledge whose totality is not held by anyone. God did not vouchsafe his model to us, mortals see the present imperfectly and the future dimly, and use many different models. Some agents made profits, some losses, and the financial crisis of 2007-08 decided which was which. Only Prof Sargent’s econometricians were wedded to a single model and could, as usual, explain the crisis only after it had occurred. For them, the crisis was a random shock, but the occasion for a Nobel prize.

One might perhaps find it odd to juxtapose God and people, but as Leonard Rapping said (in Arjo Klamer, The New Classical Macroeconomics 1984, p 234):

Frankly, I do not think that the rational expectations theorists are in the real world. Their approach is much too abstract.

The first microfounded macroeconomist

18 July 2012, 21:10 | Published in Varia | Comments disabled for The first microfounded macroeconomist

Straight from the horse’s mouth on ”rational expectations”

18 July 2012, 20:23 | Published in Economics | 1 comment

As we have seen, Oxford professor Simon Wren-Lewis undauntedly goes on defending representative actors and rational expectations models in the microfoundations programme for macroeconomics on his blog.

It may perhaps be interesting to listen to how Mr Rational Expectations himself – Nobel laureate Robert Lucas – values these assumptions in the aftermath of the latest financial crisis:

Kevin Hoover: The Great Recession and the recent financial crisis have been widely viewed in both popular and professional commentary as a challenge to rational expectations and to efficient markets … I’m asking you whether you accept any of the blame … there’s been a lot of talk about whether rational expectations and the efficient-markets hypotheses is where we should locate the analytical problems that made us blind.

Robert Lucas: You know, people had no trouble having financial meltdowns in their economies before all this stuff we’ve been talking about came on board. We didn’t help, though; there’s no question about that. We may have focused attention on the wrong things, I don’t know.

Source

Krugman and the Swedish model

18 July 2012, 12:04 | Published in Economics, Politics & Society | 8 comments

In a post on Sweden and the rising inequality Paul Krugman writes:

[Y]ou have no business talking about international income distribution if you don’t know about the invaluable World Top Incomes Database. What does this database tell us about Sweden versus America?


Hey, it looks just the same — or, actually, not.

Yes, the top one percent has risen a bit in Sweden. But how anyone could look at this and see the story as similar boggles the mind.

It is not that often that yours truly disagrees with Krugman on empirical matters, but here I think he is wrong. It is indeed possible to see the story as similar.

Why? Look at the graphs below:

The average annual percentage growth rate 1981-2007 was 2.1% in Sweden (in the UK and the US: 2.9%). To me that is an indication that Sweden, too, is experiencing growing inequality to a notable extent.

Also look at this plot (based on data  from The Top Incomes Database):

During the last sixty years the top income shares in Sweden, the United Kingdom and the United States have developed like this:

Source: The Top Incomes Database

And look at the figure below, which shows how the distribution of mean income and wealth (expressed in 2009 prices) for the top 0.1% and the bottom 90% has changed in Sweden over the last 30 years:


Source: The World Top Incomes Database

I would say the development in Sweden is also deeply problematic and going in the wrong direction. The main difference compared to the UK and the US is really that the increasing inequality in Sweden (going on continuously for 30 years now) started from a lower level.
The rising inequality probably has to do with income and wealth increasingly being concentrated in the hands of a very small and privileged elite – in Sweden as well as in the UK and the US.

And as if this wasn’t enough, Paul Krugman has another post up on Sweden where he still doesn’t get it quite right. This is what he writes:

I learn from Ezra Klein’s interview with Tom Coburn that Sweden, of all places, has become the new right-wing icon. I thought Europe’s woes were all about collapsing welfare states? But anyway, the story now is that Sweden has slashed spending and cut taxes, and is doing great; supply-side economics vindicated!

Ezra points out, rightly, that Sweden has actually benefited a lot from very aggressive monetary policy — one of the original Princeton zero-lower-bound Group of Four, Lars Svensson, is now deputy governor of the Riksbank. (The others were Mike Woodford, yours truly, and a fellow by the name of Ben Bernanke).

But Ezra didn’t challenge Coburn on the claim about spending cuts; why don’t we look at what Sweden has actually done, as opposed to the official right-wing line? Look, in particular, at actual government consumption — purchases of stuff. Here’s Sweden versus the United States, from Eurostat:

Somebody has been practicing harsh spending-side austerity — and it’s not Sweden.

But that’s far from the whole truth!

The aforementioned Group of Four member, Lars E. O. Svensson, shows in a recent article – ”The Possible Unemployment Cost of Average Inflation below a Credible Target” – that the Swedish Riksbank during the years 1998-2011 pursued a policy that in reality made inflation on average 0.6 percentage points lower than the target set by the Riksbank. The Phillips curve he estimates shows that unemployment, as a result of this overly ”austere” inflation level, has been almost 1 percentage point higher than if one had stuck to the stated inflation target of 2%.
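
The arithmetic behind that claim can be sketched with a simple long-run Phillips relation (a stylized reading of Svensson’s argument; the slope κ below is a hypothetical round number, not his estimate):

```latex
\pi_t - \pi^{*} = -\kappa\, (u_t - u^{*})
\quad\Longrightarrow\quad
\bar{u} - u^{*} \approx \frac{\pi^{*} - \bar{\pi}}{\kappa} = \frac{0.6}{\kappa}\ \text{percentage points}
```

With κ somewhere around 0.6–0.8, a persistent 0.6 percentage point shortfall of average inflation below target translates into average unemployment close to one percentage point above what it otherwise would have been.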

What Svensson is saying, in not so many words, is that the Swedish Fed for no reason at all has made people unemployed. As a consequence of a faulty monetary policy, unemployment is considerably higher than it would have been if the Swedish Fed had done its job adequately.

So, I’m sorry Paul, but Sweden is no longer the model country it once was, in the heyday of the 60s and 70s. It’s no longer the country of Olof Palme. Just as in your own country, neoliberal ideologies, economists and politicians have crushed the Swedish dream that once was. It’s sad. But it’s a fact. And facts – as Gunnar Myrdal used to say – kick!

”Modern” macroeconomics hasn’t delivered

18 July 2012, 09:46 | Published in Economics, Theory of Science & Methodology | 3 comments

Jonathan Schlefer, research associate at Harvard Business School, has written a new book – The Assumptions Economists Make – on what ”modern” macroeconomics has delivered during the last few decades. Although Schlefer shares many economists’ instrumentalist view on theories and models, he can’t really see that they have delivered what they promised – useful predictions.
Justin Fox summarizes:

By that standard, here are Schlefer’s judgments on the succession of theories that have dominated academic macroeconomics since the 1970s:

Rational expectations (which proposed that we’re all too smart to be fooled by money-printing central bankers and deficit-spending governments): Intellectually interesting, and maybe helpful in ”normal times,” whatever those are. But not very good at describing or predicting the actual behavior of the economy at any time, and worthless in a crisis.

Real business-cycle theory (which says that economic ups and downs are all caused by technology-induced changes in productivity): ”[N]ot only are these models a tautology — they are a tautology that turns out to be wrong. They say that employment rises or falls because actors choose to work more when productivity is high and less when it’s low. This is nuts.”

DSGE (sometimes called ”New Keynesian”) models: Not ”quite as bad as they sound,” as they do describe an economy that moves along by fits and starts. They just don’t leave room for any crazy stuff.

My blog is skyrocketing!

17 July 2012, 20:22 | Published in Varia | 3 comments

About a year ago yours truly launched this blog.

A blog is surely not a beauty contest. But to my unfeigned joy the number of visitors has increased steadily. From about 300 visitors per week last spring, I now have almost 3,000 visitors – per day!

Given the rather ”wonkish” character of the blog – with posts mostly on economic theory, statistics, econometrics, theory of science and methodology – I have to admit to being somewhat amazed that so many are interested and take the time to read and comment on it.

I am – of course – truly honoured and delighted that interest in the blog is steadily growing and that so many readers contribute to it with their comments!

Gods and idiots may share Wren-Lewis’s model, but it certainly isn’t my model!

17 July 2012, 17:55 | Published in Economics, Theory of Science & Methodology | 22 comments

In his latest piece in the ”Great Divide” debate, Oxford professor Simon Wren-Lewis today writes:

There is a lot that I have read which is challenging, and which has made me think about things in different (for me) ways, which is good. But there is also lots of stuff that seems less helpful …

Stuff like we cannot possibly take microfounded macro seriously, because it is based on an all-embracing representative actor equipped with superhuman knowledge and forecasting abilities. To which I feel like shouting – where else do you start? I always say to PhD students, start simple, understand the simple model, and then complicate. So we start with a representative agent …

What about superhuman knowledge and forecasting abilities? That seems like an extreme position. But the alternative is to assume we know what kind of mistakes agents will make (sic!). Where does this knowledge come from? … To keep things simple, I therefore assume I do not know what mistakes they will make, which implies (sic!) rational expectations.

Being one of those annoying heterodox economists forcing Wren-Lewis and other neoclassical economists to ”think about things in different ways,” yours truly comes to think of a laboratory experiment run by James Andreoni and Tymofiy Mylovanov – presented here – where the researchers induced common probability priors and then told all participants of the actions taken by the others. Their findings are very interesting, and say something rather profound about the value of the rational expectations hypothesis in the kind of models used by Wren-Lewis and other macroeconomists of the same ilk:

We look at choices in round 1, when individuals should still maintain common priors, being indifferent about the true state. Nonetheless, we see that about 20% of the sample erroneously disagrees and favors one point of view. Moreover, while other errors tend to diminish as the experiment progresses, the fraction making this type of error is nearly constant. One may interpret disagreement in this case as evidence of erroneous or nonrational choices.

Next, we look at the final round where information about disagreement is made public and, under common knowledge of rationality, should be sufficient to eliminate disagreement. Here we find that individuals weigh their own information more than twice that of the five others in their group. When we look separately at those who err by disagreeing in round 1, we find that these people weigh their own information more than 10 times that of others, putting virtually no stock in public information. This indicates a different type of error, that is, a failure of some individuals to learn from each other. This error is quite large and for a nontrivial minority of the population.

Setting aside the subjects who make systematic errors, we find that individuals still put 50% more weight on their own information than they do on the information revealed through the actions of others, although this difference is not statistically significant.

So in this experiment there seem to be some irrational idiots who don’t understand that that is exactly what they are. When told that the earth is flat, they still adhere to their own belief in a round earth. It is as if people thought that the probability that all others are idiots with irrational beliefs is higher than the probability that the earth is round.

Now compare these experimental results with rational expectations models, where the world evolves in accordance with fully predetermined models in which uncertainty has been reduced to stochastic risk describable by some probability distribution.

The tiny little problem that there is no hard empirical evidence verifying these models doesn’t seem to bother their protagonists too much. When asked in an interview by George Evans and Seppo Honkapohja (Macroeconomic Dynamics 2005, vol. 9, 561-583) if he thought ”that differences among people’s models are important aspects of macroeconomic policy debates”, Nobel laureate Thomas Sargent replied:

The fact is you simply cannot talk about their differences within the typical rational expectations model. There is a communism of models. All agents within the model, the econometricians, and God share the same model.

One might perhaps find it odd to juxtapose God and people, but I think Leonard Rapping – himself a former rational expectationist – was on the right track (Arjo Klamer, The New Classical Macroeconomics 1984, p 234):

Frankly, I do not think that the rational expectations theorists are in the real world. Their approach is much too abstract.

Building models on rational expectations means we are either Gods or Idiots. Most of us know we are neither. So, God may share Sargent’s and Wren-Lewis’s model, but it certainly isn’t my model.

Economics quote of the century

17 July 2012, 10:42 | Published in Economics | 9 comments

Macroeconomics was born as a distinct field in the 1940s (sic!), as a part of the intellectual response to the Great Depression. The term then referred to the body of knowledge and expertise that we hoped would prevent the recurrence of that economic disaster. My thesis in this lecture is that macroeconomics in this original sense has succeeded: Its central problem of depression-prevention has been solved, for all practical purposes, and has in fact been solved for many decades.

Robert Lucas (2003)

Economists – the Politburo of our time?

16 July 2012, 18:35 | Published in Economics | 1 comment

James Galbraith has a really interesting piece on today’s economics profession. Starting with a quotation from an article he wrote back in 2000, Galbraith writes:

Leading active members of today’s economics profession… have formed themselves into a kind of Politburo for correct economic thinking. As a general rule – as one might generally expect from a gentleman’s club – this has placed them on the wrong side of every important policy issue, and not just recently but for decades. They predict disaster where none occurs. They deny the possibility of events that then happen. … They oppose the most basic, decent and sensible reforms, while offering placebos instead. They are always surprised when something untoward (like a recession) actually occurs. And when finally they sense that some position cannot be sustained, they do not re-examine their ideas. They do not consider the possibility of a flaw in logic or theory. Rather, they simply change the subject. No one loses face, in this club, for having been wrong. No one is dis-invited from presenting papers at later annual meetings. And still less is anyone from the outside invited in.

This remains the essential problem. As I have documented – and only in part – there is a rich and promising body of economics – theory and evidence – entirely suited to the study of the financial crisis and its enormous problems. This work is significant in ways in which the entire corpus of mainstream economics – including recent fashions like the new “behavioral economics” – is not. And it brings great clarity to thinking about the implications of the Great Crisis through which we are still passing today. But where is it, inside the economics profession? Essentially, nowhere.

It is therefore pointless to continue with conversations centered on the conventional economics, futile to keep on arguing with Tweedledum and Tweedledee. The urgent need is instead to expand the academic space and the public visibility of ongoing work that is of actual value when faced with the many deep problems of economic life in our time. The urgent task is to make possible careers in those areas, and for people with those perspectives, that have been proven worthy by events. The followers of John Kenneth Galbraith, of Hyman Minsky and of Wynne Godley can claim this distinction. The task now is to increase their numbers and to reward their work.

The nodal point of the macroeconomics debate

16 July 2012, 14:28 | Published in Economics, Theory of Science & Methodology | 3 comments

This summer both Oxford professor Simon Wren-Lewis and Nobel laureate Paul Krugman have had interesting posts up discussing modern macroeconomics and its alleged needs of microfoundations.

Most ”modern” mainstream neoclassical macroeconomists more or less subscribe to the view that microfoundations somehow have led to better models, enabling us to make better predictions of future macroeconomic events.

Both Wren-Lewis and Krugman are somewhat more sceptical vis-à-vis these expectations.

Wren-Lewis writes:

[S]uppose there is in fact more than one valid microfoundation for a particular aggregate model. In other words, there is not just one, but perhaps a variety of particular worlds which would lead to this set of aggregate macro relationships. (We could use an analogy, and say that these microfoundations were observationally equivalent in aggregate terms.) Furthermore, suppose that more than one of these particular worlds was a reasonable representation of reality. (Among this set of worlds, we cannot claim that one particular model represents the real world and the others do not.) It would seem to me that in this case the aggregate model derived from these different worlds has some utility beyond just one of these microfounded models. It is robust to alternative microfoundations.

In these circumstances, it would seem sensible to go straight to the aggregate model, and ignore microfoundations.

Paul Krugman is also doubtful of the value of microfoundations:

[W]hat we call “microfoundations” are not like physical laws. Heck, they’re not even true. Maximizing consumers are just a metaphor, possibly useful in making sense of behavior, but possibly not. The metaphors we use for microfoundations have no claim to be regarded as representing a higher order of truth than the ad hoc aggregate metaphors we use in IS-LM or whatever; in fact, we have much more supportive evidence for Keynesian macro than we do for standard micro.

Yours truly basically sides with Wren-Lewis and Krugman on this issue, but I will try to explain why one might be even more critical and doubtful than they are regarding the microfoundations of macroeconomics.

Microfoundations today means more than anything else that you try to build macroeconomic models assuming “rational expectations” and hyperrational “representative actors” optimizing over time. Both are highly questionable assumptions.

The concept of rational expectations was first developed by John Muth (1961) and later applied to macroeconomics by Robert Lucas (1972). Those macroeconomic models building on rational expectations microfoundations that are used today among both New Classical and “New Keynesian” macroeconomists basically assume that people on average hold expectations that will be fulfilled. This makes the economist’s analysis enormously simple, since it means that the model used by the economist is the same as the one people use to make decisions and forecasts of the future.
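
The hypothesis itself can be stated in a couple of lines. The following is only a schematic restatement in my own notation – not Muth’s or Lucas’s – but it captures what the assumption amounts to: agents’ subjective expectations are taken to coincide with the mathematical conditional expectation implied by the model itself,

\[ E^{\mathrm{subj}}_t[x_{t+1}] \;=\; E[x_{t+1}\mid \Omega_t], \]

where \( \Omega_t \) is the information set available at time \( t \), so that forecast errors \( \varepsilon_{t+1} = x_{t+1} - E[x_{t+1}\mid \Omega_t] \) have mean zero and are uncorrelated with everything in \( \Omega_t \). Agents may err, but never systematically – which is precisely why the economist’s model and the agents’ model collapse into one.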

Macroeconomic models building on rational expectations microfoundations assume that people, on average, have the same expectations. Someone like Keynes, on the other hand, would argue that people often have different expectations and information, which constitutes the basic rationale behind the macroeconomic need for coordination – something that is rather swept under the rug by the extreme simple-mindedness of assuming rational expectations in representative actors models, so in vogue in New Classical and “New Keynesian” macroeconomics. But if all actors are alike, why do they transact? Who do they transact with? The very reason for markets and exchange seems to slip away with the sister assumptions of representative actors and rational expectations.

Macroeconomic models building on rational expectations microfoundations impute beliefs to the agents that are not based on any real informational considerations, but are simply stipulated to make the models mathematically-statistically tractable. Of course you can make assumptions based on tractability, but then you also have to take into account the necessary trade-off in terms of the ability to make relevant and valid statements about the intended target system. Mathematical tractability cannot be the ultimate arbiter in science when it comes to modeling real world target systems. One could perhaps accept macroeconomic models building on rational expectations microfoundations if they had produced lots of verified predictions and good explanations. But they have done nothing of the kind. Therefore the burden of proof is on those macroeconomists who still want to use models built on these particular unreal assumptions.

In macroeconomic models building on rational expectations microfoundations – where agents are assumed to have complete knowledge of all the relevant probability distribution functions – nothing really new happens, since they take for granted that people’s decisions can be portrayed as based on an existing probability distribution, which by definition implies knowledge of every possible event that can be thought of as taking place (otherwise it is, in a strict mathematical-statistical sense, not really a probability distribution at all).

But in the real world, it is not possible to just assume that probability distributions are the right way to characterize, understand or explain acts and decisions made under uncertainty. When we simply do not know, when we have not got a clue, when genuine uncertainty prevails, macroeconomic models building on rational expectations microfoundations simply will not do. In those circumstances rational expectations is not a useful assumption. The reason is that under those circumstances the future is not like the past, and hence we cannot use the same probability distribution – if it exists at all – to describe both the past and the future.

The future is not reducible to a known set of prospects. It is not like sitting at the roulette table and calculating what the future outcomes of spinning the wheel will be. We have to move beyond macroeconomic models building on rational expectations microfoundations and instead try to build economics on a more realistic foundation – a foundation that encompasses both risk and genuine uncertainty.

Macroeconomic models building on rational expectations microfoundations emanate from the belief that, to be scientific, economics has to be able to model individuals and markets in a stochastic-deterministic way. It’s like treating individuals and markets as the celestial bodies studied by astronomers with the help of gravitational laws. Unfortunately, individuals, markets and entire economies are not planets moving in predetermined orbits in the sky.

To deliver macroeconomic models building on rational expectations microfoundations the economists have to constrain expectations on the individual and the aggregate level to be the same. If revisions of expectations take place, they typically have to take place in a known and prespecified way. This squares badly with what we know to be true in the real world, where fully specified trajectories of future expectations revisions are non-existent.

Further, most macroeconomic models building on rational expectations microfoundations are time-invariant and so give no room for any changes in expectations and their revisions. The only imperfection of knowledge they admit is included in the error terms – error terms that are assumed to be additive and to have a given and known frequency distribution, so that the models can still fully pre-specify the future even when incorporating these stochastic variables.

In the real world there are many different expectations, and these cannot be aggregated in macroeconomic models building on rational expectations microfoundations without giving rise to inconsistency. This is one of the main reasons these models are cast as representative actors models. But this is far from being a harmless approximation to reality. Even the smallest differences of expectations between agents would make these models inconsistent, so when such differences still show up they have to be considered “irrational”.

It is not possible to adequately represent individuals and markets as having one single overarching probability distribution. Accepting that does not imply that we have to end all theoretical endeavours and assume that all agents always act totally irrationally and are only analyzable within behavioural economics. Far from it. It means we acknowledge diversity and imperfection, and that macroeconomics has to be able to incorporate these empirical facts in its models.

Most models in science are representations of something else. Models “stand for” or “depict” specific parts of a “target system” (usually the real world). A model that has neither surface nor deep resemblance to important characteristics of real economies ought to be treated with prima facie suspicion. How could we possibly learn about the real world if there are no parts or aspects of the model that have relevant and important counterparts in the real world target system? The burden of proof lies on the macroeconomists who think they have contributed anything of scientific relevance without even hinting at any bridge enabling us to traverse from model to reality. All theories and models have to use sign vehicles to convey some kind of content that may be used for saying something about the target system. But purpose-built assumptions, made solely to secure a way of reaching deductively validated results in mathematical models, are of little value if they cannot be validated outside of the model.

All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is (no longer) the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

Theories are difficult to directly confront with reality. Economists therefore build models of their theories. Those models are representations that are directly examined and manipulated to indirectly say something about the target systems.

But being able to model a world that somehow could be considered real or similar to the real world is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still possibly serve our pursuit of truth. But then they cannot be unrealistic or false in just any way. The falsehood or unrealism has to be qualified.

The microfounded macromodel should enable us to pose counterfactual questions about what would happen if some variable were to change in a specific way (hence the assumption of structural invariance, which purportedly enables the theoretical economist to do just that). But does it? Applying a “Lucas critique” on most microfounded macromodels, it is obvious that they fail. Changing “policy rules” cannot just be presumed not to influence investment and consumption behavior and a fortiori technology, thereby contradicting the invariance assumption. Technology and tastes cannot live up to the status of an economy’s deep and structurally stable Holy Grail. They too are part and parcel of an ever-changing and open economy.
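
Schematically – and this is only my shorthand for the argument, not Lucas’s own notation – the critique says that if an estimated relation

\[ y_t = f(x_t;\, \theta(\lambda)) \]

has “parameters” \( \theta \) that are in fact functions of the prevailing policy rule \( \lambda \), then simulating a new policy \( \lambda' \) while holding \( \theta(\lambda) \) fixed gives systematically wrong answers. Invariance of \( \theta \) under changes in \( \lambda \) is an assumption, not a finding – and, as argued above, “tastes and technology” have no better claim to such invariance than the aggregate relations they were meant to replace.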

Without export certificates models and theories should be considered unsold. Unfortunately this understanding has not informed modern neoclassical economics, as can be seen by the profuse use of so called representative agent models.

A common feature of modern neoclassical macroeconomics is to use simple (dynamic stochastic) general equilibrium models where representative actors are supposed to have complete knowledge, zero transaction costs and complete markets.

In these models, the actors are all identical. Of course, this has far-reaching analytical implications. Situations characterized by asymmetrical information – situations most of us consider to be innumerable – cannot arise in such models. If the aim is to build a macro-analysis from micro-foundations in this manner, the relevance of the procedure is highly questionable. Robert Solow has even considered the claims made by protagonists of rational agent models “generally phony”.

One obvious critique is that representative agent models do not incorporate distributional effects – effects that often play a decisive role in macroeconomic contexts. Investigations into the operations of markets and institutions usually find that there are overwhelming problems of coordination. These are difficult, not to say impossible, to analyze with the kind of Robinson Crusoe models that, e.g., real business cycle theorists employ, and which exclude precisely those differences between groups of actors that are the driving force in many non-neoclassical analyses.

The choices of different individuals have to be shown to be coordinated and consistent. This is obviously difficult if the macroeconomic models don’t give room for heterogeneous individuals (this lack of appreciation of the importance of heterogeneity is perhaps especially problematic for the modeling of real business cycles in dynamic stochastic general equilibrium models). Representative agent models are certainly more manageable; from a realist point of view, however, they are also less relevant and have a lower explanatory potential. Or as Robert Gordon has it:

In the end, the problem with modern macro is that it contains too much micro and not enough macro. Individual representative agents assume complete and efficient markets and market clearing, while the models ignore the basic macro interactions implied by price stickiness, including macro externalities and coordination failures. In an economywide recession, most agents are not maximizing unconditional utility functions as in DSGE models but are maximizing, i.e., trying to make the best out of a bad situation, under biting income and liquidity constraints. Perceptive comments by others as cited above reject the relevance of modern macro to the current cycle of excess leveraging and subsequent deleveraging, because complete and efficient markets are assumed, and there is no room for default, bankruptcy, insolvency, and illiquidity.

Both the “Lucas critique” and Keynes’ critique of econometrics argued that it was inadmissible to project history onto the future. Consequently an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different “variables” was not enough. If they could not get at the causal structure that generated the data, they were not really “identified”. Lucas himself drew the conclusion that the problem of unstable relations had to be solved by constructing models with clear microfoundations, where forward-looking optimizing individuals and robust, deep, behavioural parameters would remain stable even to changes in economic policies.

In microeconomics we know that aggregation really presupposes homothetic and identical preferences, something that almost never exists in real economies. The results given by these assumptions are therefore not robust and do not capture the underlying mechanisms at work in any real economy. And as if this was not enough, there are obvious problems also with the kind of microeconomic equilibrium that one tries to reduce macroeconomics to. Decisions of consumption and production are described as choices made by a single agent. But then, who sets the prices on the market? And how do we justify the assumption of universal consistency between the choices?
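
To spell out the aggregation point – this is a textbook result, restated here only as a reminder and in my own notation – exact aggregation of individual demands into a single “representative” demand requires indirect utility functions of the Gorman polar form,

\[ v_i(p, m_i) = a_i(p) + b(p)\, m_i, \]

with the same \( b(p) \) for every individual \( i \), so that Engel curves are linear and parallel and aggregate demand depends only on total income \( \sum_i m_i \), not on its distribution. Identical homothetic preferences are a special case. Conditions of this kind are, to put it mildly, not what we observe in real economies.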

Models that are critically based on particular and odd assumptions – and are neither robust nor congruent to real world economies – are of questionable value.

And is it really possible to describe and analyze all the deliberations and choices made by individuals in an economy? Does not the choice of an individual presuppose knowledge and expectations about choices of other individuals? It probably does, and this presumably helps to explain why representative-agent models have become so popular in modern macroeconomic theory. They help to make the analysis more tractable.

One could justifiably argue that one might just as well accept that it is not possible to coherently reduce macro to micro, and accordingly that it is perhaps necessary to forswear microfoundations and the use of rational-agent models altogether. Microeconomic reasoning has to build on macroeconomic presuppositions. Real individuals do not base their choices on operational general equilibrium models, but rather use simpler models. If macroeconomics needs microfoundations, it is equally necessary that microeconomics needs macrofoundations.

The microeconomist Alan Kirman has maintained that the use of representative-agent models is unwarranted and leads to conclusions that are usually both misleading and false. It’s a fiction basically used by some macroeconomists to justify the use of equilibrium analysis and a kind of pseudo-microfoundations. Microeconomists are well aware that the conditions necessary to make aggregation to representative actors possible, are not met in actual economies. As economic models become increasingly complex, their use also becomes less credible.

Even if economies naturally presuppose individuals, it does not follow that we can infer or explain macroeconomic phenomena solely from knowledge of these individuals. Macroeconomics is to a large extent emergent and cannot be reduced to a simple summation of micro-phenomena. Moreover, as we have already argued, even these microfoundations aren’t immutable. Lucas and the new classical economists’ deep parameters – “tastes” and “technology” – are not really the bedrock of constancy that they believe (pretend) them to be.

For Alfred Marshall economic theory was “an engine for the discovery of concrete truth”. But where Marshall tried to describe the behaviour of a typical business with the concept of the “representative firm”, his modern heirs don’t at all try to describe how firms interplay with other firms in an economy. The economy is rather described “as if” it consisted of one single giant firm – either by inflating the optimization problem of the individual to the scale of a whole economy, or by assuming that it’s possible to aggregate different individuals’ actions by a simple summation, since every type of actor is identical. But don’t we just have to face the fact that it is difficult to describe interaction and cooperation when there is essentially only one actor?

Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies and institutions are those based on rational expectations and representative actors. But there is really no support for this conviction at all. On the contrary. If we want to have anything of interest to say on real economies, financial crisis and the decisions and choices real people make, it is high time to place macroeconomic models building on representative actors and rational expectations-microfoundations where they belong – in the dustbin of history.

For if this microfounded macroeconomics has nothing to say about the real world and the economic problems out there, why should we care about it? The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging de facto is made, macroeconomic modelbuilding is little more than hand waving that gives us rather little warrant for making inductive inferences from models to real world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

So – really – this is what the debate basically is all about. It’s not – as Paul Krugman seems to mean (and although I’m usually a big fan, honestly, it’s really difficult to take him seriously here; it’s not even ”brilliantly silly”, but just silly) – a question of ”gadgets”, ”scratchpads” or any other ”brilliantly silly” toy that he or any other neoclassical economist chooses to play around with. That would be to trivialize economics and reduce it to a Glasperlenspiel.

Given that, I would say both Wren-Lewis and Krugman – especially if they really want to call themselves Keynesians of any kind – ought to be even more critical of the microfoundationists than they are. If macroeconomic models – no matter of what ilk – build on microfoundational assumptions of representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, the warrant for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged from model to world is obviously lacking. The incompatibility between actual behaviour and the behaviour in macroeconomic models building on representative actors and rational expectations microfoundations shows the futility of trying to represent real-world economies with models flagrantly at odds with reality.

In the conclusion to his book Models of Business Cycles (1987), Robert Lucas (in)famously wrote (p. 66 & 107-08):

It is a remarkable and, I think, instructive fact that in nearly 50 years that Keynesian tradition has produced not one useful model of the individual unemployed worker, and no rationale for unemployment insurance beyond the observation that, in common with countercyclical cash grants to corporations or to anyone else, it has the effects of increasing the total volume of spending at the right times. By dogmatically insisting that unemployment be classed as ‘involuntary’ this tradition simply cut itself off from serious thinking about the actual options unemployed people are faced with, and hence from learning anything about how the alternative social arrangements might improve these options.

The most interesting recent developments in macroeconomic theory seem to me describable as the reincorporation of aggregative problems such as inflation and the business cycle within the general framework of ‘microeconomic’ theory. If these developments succeed, the term ‘macroeconomics’ will simply disappear from use and the modifier ‘micro’ will become superfluous. We will simply speak, as did Smith, Ricardo, Marshall and Walras, of economic theory. If we are honest, we will have to face the fact that at any given time there will be phenomena that are well-understood from the point of view of the economic theory we have, and other phenomena that are not. We will be tempted, I am sure, to relieve the discomfort induced by discrepancies between theory and facts by saying the ill-understood facts are the province of some other, different kind of economic theory. Keynesian ‘macroeconomics’ was, I think, a surrender (under great duress) to this temptation. It led to the abandonment, for a class of problems of great importance, of the use of the only ‘engine for the discovery of truth’ that we have in economics.

Thanks to latter-day Lucasian New-Classical-New-Keynesian-Rational-Expectations-Representative-Agents-Microfoundations economists, we are supposed not to – like our primitive ancestors – use that archaic term ‘macroeconomics’ anymore (with the possible exception of warning future economists not to give in to ‘discomfort’). Being intellectually heavily indebted to the man who invented macroeconomics – Keynes – yours truly firmly declines to concur.

Microfoundations – and a fortiori rational expectations and representative agents – serve a particular theoretical purpose. And as the history of macroeconomics during the last thirty years has shown, this Lakatosian microfoundations programme for macroeconomics is only methodologically consistent within the framework of a (deterministic or stochastic) general equilibrium analysis. In no other context has it been possible to incorporate this kind of microfoundations, with its “forward-looking optimizing individuals,” into macroeconomic models.

This is of course not by accident. General equilibrium theory is basically nothing else than an endeavour to consistently generalize the microeconomics of individuals and firms on to the macroeconomic level of aggregates.

But it obviously doesn’t work. The analogy between microeconomic behaviour and macroeconomic behaviour is misplaced. Empirically, science-theoretically and methodologically, neoclassical microfoundations for macroeconomics are defective. Tenable foundations for macroeconomics really have to be sought elsewhere.

In an early post on the subject, Simon Wren-Lewis rhetorically asked:

Microfoundations – is there an alternative?

Of course there is an alternative to neoclassical general equilibrium microfoundations! Behavioural economics and Frydman & Goldberg’s ”imperfect knowledge” economics are two noteworthy examples that easily come to mind.

And for those of us who have not forgotten the history of our discipline, and not bought the freshwater nursery tale of Lucas et consortes that Keynes was not “serious thinking,” we can easily see that there exists a macroeconomic tradition inspired by Keynes (a tradition that has absolutely nothing to do with any New Synthesis or “New Keynesianism”).

Its ultimate building-block is the perception of genuine uncertainty and the fact that people often “simply do not know.” Real actors cannot know everything, and their acts and decisions cannot simply be summed or aggregated without the economist risking succumbing to “the fallacy of composition”.

Instead of basing macroeconomics on unreal and unwarranted generalizations of microeconomic behaviour and relations, it is far better to accept the ontological fact that the future to a large extent is uncertain, and to conduct macroeconomics on this fact of reality.

The real macroeconomic challenge is to accept uncertainty and still try to explain why economic transactions take place – instead of simply conjuring the problem away by assuming uncertainty to be reducible to stochastic risk. That is scientific cheating. And it has been going on for too long now.

The Keynes-inspired building-blocks are there, but it is admittedly a long way to go before the whole construction is in place. The sooner we are intellectually honest and ready to admit that ”modern” neoclassical macroeconomics and its microfoundationalist programme have reached the end of the road, the sooner we can redirect our aspirations and knowledge into more fruitful endeavours.

Krugman responding to my critique

15 juli, 2012 kl. 19:53 | Publicerat i Economics | Kommentarer inaktiverade för Krugman responding to my critique

Just noted that Paul Krugman has a new post up responding to my critique of ”brilliantly silly” neoclassical macroeconomics.

Since I’m on tour – again – I’ll have to wait until tomorrow with fresh comments on IS-LM and all the rest of more or less modern macroeconomic gadgetry.

Dumb and dumber in modern macroeconomics

15 juli, 2012 kl. 14:15 | Publicerat i Economics | 8 kommentarer

Oxford professor Simon Wren-Lewis has a new post up today on the elevated gadgets of modern macroeconomics:

So while I think I was right to define modern macro by its methodology, I also agree that this methodology can assume an importance that can be dangerous. I think this is a danger that economics is particularly prone to, because its methodology is essentially deductive in character, and economists are very attached to their rationality axioms. The microfoundation of macroeconomic theory means that macro is subject to that same danger. 

If he had ended his post there, I wouldn’t have had to write any comment at all, but could rather have silently inferred that, after all, we weren’t thinking all that differently on the issue. But recognising the danger of this ”misinterpretation” of his standpoint, Wren-Lewis quickly adds:

That would be a nice way to end this post from a rhetorical point of view, but I suspect if I did, some would misinterpret what I say as implying that the majority of mainstream macroeconomists routinely make mistakes of this kind, or worse still that the microfoundation of macro was a mistake. I believe neither of those things, and I have been rather more positive than Professor Krugman in the past on what modern macro has achieved.

So – here we go!

The purported strength of new-classical and new-Keynesian macroeconomics is that they have firm anchorage in preference-based microeconomics, and especially in the decisions taken by inter-temporal utility maximizing “forward-looking” individuals.

To some of us, however, this has come at too high a price. The almost quasi-religious insistence that macroeconomics has to have microfoundations – without ever presenting either ontological or epistemological justifications for this claim – has turned a blind eye to the weakness of the whole enterprise of trying to depict a complex economy based on an all-embracing representative actor equipped with superhuman knowledge, forecasting abilities and forward-looking rational expectations. It is as if – after having swallowed the sour grapes of the Sonnenschein-Mantel-Debreu theorem – these economists want to resurrect the omniscient Walrasian auctioneer in the form of all-knowing representative actors equipped with rational expectations and assumed to somehow know the true structure of our model of the world (how that could even be conceivable is beyond my imagination, given that the ongoing debate on microfoundations, if anything, shows that not even we, the economists, can come to agreement on a common model).
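
For those who want the theorem rather than the sour grapes, here it is in rough outline (glossing over technical qualifications about the number of agents and about prices being bounded away from zero): Sonnenschein, Mantel and Debreu showed that any continuous function \( z(p) \) satisfying homogeneity of degree zero, \( z(\lambda p) = z(p) \), and Walras’ law, \( p \cdot z(p) = 0 \), can be generated as the aggregate excess demand of some well-behaved exchange economy. Individual rationality thus places next to no restrictions on aggregate excess demand – and neither uniqueness nor stability of equilibrium follows from the micro-assumptions.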

Following the greatest economic depression since the 1930s, the grand old man of modern economic growth theory, Nobel laureate Robert Solow, on July 20, 2010, gave a prepared statement on “Building a Science of Economics for the Real World” for a hearing in the U.S. Congress. According to Solow, modern macroeconomics has not only failed at solving present economic and financial problems, but is “bound” to fail. Building dynamic stochastic general equilibrium (DSGE) models on “assuming the economy populated by a representative agent” – consisting of “one single combination worker-owner-consumer-everything-else who plans ahead carefully and lives forever” – does not pass “the smell test: does this really make sense?” One cannot but concur in Solow’s surmise that a thoughtful person “faced with the thought that economic policy was being pursued on this basis, might reasonably wonder what planet he or she is on.”

Already in 2008 Solow had – in ”The State of Macroeconomics” (Journal of Economic Perspectives 2008:243-249) – told us of what he thought of microfounded modern macroeconomics:

[When modern macroeconomists] speak of macroeconomics as being firmly grounded in economic theory, we know what they mean … They mean a macroeconomics that is deduced from a model in which a single immortal consumer-worker-owner maximizes a perfectly conventional time-additive utility function over an infinite horizon, under perfect foresight or rational expectations, and in an institutional and technological environment that favors universal price-taking behavior …

No one would be driven to accept this story because of its obvious “rightness”. After all, a modern economy is populated by consumers, workers, pensioners, owners, managers, investors, entrepreneurs, bankers, and others, with different and sometimes conflicting desires, information, expectations, capacities, beliefs, and rules of behavior … To ignore all this in principle does not seem to qualify as mere abstraction – that is setting aside inessential details. It seems more like the arbitrary suppression of clues merely because they are inconvenient for cherished preconceptions …

Friends have reminded me that much effort of ‘modern macro’ goes into the incorporation of important deviations from the Panglossian assumptions … [But] a story loses legitimacy and credibility when it is spliced to a simple, extreme, and on the face of it, irrelevant special case. This is the core of my objection: adding some realistic frictions does not make it any more plausible than an observed economy is acting out the desires of a single, consistent, forward-looking intelligence …

It seems to me, therefore, that the claim that ‘modern macro’ somehow has the special virtue of following the principles of economic theory is tendentious and misleading … The other possible defense of modern macro is that, however special it may seem, it is justified empirically. This strikes me as a delusion …

So I am left with a puzzle, or even a challenge. What accounts for the ability of ‘modern macro’ to win hearts and minds among bright and enterprising academic economists? … There has always been a purist streak in economics that wants everything to follow neatly from greed, rationality, and equilibrium, with no ifs, ands, or buts … The theory is neat, learnable, not terribly difficult, but just technical enough to feel like ‘science’. Moreover it is practically guaranteed to give laissez-faire-type advice, which happens to fit nicely with the general turn to the political right that began in the 1970s and may or may not be coming to an end.

So, of course, I could just as well have directed Wren-Lewis to Robert Solow’s article. There the answer to what’s wrong with the modern microfounded macroeconomics of Wren-Lewis et consortes was given already four years ago.

And in case you’re still not convinced – here’s another masterpiece that essentially says it all:

So how did macroeconomics arrive at its current state?

The original impulse to look for better or more explicit micro foundations was probably reasonable. What emerged was not a good idea. The preferred model has a single representative consumer optimizing over infinite time with perfect foresight or rational expectations, in an environment that realizes the resulting plans more or less flawlessly through perfectly competitive forward-looking markets for goods and labor, and perfectly flexible prices and wages.

How could anyone expect a sensible short-to-medium-run macroeconomics to come out of that set-up? My impression is that this approach (which seems now to be the mainstream, and certainly dominates the journals, if not the workaday world of macroeconomics) has had no empirical success; but that is not the point here. I start from the presumption that we want macroeconomics to account for the occasional aggregative pathologies that beset modern capitalist economies, like recessions, intervals of stagnation, inflation, ”stagflation,” not to mention negative pathologies like unusually good times. A model that rules out pathologies by definition is unlikely to help. It is always possible to claim that those ”pathologies” are delusions, and the economy is merely adjusting optimally to some exogenous shock. But why should reasonable people accept this? …

What is needed for a better macroeconomics? [S]ome of the gross implausibilities … need to be eliminated. The clearest candidate is the representative agent. Heterogeneity is the essence of a modern economy. In real life we worry about the relations between managers and shareowners, between banks and their borrowers, between workers and employers, between venture capitalists and entrepreneurs, you name it. We worry about those interfaces because they can and do go wrong, with likely macroeconomic consequences. We know for a fact that heterogeneous agents have different and sometimes conflicting goals, different information, different capacities to process it, different expectations, different beliefs about how the economy works. Representative-agent models exclude all this landscape, though it needs to be abstracted and included in macro-models.

I also doubt that universal rational expectations provide a useful framework for macroeconomics …

Now here is a peculiar thing. When I was in advanced middle age, I suddenly woke up to the fact that my colleagues in macroeconomics, the ones I most admired, thought that the fundamental problem of macro theory was to understand how nominal events could have real consequences. This is just a way of stating some puzzle or puzzles about the sources for sticky wages and prices. This struck me as peculiar in two ways.

First of all, when I was even younger, nobody thought this was a puzzle. You only had to look around you to stumble on a hundred different reasons why various prices and factor prices should be much less than perfectly flexible. I once wrote, archly I admit, that the world has its reasons for not being Walrasian. Of course I soon realized that what macroeconomists wanted was a formal account of price stickiness that would fit comfortably into rational, optimizing models. OK, that is a harmless enough activity, especially if it is not taken too seriously. But price and wage stickiness themselves are not a major intellectual puzzle unless you insist on making them one.

Robert Solow, ”Dumb and dumber in macroeconomics”

Overconfident economists (wonkish)

14 juli, 2012 kl. 18:48 | Publicerat i Economics, Statistics & Econometrics | 1 kommentar

Have you ever thought economists were far more confident in their statements about the world than they had any right to be? Well, now there’s proof.

It comes from Emre Soyer … and his professor Robin Hogarth … What Soyer and Hogarth did was get 257 economists to read about a regression analysis that related independent variable X to dependent variable Y, then answer questions about the probabilities of various outcomes …

When the results were presented in the way empirical results usually are presented in economics journals — as the average outcomes of the regression followed by a few error terms — the economists did a really bad job of answering the questions. They paid too much attention to the averages, and too little to the uncertainties inherent in them, thereby displaying too much confidence.

When the economists were shown the numerical results plus scatter graphs of the same data, they did slightly better. The economists who were shown only the graphs and none of the numerical results, meanwhile, actually got most of the answers right, or close to right.

The bigger point here, which Soyer and Hogarth have elaborated in other research, is that we tend to understand probabilistic information much better when it’s presented in visual form than if we’re just shown the numbers. (This was also a key argument of Sam Savage’s edifying and entertaining 2009 book The Flaw of Averages.) What’s so interesting is to learn that statistically literate experts are just as likely to glom onto the point estimate and discount the uncertainty as, say, innumerate journalists reporting the results of political polls …

Hogarth … said it was important to focus on economists because they’re ”very arrogant people” (he taught for 22 years at the University of Chicago’s business school, so he should know) and tend to rely heavily on regression analyses without really thinking through the implications of those analyses. The whole point of doing a regression is to make a prediction that a relationship between variables discovered in past data will hold up in the future. But economists — despite Milton Friedman’s famous claim that prediction is what the discipline is all about — seem unwilling to make those predictions explicit and express their level of confidence in them, thereby giving short shrift to the uncertainty and error inherent in their work.

”My concern,” Hogarth said, ”is that when reading economics journal articles you get the impression that the world is much more predictable than it is.” That’s true enough. It’s also true of the way forecasts are usually presented outside of academic economics — in business and in government, for example. We focus on the estimate, and put the uncertainty in a footnote, if anywhere. And when somebody tries to communicate their forecasts with the uncertainty front and center, like the Bank of England with its awesome fan charts, they often catch flak for it. Wanting to be more certain about the future than we have any right to be may well be an ineradicable human trait. But hey, at least somebody has identified a treatment: More scatter graphs!

Justin Fox
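
Just to illustrate the point – this is not Soyer and Hogarth’s material, only a small sketch of my own with made-up data, using Python with numpy and matplotlib – compare what a standard regression table tells you with what a scatter plot of the very same data shows:

import numpy as np
import matplotlib.pyplot as plt

# Made-up data: a "significant" relation between X and Y,
# but with a lot of noise around the fitted line.
rng = np.random.default_rng(1)
n = 200
x = rng.normal(0, 1, n)
y = 0.5 * x + rng.normal(0, 2, n)          # true slope 0.5, very noisy

# The "journal table" presentation: point estimate and standard error.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - 2)
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
print(f"slope = {beta[1]:.2f} (s.e. {se[1]:.2f})")   # looks tidy and precise

# The scatter-plot presentation: the same data, uncertainty in plain view.
grid = np.linspace(x.min(), x.max(), 100)
plt.scatter(x, y, alpha=0.4, label="data")
plt.plot(grid, beta[0] + beta[1] * grid, color="red", label="fitted line")
plt.xlabel("X")
plt.ylabel("Y")
plt.legend()
plt.title("Same regression - the scatter shows how little the line predicts")
plt.show()

The coefficient and its standard error look reassuringly exact; the scatter makes it obvious how wide the spread around the fitted line actually is. That, in a nutshell, is Soyer and Hogarth’s point about presentation.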
