In our extended NAIRU model, labor productivity growth is included in the wage bargaining process … The logical consequence of this broadening of the theoretical canvas has been that the NAIRU becomes endogenous itself and ceases to be an attractor — Milton Friedman’s natural, stable and timeless equilibrium point from which the system cannot permanently deviate. In our model, a deviation from the initial equilibrium affects not only wages and prices (keeping the rest of the system unchanged) but also demand, technology, workers’ motivation, and work intensity; as a result, productivity growth and ultimately equilibrium unemployment will change. There is, in other words, nothing natural or inescapable about equilibrium unemployment, as is Friedman’s presumption, following Wicksell; rather, the NAIRU is a social construct, fluctuating in response to fiscal and monetary policies and labor market interventions. Its ephemeral (rather than structural) nature may explain why the best economists working on the NAIRU have persistently failed to agree on how high the NAIRU actually is and how to estimate it.
Many politicians and economists subscribe to the NAIRU story and its policy implication that attempts to promote full employment are doomed to fail, since governments and central banks can’t push unemployment below the critical NAIRU threshold without causing harmful runaway inflation.
Although this may sound convincing, it’s totally wrong!
One of the main problems with NAIRU is that it essentially is a timeless long-run equilibrium attractor to which actual unemployment (allegedly) has to adjust. But if that equilibrium is itself changing — and in ways that depend on the process of getting to the equilibrium — well, then we can’t really be sure what that equilibrium will be without contextualizing unemployment in real historical time. And when we do, we will — as highlighted by Storm and Naastepad — see how seriously wrong we go if we omit demand from the analysis. Demand policy has long-run effects and matters also for structural unemployment — and governments and central banks can’t just look the other way and legitimize their passivity regarding unemployment by referring to NAIRU.
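The path-dependence argument can be made concrete with a toy simulation. The sketch below is purely illustrative (it is not the Storm and Naastepad model; the adjustment rule, the parameters and all the numbers are invented for the example): actual unemployment responds to a demand shock, and the "equilibrium" rate drifts toward whatever unemployment history actually delivers, so a temporary contraction leaves equilibrium unemployment permanently higher.

```python
# Stylized hysteresis sketch (illustrative only; parameters invented):
# actual unemployment u is pushed around by demand shocks, while the
# "equilibrium" rate u_star slowly follows the realized history of u.
def simulate(shock_years, T=60, beta=0.5, gamma=0.3):
    u = u_star = 6.0
    for t in range(T):
        demand_gap = 2.0 if t in shock_years else 0.0  # temporary contraction
        u = u_star + beta * (u - u_star) + demand_gap  # u driven above u_star
        u_star += gamma * (u - u_star)                 # "NAIRU" chases history
    return u_star

no_shock = simulate(shock_years=set())
with_shock = simulate(shock_years={5, 6, 7})
print(f"equilibrium rate without shock: {no_shock:.2f}")
print(f"equilibrium rate after a temporary shock: {with_shock:.2f}")
```

Because the equilibrium rate chases realized unemployment, the long-run "attractor" depends on the path taken to reach it, which is exactly what makes a unique, timeless NAIRU unidentifiable.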
The existence of long-run equilibrium is a very handy modeling assumption to use. But that does not make it easily applicable to real-world economies. Why? Because it is basically a timeless concept utterly incompatible with real historical events. In the real world it is the second law of thermodynamics and historical — not logical — time that rules.
This importantly means that long-run equilibrium is an awfully bad guide for macroeconomic policies. In a world full of genuine uncertainty, multiple equilibria, asymmetric information and market failures, the long run equilibrium is simply a non-existent unicorn.
NAIRU does not hold water simply because it does not exist — and to base economic policies on such a weak theoretical and empirical construct is nothing short of writing out a prescription for self-inflicted economic havoc.
NAIRU is a useless concept, and the sooner we bury it, the better.
Almost everything we do these days leaves some kind of data trace in some computer system somewhere. When such data is aggregated into huge databases it is called “Big Data”. It is claimed social science will be transformed by the application of computer processing and Big Data. The argument is that social science has, historically, been “theory rich” and “data poor” and now we will be able to apply the methods of “real science” to “social science” producing new validated and predictive theories which we can use to improve the world.
What’s wrong with this? … Firstly what is this “data” we are talking about? In its broadest sense it is some representation, usually in a symbolic form, that is machine readable and processable. And how will this data be processed? Using some form of machine learning or statistical analysis. But what will we find? Regularities or patterns … What do such patterns mean? Well that will depend on who is interpreting them …
Looking for “patterns or regularities” presupposes a definition of what a pattern is and that presupposes a hypothesis or model, i.e. a theory. Hence big data does not “get us away from theory” but rather requires theory before any project can commence.
What is the problem here? The problem is that a certain kind of approach is being propagated within the “big data” movement that claims to not be a priori committed to any theory or view of the world. The idea is that data is real and theory is not real. That theory should be induced from the data in a “scientific” way.
I think this is wrong and dangerous. Why? Because it is not clear or honest while appearing to be so. Any statistical test or machine learning algorithm expresses a view of what a pattern or regularity is and any data has been collected for a reason based on what is considered appropriate to measure. One algorithm will find one kind of pattern and another will find something else. One data set will evidence some patterns and not others. Selecting an appropriate test depends on what you are looking for. So the question posed by the thought experiment remains “what are you looking for, what is your question, what is your hypothesis?”
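The point that every "pattern detector" embodies a prior theory of what a pattern is can be shown in a few lines of code. Both detectors below are toy examples running on invented data: a least-squares fit can only ever report a trend, and a threshold split can only ever report groups, so what the data "reveal" is decided before the data are ever looked at.

```python
# Two "pattern detectors" applied to the same toy data: each algorithm
# can only find the kind of pattern it was built to look for.
data = [(0, 1.0), (1, 1.2), (2, 0.9), (8, 5.1), (9, 4.8), (10, 5.2)]

# Detector 1: least-squares line, which presupposes "pattern" = linear trend.
n = len(data)
sx = sum(x for x, _ in data); sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data); sxy = sum(x * y for x, y in data)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
print(f"linear-trend view: slope = {slope:.2f}")

# Detector 2: threshold split, which presupposes "pattern" = two groups.
low = [y for x, y in data if x < 5]
high = [y for x, y in data if x >= 5]
print(f"two-group view: means = {sum(low)/len(low):.2f} and {sum(high)/len(high):.2f}")
```

Both outputs are "patterns in the data", but neither was induced from the data alone: the trend view and the group view were each built into the method before it ran.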
Ideas matter. Theory matters. Big data is not a theory-neutral way of circumventing the hard questions. In fact it brings these questions into sharp focus and it’s time we discuss them openly.
This paper … looks back into the pre-crisis (pre-2007) intellectual history of macroeconomic theory and argues that modern macro neglects the basic sources of both impulses and propagation mechanisms of business cycles. The basic problem is that modern macro consists of too much micro and not enough macro. Focus on individual preferences and production functions misses the essence of macro fluctuations — the coordination failures and macro externalities that convert interactions among individual choices into constraints that prevent workers from optimizing hours of work and firms from optimizing sales, production, and utilization. Also modern business-cycle macro has too narrow a view of the range of aggregate demand shocks that in the presence of sticky prices constrain the choices of workers and firms. Shocks that have little or nothing to do with technology, preferences, or monetary policy can interact and impose constraints on individual choices …
Modern business cycle macro is littered with contradictions resulting from its attempts to combine market clearing and utility maximization at the level of the individual household with a form of price rigidity or friction. Once the baby of full price flexibility has been thrown out, the bathwater must be changed because price rigidity is logically incompatible with market clearing … The contradictions come when modern macroeconomists attempt to explain non-market-clearing outcomes with market‐clearing language, or in Blanchard’s (2008) words “movements take place along a labor supply curve … this may give a misleading description of fluctuations.”
If all agents are supposed to have rational expectations, it becomes convenient to assume also that they all have the same expectation and thence tempting to jump to the conclusion that the collective of agents behaves as one. The usual objection to representative agent models has been that they fail to take into account well-documented systematic differences in behaviour between age groups, income classes, etc. In the financial crisis context, however, the objection is rather that these models are blind to the consequences of too many people doing the same thing at the same time, for example, trying to liquidate very similar positions at the same time. Representative agent models are peculiarly subject to fallacies of composition. The representative lemming is not a rational expectations intertemporal optimising creature. But he is responsible for the fat tail problem that macroeconomists have the most reason to care about …
For many years now, the main alternative to Real Business Cycle Theory has been a somewhat loose cluster of models given the label of New Keynesian theory. New Keynesians adhere on the whole to the same DSGE modeling technology as RBC macroeconomists but differ in the extent to which they emphasise inflexibilities of prices or other contract terms as sources of short-term adjustment problems in the economy. The “New Keynesian” label refers back to the “rigid wages” brand of Keynesian theory of 40 or 50 years ago. Except for this stress on inflexibilities this brand of contemporary macroeconomic theory has basically nothing Keynesian about it.
The obvious objection to this kind of return to an earlier way of thinking about macroeconomic problems is that the major problems that have had to be confronted in the last twenty or so years have originated in the financial markets – and prices in those markets are anything but “inflexible”.
And still mainstream economists seem to be impressed by the ‘rigour’ brought to macroeconomics by New-Classical-New-Keynesian DSGE models with their rational expectations and microfoundations!
It is difficult to see why.
Take the rational expectations assumption. Rational expectations in the mainstream economists’ world implies that relevant distributions have to be time independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality it is straining one’s beliefs to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons. In the world of the rational expectations hypothesis we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that’s also the reason why allowing for cases where agents make ‘predictable errors’ in DSGE models doesn’t take us any closer to a relevant and realist depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say on real economies, financial crises, and the decisions and choices real people make, we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations than childish roulette and urn analogies.
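The ensemble-versus-single-realization point can be illustrated with a standard multiplicative-growth toy example (all numbers are invented for the illustration; this is not a model of any actual economy). The average over an ensemble of hypothetical economy-worlds grows without bound, while essentially every individual history, lived through in historical time, decays.

```python
import random

# Toy non-ergodic process (illustrative only): each period a quantity is
# multiplied by 1.5 or 0.6 with equal probability. The ensemble mean grows,
# since the expected factor is 0.5*1.5 + 0.5*0.6 = 1.05 > 1, but the typical
# single path shrinks, since the expected log factor is negative.
random.seed(1)
T = 1000
ensemble_mean = (0.5 * 1.5 + 0.5 * 0.6) ** T  # average over all possible worlds

w = 1.0  # one single realized history
for _ in range(T):
    w *= random.choice((1.5, 0.6))

print(f"ensemble average after {T} periods: {ensemble_mean:.2e}")
print(f"one realized path after {T} periods: {w:.2e}")
```

The ensemble statistic exists only as a mathematical average over worlds that never happen; the one path we actually live through looks nothing like it, which is the sense in which an economy is a single realization and not a draw from a replicable urn.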
‘Rigorous’ and ‘precise’ DSGE models cannot be considered anything else than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence has ever been presented.
No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, they do not push economic science forwards one single millimeter if they do not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not say anything about real world economies.
Proving things ‘rigorously’ in DSGE models is at most a starting-point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.
Mainstream economists think there is a gain from the DSGE style of modeling in its capacity to offer some kind of structure around which to organise discussions. To me that sounds more like a religious theoretical-methodological dogma, where one paradigm rules in divine hegemony. That’s not progress. That’s the death of economics as a science.
I think there is an element of truth in the view that the superstition that the budget must be balanced at all times [is necessary]. Once it is debunked, [it] takes away one of the bulwarks that every society must have against expenditure out of control. There must be discipline in the allocation of resources or you will have anarchistic chaos and inefficiency. And one of the functions of old fashioned religion was to scare people by sometimes what might be regarded as myths into behaving in a way that the long-run civilized life requires. We have taken away a belief in the intrinsic necessity of balancing the budget if not in every year, [and then] in every short period of time. If Prime Minister Gladstone came back to life he would say “oh, oh what you have done” and James Buchanan argues in those terms. I have to say that I see merit in that view.
Samuelson’s statement makes me come to think of the following passage in Keynes’ General Theory:
The ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood. Indeed the world is ruled by little else. Practical men, who believe themselves to be exempt from any intellectual influences, are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back.
Wonder why …
I discussed long ago what it means to be heterodox in economics. Bob Kuttner, whom I once saw giving a talk at the New School (in the 1990s), a very sharp journalist who knows quite a bit about economics, sings the praises of Dani Rodrik as a heterodox economist …
Rodrik is, or was a few years ago at least, in what Colander, Holt and Rosser refer to as the cutting edge of the profession … which is to say he is heterodox in the same way that Joe Stiglitz or Paul Krugman are heterodox. They are willing to suggest that some imperfections make the laissez-faire dream of the most fundamentalist neoclassical authors somewhat overstated. But just as Krugman and Stiglitz accept the conventional macro model, with the natural rate hypothesis, Rodrik essentially accepts the basic Heckscher-Ohlin-Samuelson trade model. He is a moderate neoclassical economist; a potty trained one if you will, but certainly not heterodox.
Economics students today are complaining more and more about the way economics is taught. The lack of fundamental diversity — not just path-dependent elaborations of the mainstream canon — and the narrowing of the curriculum dissatisfy econ students all over the world. The frustrating lack of real-world relevance has led many of them to demand that the discipline start to develop a more open and pluralistic theoretical and methodological attitude.
Dani Rodrik has little understanding of these views, finding it hard to ‘understand these complaints in the light of the patent multiplicity of models within economics.’ Rodrik shares the view of his colleagues Paul Krugman, Greg Mankiw and Simon Wren-Lewis — all of whom he approvingly cites in his book Economics Rules — that there is nothing basically wrong with ‘standard theory’ and ‘economics textbooks.’ As long as policy makers and economists stick to ‘standard economic analysis’ everything is fine. Economics is just a method that makes us ‘think straight’ and ‘reach correct answers.’
Writes Rodrik in Economics Rules:
Pluralism with respect to conclusions is one thing; pluralism with respect to methods is something else … An aspiring economist has to formulate clear models … These models can incorporate a wide range of assumptions … but not all assumptions are equally acceptable. In economics, this means that the greater the departure from benchmark assumptions, the greater the burden of justifying and motivating why those departures are needed …
Some methods are better than others … For some these constraints represent a kind of methodological straitjacket that crowds out new thinking. But it is easy to exaggerate the rigidity of the rules within which the profession operates.
Young economics students who want to see a real change in economics and the way it’s taught have to look beyond Rodrik, Mankiw, Krugman & Co. Those future economists who really want something other than the same old mainstream neoclassical catechism; those who really don’t want to be force-fed with mainstream neoclassical deductive-axiomatic analytical formalism, have to look elsewhere.
Like Stiglitz and Krugman, Rodrik likes to present himself as a kind of pluralist anti-establishment economics iconoclast, but when it really counts, he shows what he is — a mainstream neoclassical economist fanatically defending the relevance of standard economic modeling strategies. In other words — no heterodoxy where it really would count.
Almost all the change and diversity that Rodrik applauds only takes place within the analytic-formalistic modeling strategy that makes up the core of mainstream economics. All the flowers that do not live up to the precepts of the mainstream methodological canon are pruned. You’re free to take your analytical formalist models and apply them to whatever you want – as long as you do it using a modeling methodology acceptable to the mainstream. If you do not follow this particular mathematical-deductive analytical formalism you’re not even considered doing economics. “If it isn’t modeled, it isn’t economics.” This isn’t pluralism. It’s a methodological reductionist straitjacket.
In Rodrik’s world “newer generations of models do not render the older generations wrong or less relevant,” but “simply expand the range of the discipline’s insights.” I don’t want to sound derisory or patronizing, but although it’s easy to say what Rodrik says, we cannot have our cake and eat it. Analytical formalism doesn’t save us from either specifying the intended areas of application of the models, or having to accept them as rival models facing the risk of being put to the test and found falsified.
The insistence on using analytical formalism and mathematical methods comes at a high cost — it often makes the analysis irrelevant from an empirical-realist point of view.
No matter how many thousands of models mainstream economists come up with, as long as they are just axiomatic variations of the same old mathematical-deductive ilk, they are not heterodox in any substantial way, and they will not take us one single inch closer to relevant and usable means for furthering our understanding and explanation of real economies.
Scientific progress … is frequently the result of observation that something does work, which runs far ahead of any understanding of why it works.
Not within the economics profession. There, deductive reasoning based on logical inference from a specific set of a priori deductions is “exactly the right way to do things”. What is absurd is not the use of the deductive method but the claim to exclusivity made for it. This debate is not simply about mathematics versus poetry. Deductive reasoning necessarily draws on mathematics and formal logic: inductive reasoning, based on experience and above all careful observation, will often make use of statistics and mathematics …
The belief that models are not just useful tools but are capable of yielding comprehensive and universal descriptions of the world blinded proponents to realities that had been staring them in the face. That blindness made a big contribution to our present crisis, and conditions our confused responses to it.
The ‘deductivist blindness’ of mainstream economics largely explains why it contributes to causing economic crises rather than to solving them. But where does this ‘deductivist blindness’ of mainstream economics come from? To answer that question we have to examine the methodology of mainstream economics.
The insistence on constructing models showing the certainty of logical entailment has been central in the development of mainstream economics. Insisting on formalistic (mathematical) modeling has more or less forced the economist to give up on realism and substitute axiomatics for real-world relevance. The price paid for the illusory rigour and precision has been monumentally high.
This deductivist orientation is the main reason behind the difficulty that mainstream economics has in terms of understanding, explaining and predicting what takes place in our societies. But it has also given mainstream economics much of its discursive power – at least as long as no one starts asking tough questions on the veracity of – and justification for – the assumptions on which the deductivist foundation is erected. Asking these questions is an important ingredient in a sustained critical effort at showing how nonsensical is the embellishing of a smorgasbord of models founded on wanting (often hidden) methodological foundations.
The mathematical-deductivist straitjacket used in mainstream economics presupposes atomistic closed-systems – i.e., something that we find very little of in the real world, a world significantly at odds with an (implicitly) assumed logic world where deductive entailment rules the roost. Ultimately then, the failings of modern mainstream economics have their root in a deficient ontology. The kind of formal-analytical and axiomatic-deductive mathematical modeling that makes up the core of mainstream economics is hard to make compatible with a real-world ontology. It is also the reason why so many critics find mainstream economic analysis patently and utterly unrealistic and irrelevant.
Although there has been a clearly discernible increase and focus on ’empirical’ economics in recent decades, the results in these research fields have not fundamentally challenged the main deductivist direction of mainstream economics. They are still mainly framed and interpreted within the core axiomatic assumptions of individualism, instrumentalism and equilibrium that make up even the ‘new’ mainstream economics. Although, perhaps, a sign of an increasing – but highly path-dependent – theoretical pluralism, mainstream economics is still, from a methodological point of view, mainly a deductive project erected on a foundation of empty formalism.
There was an unusual degree of consensus among economists about what would happen if Britain voted for Brexit in the referendum on June 23 last year. The language used by the International Monetary Fund was typical: It expressed fears of an “abrupt reaction,” adding that this “may have already begun” …
What happened instead was that Britain enjoyed the best growth of any major advanced economy in 2016 … Andy Haldane compared the pitfalls of economic prediction to the single most famously wrong weather forecast in British history, made on the BBC on Oct. 15, 1987. A woman had called the BBC to say she was worried there was a hurricane on the way. “Don’t worry, there isn’t,” the weatherman responded. That night, 22 people died amid hurricane-force winds …
The reason this poses a deep intellectual crisis for macro-economics is that the entire point of the field, as it has developed since the work of John Maynard Keynes in the 1930s, is to prevent just this sort of severe downturn. Keynes once spoke of a future in which economists would be “humble, competent people on a level with dentists” … It seems to me, though, that what macroeconomists do is really most like bomb disposal. Uniquely in the social sciences and humanities, macroeconomics was developed with a specific, real-world purpose, and a negative purpose to boot: to stop anything like the Great Depression from ever happening again. Given this goal — to avert systemic crises and downturns — the credit crunch and the Great Recession were, for macroeconomics, an intellectual disaster.
In retrospect, the failure of the discipline to predict and prevent the crisis was based on deep conceptual faults. One of these concerned a mysterious refusal to engage with the role of the banking and finance system in the economy. Another was the assumption that the discipline makes about individual motivations, assuming that individuals “optimize” their decision-making to behave, in economic terms, rationally. This is a convenient intellectual shortcut for building models, but it is also a fiction, as we know not just from our own human experience but even from within economics itself, where microeconomics has recently made exciting progress in the study of human irrationality, bias and cognitive error. It is a matter of provable fact that our decision-making is not entirely rational. Economic models built on the premise of our rationality will always have a creaky underpinning.
Reading Lancaster’s article is certainly a very worrying confirmation of what Paul Romer wrote a couple of months ago — modern macroeconomics is becoming more and more a total waste of time.
One of the problems with macroeconomics that Lancaster doesn’t discuss is its obsessive mathematization since WW II. This has made mainstream neoclassical economists more or less obsessed with formal, deductive-axiomatic models. Confronted with the critique that they do not solve real problems, they often react as Saint-Exupéry’s Great Geographer, who, in response to the questions posed by The Little Prince, says that he is too occupied with his scientific work to be able to say anything about reality. Confronting economic theory’s lack of relevance and ability to tackle real problems, these economists retreat to the wonderful world of economic models. They enter the tool shed — and stay there. While the economic problems in the world around us steadily increase, they are rather happily playing along with the latest toys in the mathematical toolbox.
Instead of making the model the message, I think we are better served by economists who more than anything else try to contribute to solving real problems. And then the motto of John Maynard Keynes is more valid than ever:
It is better to be vaguely right than precisely wrong
Lynn Parramore: Do you think there are lessons in what has happened in the Eurozone for students of economics and the way the subject is taught?
Mario Seccareccia: Yes, indeed. Ever since the establishment of the modern nation-state in the late eighteenth and nineteenth centuries, the creation of the euro was perhaps the first significant experiment in modern times in which there was an attempt to separate money from the state, that is, to denationalize currency, as some right-wing ideologues and founders of modern neoliberalism, such as Friedrich von Hayek, had advocated. What the Eurozone crisis teaches is that this perception of how the monetary system works is quite wrong, because, in times of crisis, the democratic state must be able to spend money in order to meet its obligations to its citizens. The denationalization or “supra-nationalization” of money that happened with the establishment of the Eurozone took away from elected national governments the capacity to meaningfully manage their economies. Unless governments in the Eurozone are able to renegotiate significant control over and access to money from their own central banks, the system will be continually plagued with crises and will probably collapse in the longer term.