Krugman’s modeling flim-flam

30 Apr, 2016 at 11:15 | Posted in Economics | 3 Comments

Paul Krugman has a piece up on his blog this week arguing that the ‘discipline of modeling’ is a sine qua non for tackling politically and emotionally charged economic issues:

You might say that the way to go about research is to approach issues with a pure heart and mind: seek the truth, and derive any policy conclusions afterwards. But that, I suspect, is rarely how things work. After all, the reason you study an issue at all is usually that you care about it, that there’s something you want to achieve or see happen. Motivation is always there; the trick is to do all you can to avoid motivated reasoning that validates what you want to hear.

In my experience, modeling is a helpful tool (among others) in avoiding that trap, in being self-aware when you’re starting to let your desired conclusions dictate your analysis. Why? Because when you try to write down a model, it often seems to lead some place you weren’t expecting or wanting to go. And if you catch yourself fiddling with the model to get something else out of it, that should set off a little alarm in your brain.

Hmm …

So when Krugman and other ‘modern’ mainstream economists use their models — standardly assuming rational expectations, Walrasian market clearing, unique equilibria, time invariance, linear separability and homogeneity of both inputs/outputs and technology, infinitely lived intertemporally optimizing representative agents with homothetic and identical preferences, etc. — and standardly ignoring complexity, diversity, uncertainty, coordination problems, non-market clearing prices, real aggregation problems, emergence, expectations formation, etc. — we are supposed to believe that this somehow helps them ‘to avoid motivated reasoning that validates what you want to hear.’

Yours truly is, to say the least, far from convinced. The alarm that goes off in my brain is that this, rather than being helpful for understanding real-world economic issues, sounds more like an ill-advised plaidoyer for voluntarily taking on a methodological straitjacket of unsubstantiated and known-to-be-false assumptions.

‘New Keynesian’ schizophrenia

29 Apr, 2016 at 20:57 | Posted in Economics | Comments Off on ‘New Keynesian’ schizophrenia

Taking part in the debate on microfoundations among macroeconomists these days, I wonder if Heinz-Peter Spahn isn’t more on the right track than those who desperately offer more or less contrived defenses of the microfoundationalist programme:

The crucial point however is: market conditions, which are presupposed in the model of intertemporal choice, are not given in reality. Distributing consumption optimally over time depends on the possibility of individuals to lend money on their permanent income, if temporary periods of low market income are to be bridged. Because this perfect financial market does not exist, consumption behaviour necessarily depends strongly on current income. Consumers know that their future expected income is distorted by spells of unemployment, the occurrence of which is hard to predict though; these quantity constraints are important also for firms …

Professional modern economics appear to suffer from schizophrenia as in the field of financial-market economics all these deviations from the Utopian ideal market are well known (information asymmetries etc.), which are stubbornly ignored when it comes to talk about macroeconomics in NKM [New Keynesian Macroeconomics]. The assumption of complete markets means that all agents’ intertemporal budget constraints always are satisfied, bankruptcies and insolvencies are impossible. The NKM world is populated by agents who never default …  Basically, NKM designs a non-monetary economy … Questions regarding financial instability cannot be answered within these models; they cannot even be asked …

NKM faces an uncomfortable trade-off. On the one hand, General Equilibrium Theory has shown that preferences and behaviour of heterogeneous agents cannot simply be aggregated. Variances between individuals matter! The Sonnenschein-Mantel-Debreu problem states that choices may not be transitive; the representative agent’s ranking differs from individual rankings; reactions to shock may be different … On the other hand, if people are assumed to be identical, NKM may keep the representative agent, but as a consequence the model has no interaction of agents, no distribution problems, no asymmetric information and no meaningful stock market.

The critique so far may appear as unfair as it neglects the various refinements that were proposed in order to develop and improve the basic model set-up … But these extensions of NKM – due to the Walrasian method – yield many precise-looking results … but do not grasp the impact of bank credit on goods demand, market income and employment in a typical monetary economy.

Heinz-Peter Spahn

The money multiplier – neat, plausible, and utterly wrong

28 Apr, 2016 at 18:02 | Posted in Economics | 9 Comments

The neoclassical textbook concept of the money multiplier assumes that banks automatically expand the credit money supply to a multiple of their aggregate reserves. If the required reserve-deposit ratio is 5%, the money supply should be about twenty times larger than the aggregate reserves of banks. In this way the money multiplier concept assumes that the central bank controls the money supply by setting the required reserve ratio.
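The textbook arithmetic being summarized here can be sketched as a geometric series of re-deposited loans — a minimal illustration of the story this post goes on to reject, using the 5% reserve ratio from the text:

```python
# Textbook deposit-expansion story: each round, banks lend out everything
# except required reserves, and the loan comes back as a new deposit.
def textbook_multiplier(reserve_ratio: float, rounds: int = 1000) -> float:
    """Total deposits created per unit of initial reserves."""
    total, deposit = 0.0, 1.0
    for _ in range(rounds):
        total += deposit
        deposit *= (1.0 - reserve_ratio)  # next round's re-deposited loan
    return total

m = textbook_multiplier(0.05)
print(round(m, 2))  # 20.0 — the series converges to 1 / 0.05
```

In the textbook story the multiplier is simply the reciprocal of the required reserve ratio, which is exactly the "twenty times" claim above.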

In his Macroeconomics – just to take an example – Greg Mankiw writes:

We can now see that the money supply is proportional to the monetary base. The factor of proportionality … is called the money multiplier … Each dollar of the monetary base produces m dollars of money. Because the monetary base has a multiplied effect on the money supply, the monetary base is called high-powered money.

The money multiplier concept is – as can be seen from the quote above – nothing but one big fallacy. This is not the way credit is created in a monetary economy. It’s nothing but a monetary myth that the monetary base can play such a decisive role in a modern credit-run economy with fiat money.

In the real world banks first extend credits and then look for reserves. So the money multiplier basically also gets the causation wrong. At a deep fundamental level the supply of money is endogenous.
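The reversed causation — credit first, reserves afterwards — can be sketched with a toy balance sheet. This is a hypothetical illustration of my own; the bank, figures, and reserve ratio are all invented:

```python
# Toy endogenous-money bookkeeping: a loan is granted first, which
# simultaneously creates a deposit; the reserve requirement is then met
# afterwards by obtaining reserves (e.g. borrowing from the central bank).
class ToyBank:
    def __init__(self):
        self.loans = 0.0
        self.deposits = 0.0
        self.reserves = 0.0
        self.borrowed_reserves = 0.0

    def extend_credit(self, amount: float, required_ratio: float = 0.05):
        self.loans += amount     # asset side: new loan
        self.deposits += amount  # liability side: matching deposit, created at a stroke
        shortfall = required_ratio * self.deposits - self.reserves
        if shortfall > 0:        # reserves are sought *after* the loan is made
            self.reserves += shortfall
            self.borrowed_reserves += shortfall

bank = ToyBank()
bank.extend_credit(100.0)
print(bank.deposits, round(bank.borrowed_reserves, 2))  # 100.0 5.0
```

Note the order of operations: the deposit exists before any reserves are found, which is the opposite of the multiplier story's causal chain.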

One may rightly wonder why on earth this pet neoclassical fairy tale is still in the textbooks and taught to economics undergraduates. Giving the impression that banks exist simply to passively transfer savings into investment, it is such a gross misrepresentation of what goes on in the real world, that there is only one place for it — and that is in the garbage can!

Economic rebellion

24 Apr, 2016 at 08:35 | Posted in Economics | 8 Comments


Listen to the program here.

Mainstream economists like Paul Krugman and Simon Wren-Lewis think that yours truly and other heterodox economists are wrong in blaming mainstream economics for not being real-world relevant and pluralist. To Krugman there is nothing wrong with ‘standard theory’ and ‘economics textbooks.’ If only policy makers and economists stuck to ‘standard economic analysis,’ everything would be just fine.

I’ll be dipped! If there’s anything the last decade has shown us, it is that economists have gone astray in their tool shed. Krugman’s ‘standard theory’ — mainstream neoclassical economics — has contributed to causing today’s economic crisis rather than to solving it.

A small ray of hope

23 Apr, 2016 at 16:14 | Posted in Economics | 3 Comments


I overheard a conversation between two high school students this morning.

The first person was asking about which classes the second was going to take next. One of those mentioned was microeconomics.

“Oh, that’s easy” said the first, “You just have to remember that it’s all rubbish – they want you to believe that people are rational, and that there’s all this perfection in the world.”

“Really?” responded the second, “That’s really dumb. I wonder why they do that?”

“It doesn’t matter, it’s economics.”

“Well maybe I’ll take history instead, at least I might learn something useful.”

Peter Radford

Long run demand effects

23 Apr, 2016 at 12:44 | Posted in Economics | 1 Comment

In the standard mainstream economic analysis — take a quick look in e.g. Mankiw’s or Krugman’s textbooks — a demand expansion may very well raise measured productivity in the short run. But in the long run, expansionary demand policy measures cannot lead to sustained higher productivity and output levels.

In some non-standard heterodox analyses, however, labour productivity growth is often described as a function of output growth. The rate of technical progress varies directly with the rate of growth according to the Verdoorn law. Growth and productivity are in this view highly demand-determined, not only in the short run but also in the long run.
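In its common linear form the Verdoorn law is written p = a + b·q, where p is productivity growth, q is output growth, and b is the Verdoorn coefficient (often estimated at around 0.5 in the country-level literature). A minimal sketch, fitting b by ordinary least squares on invented, noise-free data:

```python
# Fit the Verdoorn coefficient b in p = a + b*q by ordinary least squares.
def ols_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

# Made-up annual growth rates (percent): productivity generated as p = 1 + 0.5*q.
output_growth = [1.0, 2.0, 3.0, 4.0, 5.0]
productivity = [1.0 + 0.5 * q for q in output_growth]

b = ols_slope(output_growth, productivity)
print(round(b, 2))  # 0.5 — recovers the Verdoorn coefficient used to generate the data
```

Real estimation is, of course, far messier — the direction of causality between q and p is exactly the econometric problem discussed below.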

Given that the Verdoorn law is operative, expansionary economic policies may actually lead to increases in productivity and growth. Living in a world permeated by genuine Keynes-type uncertainty, we cannot, of course, forecast with any great precision how large those effects would be.

So, the nodal point is — has the Verdoorn law been validated in empirical studies or not?

There have been hundreds of studies that have tried to answer that question, and, as might be imagined, the answers differ. The law has been investigated with different econometric methods (time-series, IV, OLS, ECM, cointegration, etc.). The statistical and econometric problems are enormous (especially when it comes to the question of the direction of causality). That said, most studies at the country level do confirm that the Verdoorn law holds.

Conclusion: demand policy measures may have long run effects.

Axel Leijonhufvud on the road not taken

23 Apr, 2016 at 09:08 | Posted in Economics | Comments Off on Axel Leijonhufvud on the road not taken

The orthodox Keynesianism of the time did have a theoretical explanation for recessions and depressions. Proponents saw the economy as a self-regulating machine in which individual decisions typically lead to a situation of full employment and healthy growth. The primary reason for periods of recession and depression was that wages did not fall quickly enough. If wages could fall rapidly and extensively enough, then the economy would absorb the unemployed. Orthodox Keynesians also took Keynes’ approach to monetary economics to be similar to that of the classical economists.

Leijonhufvud got something entirely different from reading the General Theory. The more he looked at his footnotes, originally written in puzzlement at the disparity between what he took to be the Keynesian message and the orthodox Keynesianism of his time, the more confident he felt. The implications were amazing. Had the whole discipline catastrophically misunderstood Keynes’ deeply revolutionary ideas? Was the dominant economics paradigm deeply flawed and a fatally wrong turn in macroeconomic thinking? And if this was the case, what was Keynes actually proposing?

Leijonhufvud’s “Keynesian Economics and the Economics of Keynes” exploded onto the academic stage the following year; no mean feat for an economics book that did not contain a single equation. The book took no prisoners and aimed squarely at the prevailing metaphor about the self-regulating economy and the economics of the orthodoxy. He forcefully argued that the free movement of wages and prices can sometimes be destabilizing and could move the economy away from full employment.

Arjun Jayadev

A must-read (not least because of the interview videos where Leijonhufvud gets the opportunity to comment on the ‘madness’ of modern mainstream macroeconomics)!

If macroeconomic models — no matter of what ilk — build on microfoundational assumptions of representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, the warrants for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged over to real-world target systems are obviously non-justifiable. Trying to represent real-world target systems with models flagrantly at odds with reality is futile. And whether those models are New Classical or ‘New Keynesian’ makes very little difference.

So, indeed, there really is something about the way macroeconomists construct their models nowadays that obviously doesn’t sit right.

Fortunately — when you’ve got tired of the kind of macroeconomic apologetics produced by ‘New Keynesian’ macroeconomists — there are still some real Keynesian macroeconomists to read. One of them is Axel Leijonhufvud.

Macroeconomic just-so stories you really do not want to buy

22 Apr, 2016 at 17:30 | Posted in Economics | Comments Off on Macroeconomic just-so stories you really do not want to buy

Thus your standard New Keynesian model will use Calvo pricing and model the current inflation rate as tightly coupled to the present value of expected future output gaps. Is this a requirement anyone really wants to put on the model intended to help us understand the world that actually exists out there? Thus your standard New Keynesian model will calculate the expected path of consumption as the solution to some Euler equation plus an intertemporal budget constraint, with current wealth and the projected real interest rate path as the only factors that matter. This is fine if you want to demonstrate that the model can produce macroeconomic pathologies. But is it a not-stupid thing to do if you want your model to fit reality?

I remember attending the first lecture in Tom Sargent’s evening macroeconomics class back when I was an undergraduate: a very smart man from whom I have learned an enormous amount, and well deserving of his Nobel Prize. But…

He said … we were going to build a rigorous, microfounded model of the demand for money: We would assume that everyone lived for two periods, worked in the first period when they were young and sold what they produced to the old, held money as they aged, and then when they were old used their money to buy the goods newly produced by the new generation of young. Tom called this “microfoundations” and thought it gave powerful insights into the demand for money that you could not get from money-in-the-utility-function models.

I thought that it was a just-so story, and that whatever insights it purchased for you were probably not things you really wanted to buy. I thought it was dangerous to presume that you understood something because you had “microfoundations” when those microfoundations were wrong. After all, Ptolemaic astronomy had microfoundations: Mercury moved more rapidly than Saturn because the Angel of Mercury beat his wings more rapidly than the Angel of Saturn and because Mercury was lighter than Saturn…

Brad DeLong

Brad DeLong is of course absolutely right here, and one could only wish that other mainstream economists would listen to him …

Oxford macroeconomist Simon Wren-Lewis elaborates in a blog post on why he thinks the New Classical Counterrevolution was so successful in replacing older theories, despite the fact that the New Classical models were not able to explain what happened to output and inflation in the 1970s and 1980s:

The new theoretical ideas New Classical economists brought to the table were impressive, particularly to those just schooled in graduate micro. Rational expectations is the clearest example …

However, once the basics of New Keynesian theory had been established, it was quite possible to incorporate concepts like rational expectations or Ricardian Equivalence into a traditional structural econometric model (SEM) …

The real problem with any attempt at synthesis is that a SEM is always going to be vulnerable to the key criticism in Lucas and Sargent, 1979: without a completely consistent microfounded theoretical base, there was the near certainty of inconsistency brought about by inappropriate identification restrictions …

So why does this matter? … If mainstream academic macroeconomists were seduced by anything, it was a methodology – a way of doing the subject which appeared closer to what at least some of their microeconomic colleagues were doing at the time, and which was very different to the methodology of macroeconomics before the New Classical Counterrevolution. The old methodology was eclectic and messy, juggling the competing claims of data and theory. The new methodology was rigorous!

Unlike Brad DeLong, Wren-Lewis seems to be impressed by the ‘rigour’ brought to macroeconomics by the New Classical counterrevolution and its rational expectations, microfoundations and ‘Lucas Critique’.

It is difficult to see why.

Wren-Lewis’s ‘portrayal’ of rational expectations is not as innocent as it may look. Rational expectations in the neoclassical economists’ world implies that relevant distributions have to be time independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality it is straining one’s beliefs to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons. In the world of the rational expectations hypothesis we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that’s also the reason why allowing for cases where agents ‘make predictable errors’ in the ‘New Keynesian’ models doesn’t take us any closer to a relevant and realist depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations than childish roulette and urn analogies.
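The point that an economy is a single realization rather than one draw from an ensemble can be made concrete with a standard non-ergodic toy process (my example, not drawn from the text): a multiplicative gamble whose ensemble average grows even though almost every individual time path shrinks.

```python
import random

# Each period, wealth is multiplied by 1.5 or 0.6 with equal probability.
# Ensemble mean factor per period: (1.5 + 0.6) / 2 = 1.05 > 1, so the
# "average over many worlds" grows. But the time-average growth factor is
# sqrt(1.5 * 0.6) ~= 0.949 < 1, so almost every single realization decays.
ensemble_mean_factor = (1.5 + 0.6) / 2
time_avg_factor = (1.5 * 0.6) ** 0.5

random.seed(0)  # fixed seed for reproducibility

def one_path(periods: int = 1000) -> float:
    w = 1.0
    for _ in range(periods):
        w *= 1.5 if random.random() < 0.5 else 0.6
    return w

# Count how many of 200 simulated "economies" end up below their start.
shrunk = sum(one_path() < 1.0 for _ in range(200))
print(ensemble_mean_factor, round(time_avg_factor, 3), shrunk)
```

The ensemble statistics are well-behaved, yet they describe no single history — which is precisely why treating one actual economy as a draw from a known stochastic ensemble is so strained.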

‘Rigorous’ and ‘precise’ New Classical models — and that goes for the ‘New Keynesian’ variety too — cannot be considered anything other than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence whatsoever has been presented.


No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, it does not push economic science forward a single millimeter if it does not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not per se say anything about real-world economies.

Proving things ‘rigorously’ in mathematical models is at most a starting-point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.

The real tail wagging

20 Apr, 2016 at 19:32 | Posted in Economics | 3 Comments

Keynes’s intellectual revolution was to shift economists from thinking normally in terms of a model of reality in which a dog called savings wagged his tail labelled investment to thinking in terms of a model in which a dog called investment wagged his tail labelled savings.

James Meade

To my sons and daughters (personal)

19 Apr, 2016 at 14:55 | Posted in Economics, Varia | Comments Off on To my sons and daughters (personal)


At the end of every principle is a promise

This one is for you — David, Tora, Linnea, Amanda, and Sebastian

Pitfalls of meta-analysis

19 Apr, 2016 at 10:28 | Posted in Statistics & Econometrics | 1 Comment

Including all relevant material – good, bad, and indifferent – in meta-analysis admits the subjective judgments that meta-analysis was designed to avoid. Several problems arise in meta-analysis: regressions are often non-linear; effects are often multivariate rather than univariate; coverage can be restricted; bad studies may be included; the data summarised may not be homogeneous; grouping different causal factors may lead to meaningless estimates of effects; and the theory-directed approach may obscure discrepancies. Meta-analysis may not be the one best method for studying the diversity of fields for which it has been used …


Glass and Smith carried out a meta-analysis of research on class size and achievement and concluded that “a clear and strong relationship between class size and achievement has emerged.” The study was done and analysed well; it might almost be cited as an example of what meta-analysis can do. Yet the conclusion is very misleading, as is the estimate of effect size it presents: “between class-size of 40 pupils and one pupil lie more than 30 percentile ranks of achievement.” Such estimates imply a linear regression, yet the regression is extremely curvilinear, as one of the authors’ figures shows: between class sizes of 20 and 40 there is absolutely no difference in achievement; it is only with unusually small classes that there seems to be an effect. For a teacher the major result is that for 90% of all classes the number of pupils makes no difference at all to their achievement. The conclusions drawn by the authors from their meta-analysis are formally correct, but they are statistically meaningless and particularly misleading. No estimate of effect size is meaningful unless regressions are linear, yet such linearity is seldom investigated, or, if not present, taken seriously.

H J Eysenck

Systematic reviews are extremely important to undertake in our search for robust evidence and explanations — simply averaging data from different populations, places, and contexts is not.
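Eysenck’s class-size example — a regression that is flat over most of the range, with an effect only for very small classes — can be mimicked with invented numbers to show how a single linear effect-size estimate misleads:

```python
# Hypothetical achievement scores: flat from class size 20 upward, rising
# only for unusually small classes (mimicking the curvilinear pattern
# Eysenck describes; all numbers are invented for illustration).
def achievement(class_size: int) -> float:
    if class_size >= 20:
        return 60.0                       # no effect at all in this range
    return 60.0 + 2.0 * (20 - class_size)  # gains only for very small classes

def ols_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

sizes = list(range(1, 41))
scores = [achievement(s) for s in sizes]

overall = ols_slope(sizes, scores)           # pooled linear "effect size"
flat = ols_slope(sizes[19:], scores[19:])    # sizes 20..40: truly zero effect

print(round(overall, 3), flat)  # a nonzero pooled slope, despite flat == 0.0
```

The pooled regression reports a (negative) "class-size effect" even though, over the range covering most real classrooms, the effect is exactly zero — which is Eysenck's complaint about linear effect-size summaries of curvilinear data.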

Physics and economics

17 Apr, 2016 at 12:15 | Posted in Economics | 5 Comments

From the times of Galileo and Newton, physicists have learned not to confuse what is happening in the model with what instead is happening in reality. Physical models are compared with observations to prove if they are able to provide precise explanations … Can one argue that the use of mathematics in neoclassical economics serves similar purposes? … Gillies’s conclusion is that, while in physics mathematics was used to obtain precise explanations and successful predictions, one cannot draw the same conclusion about the use of mathematics in neoclassical economics in the last half century. This analysis reinforces the conclusion about the pseudo-scientific nature of neoclassical economics … given the systematic failure of predictions of neoclassical economics.

Francesco Sylos Labini is a researcher in physics. His book is highly recommended reading for anyone with an interest in understanding the pseudo-scientific character of modern mainstream economics. Turning economics into a ‘pseudo-natural-science’ is — as Keynes made very clear in a letter to Roy Harrod back in 1938 — something that has to be firmly ‘repelled.’

Robert Lucas’s forecasting disaster

16 Apr, 2016 at 20:20 | Posted in Economics | 5 Comments

In Milton Friedman’s infamous essay The Methodology of Positive Economics (1953) it was argued that the realism or ‘truth’ of a theory’s assumptions isn’t important. The only thing that really matters is how good the predictions made by the theory are.

Please feel free to apply that science norm to the following statement by Robert Lucas in the Wall Street Journal, September 19, 2007:

I am skeptical about the argument that the subprime mortgage problem will contaminate the whole mortgage market, that housing construction will come to a halt, and that the economy will slip into a recession. Every step in this chain is questionable and none has been quantified. If we have learned anything from the past 20 years it is that there is a lot of stability built into the real economy.

Robert Lucas — a lousy pseudo-scientist and an even worse forecaster!

Robert Lucas’s pseudo-science

16 Apr, 2016 at 09:54 | Posted in Economics | Comments Off on Robert Lucas’s pseudo-science

The construction of theoretical models is our way to bring order to the way we think about the world, but the process necessarily involves ignoring some evidence or alternative theories – setting them aside. That can be hard to do – facts are facts – and sometimes my unconscious mind carries out the abstraction for me: I simply fail to see some of the data or some alternative theory.

Robert Lucas

And that guy even got a ‘Nobel prize’ in economics …

Economists — math-heavy astrologers

16 Apr, 2016 at 00:13 | Posted in Economics | Comments Off on Economists — math-heavy astrologers

Ultimately, the problem isn’t with worshipping models of the stars, but rather with uncritical worship of the language used to model them, and nowhere is this more prevalent than in economics. The economist Paul Romer at New York University has recently begun calling attention to an issue he dubs ‘mathiness’ – first in the paper ‘Mathiness in the Theory of Economic Growth’ (2015) and then in a series of blog posts. Romer believes that macroeconomics, plagued by mathiness, is failing to progress as a true science should, and compares debates among economists to those between 16th-century advocates of heliocentrism and geocentrism. Mathematics, he acknowledges, can help economists to clarify their thinking and reasoning. But the ubiquity of mathematical theory in economics also has serious downsides: it creates a high barrier to entry for those who want to participate in the professional dialogue, and makes checking someone’s work excessively laborious. Worst of all, it imbues economic theory with unearned empirical authority.

‘I’ve come to the position that there should be a stronger bias against the use of math,’ Romer explained to me. ‘If somebody came and said: “Look, I have this Earth-changing insight about economics, but the only way I can express it is by making use of the quirks of the Latin language”, we’d say go to hell, unless they could convince us it was really essential. The burden of proof is on them.’

Right now, however, there is widespread bias in favour of using mathematics. The success of math-heavy disciplines such as physics and chemistry has granted mathematical formulas with decisive authoritative force. Lord Kelvin, the 19th-century mathematical physicist, expressed this quantitative obsession:

“When you can measure what you are speaking about and express it in numbers you know something about it; but when you cannot measure it… in numbers, your knowledge is of a meagre and unsatisfactory kind.”

The trouble with Kelvin’s statement is that measurement and mathematics do not guarantee the status of science – they guarantee only the semblance of science. When the presumptions or conclusions of a scientific theory are absurd or simply false, the theory ought to be questioned and, eventually, rejected. The discipline of economics, however, is presently so blinkered by the talismanic authority of mathematics that theories go overvalued and unchecked.

Alan Jay Levinovitz
