Krugman’s modeling flim flam

30 April, 2016 at 11:15 | Posted in Economics | 3 Comments

Paul Krugman has a piece up on his blog this week arguing that the ‘discipline of modeling’ is a sine qua non for tackling politically and emotionally charged economic issues:

You might say that the way to go about research is to approach issues with a pure heart and mind: seek the truth, and derive any policy conclusions afterwards. But that, I suspect, is rarely how things work. After all, the reason you study an issue at all is usually that you care about it, that there’s something you want to achieve or see happen. Motivation is always there; the trick is to do all you can to avoid motivated reasoning that validates what you want to hear.

In my experience, modeling is a helpful tool (among others) in avoiding that trap, in being self-aware when you’re starting to let your desired conclusions dictate your analysis. Why? Because when you try to write down a model, it often seems to lead some place you weren’t expecting or wanting to go. And if you catch yourself fiddling with the model to get something else out of it, that should set off a little alarm in your brain.

Hmm …

So when Krugman and other ‘modern’ mainstream economists use their models — standardly assuming rational expectations, Walrasian market clearing, unique equilibria, time invariance, linear separability and homogeneity of both inputs/outputs and technology, infinitely lived intertemporally optimizing representative agents with homothetic and identical preferences, etc. — and standardly ignoring complexity, diversity, uncertainty, coordination problems, non-market clearing prices, real aggregation problems, emergence, expectations formation, etc. — we are supposed to believe that this somehow helps them ‘to avoid motivated reasoning that validates what you want to hear.’

Yours truly is, to say the least, far from convinced. The alarm that goes off in my brain is that this, rather than being helpful for understanding real-world economic issues, sounds more like an ill-advised plaidoyer for voluntarily strapping on a methodological straitjacket of unsubstantiated and known-to-be-false assumptions.


‘New Keynesian’ schizophrenia

29 April, 2016 at 20:57 | Posted in Economics | Comments Off on ‘New Keynesian’ schizophrenia

Taking part in the debate on microfoundations among macroeconomists these days, I wonder if Heinz-Peter Spahn isn’t more on the right track than those who desperately offer more or less contrived defenses of the microfoundationalist programme:

The crucial point however is: market conditions, which are presupposed in the model of intertemporal choice, are not given in reality. Distributing consumption optimally over time depends on the possibility for individuals to borrow against their permanent income, if temporary periods of low market income are to be bridged. Because this perfect financial market does not exist, consumption behaviour necessarily depends strongly on current income. Consumers know that their future expected income is distorted by spells of unemployment, the occurrence of which is hard to predict though; these quantity constraints are important also for firms …

Professional modern economics appears to suffer from schizophrenia: in the field of financial-market economics all these deviations from the Utopian ideal market are well known (information asymmetries etc.), yet they are stubbornly ignored when it comes to talking about macroeconomics in NKM [New Keynesian Macroeconomics]. The assumption of complete markets means that all agents’ intertemporal budget constraints are always satisfied; bankruptcies and insolvencies are impossible. The NKM world is populated by agents who never default … Basically, NKM designs a non-monetary economy … Questions regarding financial instability cannot be answered within these models; they cannot even be asked …

NKM faces an uncomfortable trade-off. On the one hand, General Equilibrium Theory has shown that preferences and behaviour of heterogeneous agents cannot simply be aggregated. Variances between individuals matter! The Sonnenschein-Mantel-Debreu problem states that choices may not be transitive; the representative agent’s ranking differs from individual rankings; reactions to shock may be different … On the other hand, if people are assumed to be identical, NKM may keep the representative agent, but as a consequence the model has no interaction of agents, no distribution problems, no asymmetric information and no meaningful stock market.

The critique so far may appear unfair, as it neglects the various refinements that were proposed in order to develop and improve the basic model set-up … But these extensions of NKM – due to the Walrasian method – yield many precise-looking results … but do not grasp the impact of bank credit on goods demand, market income and employment in a typical monetary economy.

Heinz-Peter Spahn

The money multiplier – neat, plausible, and utterly wrong

28 April, 2016 at 18:02 | Posted in Economics | 9 Comments

The neoclassical textbook concept of the money multiplier assumes that banks automatically expand the credit money supply to a multiple of their aggregate reserves. If the required reserve ratio is 5%, the money supply should be about twenty times the aggregate reserves of banks. In this way the money multiplier concept assumes that the central bank controls the money supply by setting the required reserve ratio.

In his Macroeconomics – just to take an example – Greg Mankiw writes:

We can now see that the money supply is proportional to the monetary base. The factor of proportionality … is called the money multiplier … Each dollar of the monetary base produces m dollars of money. Because the monetary base has a multiplied effect on the money supply, the monetary base is called high-powered money.

The money multiplier concept is – as can be seen from the quote above – nothing but one big fallacy. This is not the way credit is created in a monetary economy. It’s nothing but a monetary myth that the monetary base can play such a decisive role in a modern credit-run economy with fiat money.

In the real world banks first extend credit and then look for reserves. So the money multiplier also gets the causation backwards. At a deep fundamental level the supply of money is endogenous.
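The textbook arithmetic being criticised here can be made explicit. A minimal sketch (illustrative numbers only, not a claim about how banks actually work): with a required reserve ratio r, each round of re-lent and re-deposited money shrinks by the factor (1 − r), so total deposits in the textbook story converge to the geometric-series limit of 1/r times the initial reserve injection.

```python
# The textbook money-multiplier story (the one the post rejects):
# an initial reserve injection is lent out, re-deposited, lent again, ...
# Each round keeps fraction r as reserves, so deposits follow a
# geometric series that sums to injection / r.

def textbook_deposit_expansion(injection, reserve_ratio, rounds=1000):
    """Iterate the loan/re-deposit rounds of the textbook story."""
    total_deposits = 0.0
    deposit = injection
    for _ in range(rounds):
        total_deposits += deposit
        deposit *= (1.0 - reserve_ratio)  # next round's re-deposited loan
    return total_deposits

r = 0.05                 # 5% required reserve ratio
m = 1.0 / r              # closed-form multiplier: 20
expanded = textbook_deposit_expansion(100.0, r)
print(m, expanded)       # 20.0 and ~2000.0, i.e. 20 x the 100 injection
```

The point of the post, of course, is that this arithmetic describes the textbooks, not the banks: in practice the causation runs from loans to reserves, not the other way round.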

One may rightly wonder why on earth this pet neoclassical fairy tale is still in the textbooks and taught to economics undergraduates. Giving the impression that banks exist simply to passively transfer savings into investment, it is such a gross misrepresentation of what goes on in the real world that there is only one place for it — and that is in the garbage can!

Economic rebellion

24 April, 2016 at 08:35 | Posted in Economics | 8 Comments


Listen to the program here.

Mainstream economists like Paul Krugman and Simon Wren-Lewis think that yours truly and other heterodox economists are wrong in blaming mainstream economics for not being real-world relevant and pluralist. To Krugman there is nothing wrong with ‘standard theory’ and ‘economics textbooks.’ If only policy makers and economists stuck to ‘standard economic analysis,’ everything would be just fine.

I’ll be dipped! If there’s anything the last decade has shown us, it is that economists have gone astray in their tool shed. Krugman’s ‘standard theory’ — mainstream neoclassical economics — has contributed to causing today’s economic crisis rather than to solving it.

A small ray of hope

23 April, 2016 at 16:14 | Posted in Economics | 3 Comments


I overheard a conversation between two high school students this morning.

The first person was asking about which classes the second was going to take next. One of those mentioned was microeconomics.

“Oh, that’s easy” said the first, “You just have to remember that it’s all rubbish – they want you to believe that people are rational, and that there’s all this perfection in the world.”

“Really?” responded the second, “That’s really dumb. I wonder why they do that?”

“It doesn’t matter, it’s economics”

“Well maybe I’ll take history instead, at least I might learn something useful.”

Peter Radford

Long run demand effects

23 April, 2016 at 12:44 | Posted in Economics | 1 Comment

In the standard mainstream economic analysis — take a quick look in e.g. Mankiw’s or Krugman’s textbooks — a demand expansion may very well raise measured productivity in the short run. But in the long run, expansionary demand policy measures cannot lead to sustained higher productivity and output levels.

In some non-standard heterodox analyses, however, labour productivity growth is often described as a function of output growth. The rate of technical progress varies directly with the rate of growth according to the Verdoorn law. Growth and productivity are, in this view, highly demand-determined, not only in the short run but also in the long run.

Given that the Verdoorn law is operative, expansionary economic policies actually may lead to increases in productivity and growth. Living in a world permeated by genuine Keynes-type uncertainty, we cannot, of course, forecast with any precision how great those effects would be.

So, the nodal point is — has the Verdoorn Law been validated or not in empirical studies?

There have been hundreds of studies that have tried to answer that question, and, as might be expected, the answers differ. The law has been investigated with different econometric methods (time-series, IV, OLS, ECM, cointegration, etc.). The statistical and econometric problems are enormous (especially when it comes to the direction of causality). Even so, most country-level studies do confirm that the Verdoorn law holds.
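For concreteness, the simplest specification usually tested regresses productivity growth p on output growth q, p = a + b·q, where b is the ‘Verdoorn coefficient’ (typically estimated at around 0.5 in the literature). A minimal sketch on synthetic country-level data, with assumed coefficient values chosen purely for illustration:

```python
# Minimal OLS estimate of the basic Verdoorn specification
#   p_i = a + b * q_i + e_i   (p: productivity growth, q: output growth)
# on synthetic data -- illustrative only, not real country estimates.
import random

random.seed(1)
a_true, b_true = 1.0, 0.5                          # assumed 'true' values
q = [random.uniform(0.0, 8.0) for _ in range(50)]  # output growth rates, %
p = [a_true + b_true * qi + random.gauss(0, 0.3) for qi in q]

n = len(q)
q_bar = sum(q) / n
p_bar = sum(p) / n
b_hat = sum((qi - q_bar) * (pi - p_bar) for qi, pi in zip(q, p)) \
        / sum((qi - q_bar) ** 2 for qi in q)
a_hat = p_bar - b_hat * q_bar
print(a_hat, b_hat)   # intercept near 1.0, Verdoorn coefficient near 0.5
```

The real-world studies mentioned above are of course far more involved (instrumenting for reverse causality, error-correction dynamics, and so on); this only shows the shape of the baseline regression.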

Conclusion: demand policy measures may have long run effects.

Axel Leijonhufvud on the road not taken

23 April, 2016 at 09:08 | Posted in Economics | Comments Off on Axel Leijonhufvud on the road not taken

The orthodox Keynesianism of the time did have a theoretical explanation for recessions and depressions. Proponents saw the economy as a self-regulating machine in which individual decisions typically lead to a situation of full employment and healthy growth. The primary reason for periods of recession and depression was that wages did not fall quickly enough. If wages could fall rapidly and extensively enough, then the economy would absorb the unemployed. Orthodox Keynesians also took Keynes’s approach to monetary economics to be similar to that of the classical economists.

Leijonhufvud got something entirely different from reading the General Theory. The more he looked at his footnotes, originally written in puzzlement at the disparity between what he took to be the Keynesian message and the orthodox Keynesianism of his time, the more confident he felt. The implications were amazing. Had the whole discipline catastrophically misunderstood Keynes’s deeply revolutionary ideas? Was the dominant economics paradigm deeply flawed and a fatally wrong turn in macroeconomic thinking? And if this was the case, what was Keynes actually proposing?

Leijonhufvud’s “Keynesian Economics and the Economics of Keynes” exploded onto the academic stage the following year; no mean feat for an economics book that did not contain a single equation. The book took no prisoners and aimed squarely at the prevailing metaphor about the self-regulating economy and the economics of the orthodoxy. He forcefully argued that the free movement of wages and prices can sometimes be destabilizing and could move the economy away from full employment.

Arjun Jayadev

A must-read (not least because of the interview videos where Leijonhufvud gets the opportunity to comment on the ‘madness’ of modern mainstream macroeconomics)!

If macroeconomic models — no matter of what ilk — build on microfoundational assumptions of representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, there is obviously no warrant for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged over to the real world. Trying to represent real-world target systems with models flagrantly at odds with reality is futile. And whether those models are New Classical or ‘New Keynesian’ makes very little difference.

So, indeed, there really is something about the way macroeconomists construct their models nowadays that obviously doesn’t sit right.

Fortunately — when you’ve got tired of the kind of macroeconomic apologetics produced by ‘New Keynesian’ macroeconomists — there are still some real Keynesian macroeconomists to read. One of them is Axel Leijonhufvud.

Macroeconomic just-so stories you really do not want to buy

22 April, 2016 at 17:30 | Posted in Economics | Comments Off on Macroeconomic just-so stories you really do not want to buy

Thus your standard New Keynesian model will use Calvo pricing and model the current inflation rate as tightly coupled to the present value of expected future output gaps. Is this a requirement anyone really wants to put on the model intended to help us understand the world that actually exists out there? Thus your standard New Keynesian model will calculate the expected path of consumption as the solution to some Euler equation plus an intertemporal budget constraint, with current wealth and the projected real interest rate path as the only factors that matter. This is fine if you want to demonstrate that the model can produce macroeconomic pathologies. But is it a not-stupid thing to do if you want your model to fit reality?

I remember attending the first lecture in Tom Sargent’s evening macroeconomics class back when I was an undergraduate: a very smart man from whom I have learned an enormous amount, and well deserving of his Nobel Prize. But…

He said … we were going to build a rigorous, microfounded model of the demand for money: We would assume that everyone lived for two periods, worked in the first period when they were young and sold what they produced to the old, held money as they aged, and then when they were old used their money to buy the goods newly produced by the new generation of young. Tom called this “microfoundations” and thought it gave powerful insights into the demand for money that you could not get from money-in-the-utility-function models.

I thought that it was a just-so story, and that whatever insights it purchased for you were probably not things you really wanted to buy. I thought it was dangerous to presume that you understood something because you had “microfoundations” when those microfoundations were wrong. After all, Ptolemaic astronomy had microfoundations: Mercury moved more rapidly than Saturn because the Angel of Mercury beat his wings more rapidly than the Angel of Saturn and because Mercury was lighter than Saturn…

Brad DeLong

Brad DeLong is of course absolutely right here, and one could only wish that other mainstream economists would listen to him …

Oxford macroeconomist Simon Wren-Lewis elaborates in a post on his blog on why he thinks the New Classical Counterrevolution was so successful in replacing older theories, despite the fact that the New Classical models were not able to explain what happened to output and inflation in the 1970s and 1980s:

The new theoretical ideas New Classical economists brought to the table were impressive, particularly to those just schooled in graduate micro. Rational expectations is the clearest example …

However, once the basics of New Keynesian theory had been established, it was quite possible to incorporate concepts like rational expectations or Ricardian Equivalence into a traditional structural econometric model (SEM) …

The real problem with any attempt at synthesis is that a SEM is always going to be vulnerable to the key criticism in Lucas and Sargent, 1979: without a completely consistent microfounded theoretical base, there was the near certainty of inconsistency brought about by inappropriate identification restrictions …

So why does this matter? … If mainstream academic macroeconomists were seduced by anything, it was a methodology – a way of doing the subject which appeared closer to what at least some of their microeconomic colleagues were doing at the time, and which was very different to the methodology of macroeconomics before the New Classical Counterrevolution. The old methodology was eclectic and messy, juggling the competing claims of data and theory. The new methodology was rigorous!

Unlike Brad DeLong, Wren-Lewis seems to be impressed by the ‘rigour’ brought to macroeconomics by the New Classical counterrevolution and its rational expectations, microfoundations and ‘Lucas Critique’.

It is difficult to see why.

Wren-Lewis’s ‘portrayal’ of rational expectations is not as innocent as it may look. Rational expectations in the neoclassical economists’ world implies that relevant distributions have to be time independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality, it strains belief to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons.

In the world of the rational expectations hypothesis, we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that is also why allowing for cases where agents ‘make predictable errors’ in the ‘New Keynesian’ models does not take us any closer to a relevant and realistic depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say about real economies, financial crises and the decisions and choices real people make, we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations than childish roulette and urn analogies.
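Stated formally (this is the standard textbook formulation, not notation from this post): the rational expectations hypothesis equates agents’ subjective expectation of a variable with the objective conditional expectation generated by the model itself,

```latex
x^{e}_{t+1} = E\left[\, x_{t+1} \mid \Omega_{t} \,\right]
```

where \(\Omega_t\) is the information set available at time \(t\). The criticism above is that this identity presupposes a known, time-invariant distribution generating \(x\), which is exactly what a historically unique, non-repeatable economy fails to supply.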

‘Rigorous’ and ‘precise’ New Classical models — and that goes for the ‘New Keynesian’ variety too — cannot be considered anything other than unsubstantiated conjectures as long as they are not supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence has ever been presented.


No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, it does not push economic science forward a single millimeter if it does not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not per se say anything about real-world economies.

Proving things ‘rigorously’ in mathematical models is at most a starting-point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.

The real tail wagging

20 April, 2016 at 19:32 | Posted in Economics | 3 Comments

Keynes’s intellectual revolution was to shift economists from thinking normally in terms of a model of reality in which a dog called savings wagged his tail labelled investment to thinking in terms of a model in which a dog called investment wagged his tail labelled savings.

James Meade

To my sons and daughters (personal)

19 April, 2016 at 14:55 | Posted in Economics, Varia | Comments Off on To my sons and daughters (personal)

 

At the end of every principle is a promise

This one is for you — David, Tora, Linnea, Amanda, and Sebastian

Pitfalls of meta-analysis

19 April, 2016 at 10:28 | Posted in Statistics & Econometrics | 1 Comment

Including all relevant material – good, bad, and indifferent – in meta-analysis admits the subjective judgments that meta-analysis was designed to avoid. Several problems arise in meta-analysis: regressions are often non-linear; effects are often multivariate rather than univariate; coverage can be restricted; bad studies may be included; the data summarised may not be homogeneous; grouping different causal factors may lead to meaningless estimates of effects; and the theory-directed approach may obscure discrepancies. Meta-analysis may not be the one best method for studying the diversity of fields for which it has been used …


Glass and Smith carried out a meta-analysis of research on class size and achievement and concluded that “a clear and strong relationship between class size and achievement has emerged.” The study was done and analysed well; it might almost be cited as an example of what meta-analysis can do. Yet the conclusion is very misleading, as is the estimate of effect size it presents: “between class-size of 40 pupils and one pupil lie more than 30 percentile ranks of achievement.” Such estimates imply a linear regression, yet the regression is extremely curvilinear, as one of the authors’ figures shows: between class sizes of 20 and 40 there is absolutely no difference in achievement; it is only with unusually small classes that there seems to be an effect. For a teacher the major result is that for 90% of all classes the number of pupils makes no difference at all to their achievement. The conclusions drawn by the authors from their meta-analysis are formally correct, but they are statistically meaningless and particularly misleading. No estimate of effect size is meaningful unless regressions are linear, yet such linearity is seldom investigated, or, if not present, taken seriously.

H J Eysenck

Systematic reviews are extremely important to undertake in our search for robust evidence and explanations — simply averaging data from different populations, places, and contexts is not.
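Eysenck’s class-size example is easy to reproduce. A sketch with made-up numbers shaped like the pattern he describes (flat achievement between class sizes of 20 and 40, a benefit only for very small classes): a single linear ‘effect size’ fitted to such data reports a sizeable negative slope, even though over the range containing almost all real classes the true effect is zero.

```python
# Synthetic class-size data mimicking the curvilinear pattern Eysenck
# describes: no effect for sizes 20-40, a benefit only in tiny classes.
def achievement(size):
    return 70.0 if size >= 20 else 70.0 + 1.0 * (20 - size)

sizes = list(range(1, 41))
scores = [achievement(s) for s in sizes]

# Ordinary least-squares slope of the (misleading) linear fit
n = len(sizes)
sx = sum(sizes) / n
sy = sum(scores) / n
slope = sum((x - sx) * (y - sy) for x, y in zip(sizes, scores)) \
        / sum((x - sx) ** 2 for x in sizes)

print(round(slope, 2))                    # about -0.48 "points per pupil"
print(achievement(40) - achievement(20))  # yet 0.0 across sizes 20 to 40
```

The linear summary is arithmetically correct and substantively meaningless, which is exactly the trap the quoted passage warns against.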

Physics and economics

17 April, 2016 at 12:15 | Posted in Economics | 5 Comments

From the times of Galileo and Newton, physicists have learned not to confuse what is happening in the model with what instead is happening in reality. Physical models are compared with observations to test whether they are able to provide precise explanations … Can one argue that the use of mathematics in neoclassical economics serves similar purposes? … Gillies’s conclusion is that, while in physics mathematics was used to obtain precise explanations and successful predictions, one cannot draw the same conclusion about the use of mathematics in neoclassical economics in the last half century. This analysis reinforces the conclusion about the pseudo-scientific nature of neoclassical economics … given the systematic failure of predictions of neoclassical economics.

Francesco Sylos Labini is a researcher in physics. His book is highly recommended reading for anyone with an interest in understanding the pseudo-scientific character of modern mainstream economics. Turning economics into a ‘pseudo-natural-science’ is — as Keynes made very clear in a letter to Roy Harrod back in 1938 — something that has to be firmly ‘repelled.’

Robert Lucas’s forecasting disaster

16 April, 2016 at 20:20 | Posted in Economics | 5 Comments

In Milton Friedman’s infamous essay The Methodology of Positive Economics (1953), it was argued that the realism or ‘truth’ of a theory’s assumptions is unimportant. The only thing that really matters is how good the predictions made by the theory are.

Please feel free to apply that science norm to the following statement by Robert Lucas in Wall Street Journal, September 19, 2007:

I am skeptical about the argument that the subprime mortgage problem will contaminate the whole mortgage market, that housing construction will come to a halt, and that the economy will slip into a recession. Every step in this chain is questionable and none has been quantified. If we have learned anything from the past 20 years it is that there is a lot of stability built into the real economy.
 
 

Robert Lucas — a lousy pseudo-scientist and an even worse forecaster!

Robert Lucas’s pseudo-science

16 April, 2016 at 09:54 | Posted in Economics | Comments Off on Robert Lucas’s pseudo-science

The construction of theoretical models is our way to bring order to the way we think about the world, but the process necessarily involves ignoring some evidence or alternative theories – setting them aside. That can be hard to do – facts are facts – and sometimes my unconscious mind carries out the abstraction for me: I simply fail to see some of the data or some alternative theory.

Robert Lucas

And that guy even got a ‘Nobel prize’ in economics …

Economists — math-heavy astrologers

16 April, 2016 at 00:13 | Posted in Economics | Comments Off on Economists — math-heavy astrologers

Ultimately, the problem isn’t with worshipping models of the stars, but rather with uncritical worship of the language used to model them, and nowhere is this more prevalent than in economics. The economist Paul Romer at New York University has recently begun calling attention to an issue he dubs ‘mathiness’ – first in the paper ‘Mathiness in the Theory of Economic Growth’ (2015) and then in a series of blog posts. Romer believes that macroeconomics, plagued by mathiness, is failing to progress as a true science should, and compares debates among economists to those between 16th-century advocates of heliocentrism and geocentrism. Mathematics, he acknowledges, can help economists to clarify their thinking and reasoning. But the ubiquity of mathematical theory in economics also has serious downsides: it creates a high barrier to entry for those who want to participate in the professional dialogue, and makes checking someone’s work excessively laborious. Worst of all, it imbues economic theory with unearned empirical authority.

‘I’ve come to the position that there should be a stronger bias against the use of math,’ Romer explained to me. ‘If somebody came and said: “Look, I have this Earth-changing insight about economics, but the only way I can express it is by making use of the quirks of the Latin language”, we’d say go to hell, unless they could convince us it was really essential. The burden of proof is on them.’

Right now, however, there is widespread bias in favour of using mathematics. The success of math-heavy disciplines such as physics and chemistry has granted mathematical formulas with decisive authoritative force. Lord Kelvin, the 19th-century mathematical physicist, expressed this quantitative obsession:

“When you can measure what you are speaking about and express it in numbers you know something about it; but when you cannot measure it… in numbers, your knowledge is of a meagre and unsatisfactory kind.”

The trouble with Kelvin’s statement is that measurement and mathematics do not guarantee the status of science – they guarantee only the semblance of science. When the presumptions or conclusions of a scientific theory are absurd or simply false, the theory ought to be questioned and, eventually, rejected. The discipline of economics, however, is presently so blinkered by the talismanic authority of mathematics that theories go overvalued and unchecked.

Alan Jay Levinovitz

You’re so vain (personal)

15 April, 2016 at 19:18 | Posted in Varia | Comments Off on You’re so vain (personal)

 

På silverfat (personal)

15 April, 2016 at 18:57 | Posted in Varia | Comments Off on På silverfat (personal)


I tortured my dear father Uno with this song for a whole spring forty years ago, when I insisted that it be played on the car stereo every morning on the way to work and school. I was overjoyed — and he wondered how in the hell anyone could come up with something as monumentally silly as ‘I offer you my body on a SILVER PLATTER.’

Oh, if only one had a time machine …

Reinhard Sippel’s modern classic

14 April, 2016 at 17:19 | Posted in Economics | Comments Off on Reinhard Sippel’s modern classic

The experiment reported here was designed to reflect the fact that revealed preference theory is concerned with hypothetical choices rather than actual choices over time. In contrast to earlier experimental studies, the possibility that the different choices are made under different preference patterns can almost be ruled out. We find a considerable number of violations of the revealed preference axioms, which contradicts the neoclassical theory of the consumer maximising utility subject to a given budget constraint. We should therefore pay closer attention to the limits of this theory as a description of how people actually behave, i.e. as a positive theory of consumer behaviour. Recognising these limits, we economists should perhaps be a little more modest in our ‘imperialist ambitions’ of explaining non-market behaviour by economic principles.

Reinhard Sippel 

Sippel’s experiment showed considerable violations of the revealed preference axioms, and that from a descriptive point of view — as a theory of consumer behaviour — revealed preference theory is of very limited value.

The neoclassical theory of consumer behaviour has been developed in great part as an attempt to justify the idea of a downward-sloping demand curve. Forerunners like Cournot (1838) and Cassel (1899) merely asserted this law of demand. The utility theorists tried to deduce it from axioms and postulates on individuals’ economic behaviour. Revealed preference theory — in the hands of Paul Samuelson and Hendrik Houthakker — tried to build a new theory and to put it in operational terms, but ended up giving a theory logically equivalent to the old one. As such, it also shares the old theory’s shortcomings: it is empirically unfalsifiable and rests on unrestricted universal statements.
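The kind of test Sippel ran is mechanical enough to sketch. Under the weak axiom of revealed preference (WARP), if bundle x_i was chosen when x_j was affordable (p_i · x_j ≤ p_i · x_i), then x_j must never be chosen in a situation where x_i is affordable. A minimal checker over observed price/choice pairs follows; the data are toy numbers, not Sippel’s, and the pairwise check below is WARP only, not the full set of axioms his experiment tested.

```python
# Pairwise WARP check over observed (prices, chosen bundle) data.
# x_i is directly revealed preferred to x_j when p_i . x_j <= p_i . x_i.
# A WARP violation: x_i revealed preferred to x_j AND x_j to x_i (i != j).

def dot(p, x):
    return sum(pi * xi for pi, xi in zip(p, x))

def warp_violations(prices, choices):
    """Return index pairs (i, j) that jointly violate WARP."""
    violations = []
    n = len(prices)
    for i in range(n):
        for j in range(i + 1, n):
            i_over_j = dot(prices[i], choices[j]) <= dot(prices[i], choices[i])
            j_over_i = dot(prices[j], choices[i]) <= dot(prices[j], choices[j])
            if i_over_j and j_over_i and choices[i] != choices[j]:
                violations.append((i, j))
    return violations

# Toy data: two goods, two observations that contradict each other.
prices  = [(1.0, 2.0), (2.0, 1.0)]
choices = [(2.0, 2.0), (4.0, 1.0)]
print(warp_violations(prices, choices))   # [(0, 1)]
```

Real consistency tests of this kind (Sippel’s included) typically check the stronger GARP condition via transitive closure, but the logic per pair is the one shown here.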

The theory is nothing but an empty tautology — and pondering Reinhard Sippel’s experimental results and Nicholas Georgescu-Roegen’s apt description, a harsh assessment of what the theory accomplishes is inevitable:

Lack of precise definition should not … disturb us in moral sciences, but improper concepts constructed by attributing to man faculties which he actually does not possess, should. And utility is such an improper concept … [P]erhaps, because of this impasse … some economists consider the approach offered by the theory of choice as a great progress … This is simply an illusion, because even though the postulates of the theory of choice do not use the terms ‘utility’ or ‘satisfaction’, their discussion and acceptance require that they should be translated into the other vocabulary … A good illustration of the above point is offered by the ingenious theory of the consumer constructed by Samuelson.

Why monetarism — and ‘New Keynesianism’ — failed

13 April, 2016 at 19:28 | Posted in Economics | 3 Comments

Paul Krugman has a post up today on why monetarism has more or less disappeared from economics nowadays. Milton Friedman’s project was, according to Krugman, doomed to failure. The key point for this argument is the following:

On the intellectual side, the “neoclassical synthesis” — of which Friedman-style monetarism was essentially part, despite his occasional efforts to make it seem completely different — was inherently an awkward construct. Economists were urged to build everything from “micro foundations” — which was taken to mean perfect rationality and clearing markets, not realistic descriptions of individual behavior. But to get a macro picture that looked anything like the real world, and which justified monetary activism, you needed to assume that for some reason wages and prices were slow to adjust.

Sounds familiar, doesn’t it? Yes, indeed, that is exactly what Krugman’s ‘New Keynesian’ buddies — Greg Mankiw, Olivier Blanchard, David Romer, Simon Wren-Lewis et consortes — are doing today!

So, to be consistent with his own argument, Krugman has to conclude that their project, too, is ‘doomed to failure.’

Mirabile dictu!

Back in 1994 Laurence Ball and Greg Mankiw argued that

although traditionalists are often called ‘New Keynesians,’ this label is a misnomer. They could just as easily be called ‘New Monetarists.’

That is still true today — the macroeconomics of people like Greg Mankiw and Paul Krugman has, theoretically and methodologically, a lot more to do with Milton Friedman than with John Maynard Keynes.

Uncertainty — the crucial question

13 April, 2016 at 13:01 | Posted in Economics | Comments Off on Uncertainty — the crucial question

It may be argued … that the betting quotient and credibility are substitutable in the same sense in which two commodities are: less bread but more meat may leave the consumer as well off as before. If this were so, then clearly expectation could be reduced to a unidimensional concept … However, the substitutability of consumers’ goods rests upon the tacit assumption that all commodities contain something — called utility — in a greater or less degree; substitutability is therefore another name for compensation of utility. The crucial question in expectation then is whether credibility and betting quotient have a common essence so that compensation of this common essence would make sense.

 

Just like Keynes underlined with his concept of ‘weight of argument,’ Georgescu-Roegen, with his similar concept of ‘credibility,’ underlines the impossibility of reducing uncertainty to risk and thereby of describing choice under uncertainty with a unidimensional probability concept.

In ‘modern’ macroeconomics — Dynamic Stochastic General Equilibrium, New Synthesis, New Classical and New ‘Keynesian’ — variables are treated as if drawn from a known ‘data-generating process’ that unfolds over time and on which we therefore have access to heaps of historical time-series. If we do not assume that we know the ‘data-generating process’ – if we do not have the ‘true’ model – the whole edifice collapses. And of course it has to. I mean, who really honestly believes that we should have access to this mythical Holy Grail, the data-generating process?

‘Modern’ macroeconomics obviously did not anticipate the enormity of the problems that unregulated ‘efficient’ financial markets created. Why? Because it builds on the myth of us knowing the ‘data-generating process’ and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

This is like saying that you are going on a holiday trip and know that the chance of sunny weather is at least 30%, and that this is enough for you to decide whether or not to bring your sunglasses. You are supposed to be able to calculate the expected utility from the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.

But as both Georgescu-Roegen and Keynes convincingly argued, this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10% and according to another – equally good – model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to.
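The sunglasses example can be put in code. The sketch below is a hypothetical illustration — all payoffs and model probabilities are invented — contrasting the two situations: under known risk, a single expected-utility calculation settles the matter; under Georgescu-Roegen–Keynes uncertainty, where two equally good models disagree and cannot be weighted against each other, no single expected utility exists and one is forced back on some non-probabilistic decision rule — here a cautious maximin, used purely for illustration.

```python
# A hypothetical sunglasses example: all payoffs and probabilities invented.

def expected_utility(p_sun, u_sun, u_rain):
    """Standard expected utility for a known probability of sunny weather."""
    return p_sun * u_sun + (1 - p_sun) * u_rain

# Case 1: risk -- a single 'true' probability of sun (30%) is assumed known.
eu_bring = expected_utility(0.30, u_sun=10, u_rain=-4)  # bring sunglasses
eu_leave = expected_utility(0.30, u_sun=-5, u_rain=0)   # leave them at home
decision_under_risk = "bring" if eu_bring > eu_leave else "leave"

# Case 2: uncertainty -- two equally good models disagree (10% vs 40%) and
# no weighting between them is available, so no single expected utility
# exists.  A cautious maximin rule (one choice among many) ranks each act
# by its worst expected utility across the model set.
models = [0.10, 0.40]
worst_bring = min(expected_utility(p, 10, -4) for p in models)
worst_leave = min(expected_utility(p, -5, 0) for p in models)
decision_under_uncertainty = "bring" if worst_bring > worst_leave else "leave"

print(decision_under_risk, decision_under_uncertainty)  # bring leave
```

Note that the two procedures can point in different directions — here expected utility says bring the sunglasses while maximin says leave them — not because the data changed, but because uncertainty is not risk.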

In the end this is what it all boils down to. We all know that many activities, relations, processes and events are of the Georgescu-Roegen-Keynesian uncertainty-type. The data do not unequivocally single out one decision as the only ‘rational’ one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

Some macroeconomists, however, still want to be able to use their hammer. So they pretend that the world looks like a nail and that uncertainty can be reduced to risk, and they construct their mathematical models on that assumption. The result: financial crises and economic havoc.

How much better it would be — and how much smaller the risk of lulling ourselves into the comforting thought that we know everything, that everything is measurable and under our control — if we could instead simply admit that we often do not know, and that we have to live with that uncertainty as best we can.

Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheel is a sure recipe for only one thing — economic disaster.

 

Post-Keynesian economics

13 April, 2016 at 11:09 | Posted in Economics | 4 Comments

Yours truly gives a talk/seminar on Wednesday 27 April at 3:30 pm in Hedénsalen at ABF-Stockholm (Sveavägen 41). The topic is

WHAT IS POST-KEYNESIAN ECONOMICS?

Do come, listen, and join the discussion.

The need for greater pluralism in economics is nowadays intensely discussed among economists and students all over the world. So take the chance to get to know one of the more important heterodox alternatives to the dominant neoclassical theory of modern economics.

Aggregate production functions — neoclassical fairytales

12 April, 2016 at 18:33 | Posted in Economics | Comments Off on Aggregate production functions — neoclassical fairytales

When one works – as one must at an aggregate level – with quantities measured in value terms, the appearance of a well-behaved aggregate production function tells one nothing at all about whether there really is one. Such an appearance stems from the accounting identity that relates the value of outputs to the value of inputs – nothing more.

All these facts should be well known. They are not, or, if they are, their implications are simply ignored by macroeconomists who go on treating the aggregate production function as the most fundamental construct of neoclassical macroeconomics …

The consequences of the non-existence of aggregate production functions have been too long overlooked. I am reminded of the story that, during World War II, a sign in an airplane manufacturing plant read: “The laws of aerodynamics tell us that the bumblebee cannot fly. But the bumblebee does fly, and, what is more, it makes a little honey each day.” I don’t know about bumblebees, but any honey supposedly made by aggregate production functions may well be bad for one’s health.

Attempts to explain the impossibility of using aggregate production functions in practice are often met with great hostility, even outright anger. To that I say … that the moral is: “Don’t interfere with fairytales if you want to live happily ever after.”

Franklin Fisher

Neoclassical production functions are fairytales generating pure fictional results. So why do mainstream economists still use these useless constructs? Probably because it’s tough for people to admit that what they have built their academic careers around is nothing but meaningless nonsense on stilts.
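Fisher’s accounting-identity point can be illustrated with a toy simulation, in the spirit of Anwar Shaikh’s ‘humbug’ production function. The sketch below is hypothetical throughout — no real data, and no production technology specified anywhere. It generates value series that satisfy nothing but the identity Y = wL + rK with a constant labour share and smoothly trending factor prices, then fits a log-linear ‘production function’ in per-worker terms. The fit comes out nearly perfect anyway:

```python
# Toy illustration: the series below satisfy ONLY the accounting identity
# Y = wL + rK with a constant labour share a -- no production technology
# is specified anywhere.  All numbers are invented.
import math
import random

random.seed(0)
a = 0.7                     # constant labour share wL/Y (assumed)
T = 50
ln_y, ln_k = [], []         # log output per worker, log capital per worker
for t in range(T):
    ln_w = 0.02 * t + random.gauss(0, 0.02)  # trending wage (invented)
    ln_r = 0.01 * t + random.gauss(0, 0.02)  # trending profit rate (invented)
    # From the identity alone: Y/L = w/a  and  K/L = (1-a)(Y/L)/r.
    ln_y.append(ln_w - math.log(a))
    ln_k.append(math.log(1 - a) + ln_y[-1] - ln_r)

# OLS of ln(Y/L) on ln(K/L): the apparent 'Cobb-Douglas elasticity'.
my, mk = sum(ln_y) / T, sum(ln_k) / T
cov = sum((k - mk) * (y - my) for k, y in zip(ln_k, ln_y))
var_k = sum((k - mk) ** 2 for k in ln_k)
slope = cov / var_k
r2 = cov ** 2 / (var_k * sum((y - my) ** 2 for y in ln_y))
print(f"fitted 'elasticity' = {slope:.2f}, R^2 = {r2:.3f}")
```

The near-perfect R² and the ‘elasticity’ (here around 2, nowhere near the 0.3 capital share) are artefacts of the identity and the trends — which is precisely why the appearance of a well-behaved aggregate fit tells one nothing about whether an aggregate production function really exists.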

On minimum wage and value-free economics

12 April, 2016 at 13:44 | Posted in Economics | Comments Off on On minimum wage and value-free economics

I’ve subsequently stayed away from the minimum wage literature for a number of reasons. First, it cost me a lot of friends. People that I had known for many years, for instance, some of the ones I met at my first job at the University of Chicago, became very angry or disappointed. They thought that in publishing our work we were being traitors to the cause of economics as a whole.

David Card

Back in 1992, New Jersey raised the minimum wage by 18 per cent while its neighbouring state, Pennsylvania, left its minimum wage unchanged. Employment in New Jersey should — according to mainstream economics textbooks — have fallen relative to Pennsylvania. However, when economists David Card and Alan Krueger gathered information on fast-food restaurants in the two states, it turned out that employment had actually increased in New Jersey relative to Pennsylvania. Counter to mainstream theory, the demand curve for labour evidently did not slope downwards.
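The logic of Card and Krueger’s natural experiment is a simple difference-in-differences comparison. The employment figures in the sketch below are hypothetical placeholders, not the published estimates:

```python
# Difference-in-differences sketch.  The employment figures are hypothetical
# placeholders, NOT Card & Krueger's published estimates.

def diff_in_diff(treat_before, treat_after, ctrl_before, ctrl_after):
    """Change in the treated state net of the change in the control state."""
    return (treat_after - treat_before) - (ctrl_after - ctrl_before)

# Hypothetical average full-time-equivalent employment per restaurant.
nj_before, nj_after = 20.0, 21.0  # New Jersey: minimum wage raised 18%
pa_before, pa_after = 23.0, 22.0  # Pennsylvania: minimum wage unchanged

effect = diff_in_diff(nj_before, nj_after, pa_before, pa_after)
print(effect)  # 2.0 -- employment rose in NJ relative to PA
```

Netting out the control state’s change is what lets the comparison isolate the minimum-wage effect from whatever hit both states’ fast-food sectors at once.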

Lo and behold!

But of course — when facts and theory don’t agree, it’s the facts that have to be wrong …

Mainstream economics — non-ideological and value-free? I’ll be dipped!


In the long run — economics as ideology

12 April, 2016 at 10:24 | Posted in Economics | Comments Off on In the long run — economics as ideology


Although I never believed it when I was young and held scholars in great respect, it does seem to be the case that ideology plays a large role in economics. How else to explain Chicago’s acceptance of not only general equilibrium but a particularly simplified version of it as ‘true’ or as a good enough approximation to the truth? Or how to explain the belief that the only correct models are linear and that the von Neumann prices are those to which actual prices converge pretty smartly? This belief unites Chicago and the Classicals; both think that the ‘long-run’ is the appropriate period in which to carry out analysis. There is no empirical or theoretical proof of the correctness of this. But both camps want to make an ideological point. To my mind that is a pity since clearly it reduces the credibility of the subject and its practitioners.

Frank Hahn

Macroeconomic models — beautiful but irrelevant

11 April, 2016 at 12:08 | Posted in Economics | Comments Off on Macroeconomic models — beautiful but irrelevant

Roman Frydman is Professor of Economics at New York University and a long-time critic of the rational expectations hypothesis. In his seminal 1982 American Economic Review article Towards an Understanding of Market Processes: Individual Expectations, Learning, and Convergence to Rational Expectations Equilibrium — an absolute must-read for anyone with a serious interest in understanding the issues in the present discussion of rational expectations as a modeling assumption — he showed that macroeconomic models founded on the rational expectations hypothesis are inadequate as representations of economic agents’ decision-making.

Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies and institutions are those based on rational expectations and representative actors. As yours truly has tried to show in On the use and misuse of theories and models in economics there is really no support for this conviction at all. On the contrary. If we want to have anything of interest to say on real economies, financial crisis and the decisions and choices real people make, it is high time to place macroeconomic models building on representative actors and rational expectations-microfoundations where they belong – in the dustbin of history.

For if this microfounded macroeconomics has nothing to say about the real world and the economic problems out there, why should we care about it? It is not enough to be able to construct ‘beautiful’ models if they are irrelevant for explaining and understanding real-world phenomena. The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging is de facto made, macroeconomic model-building is little more than hand waving that gives us little warrant for making inductive inferences from models to the real world. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

Contemporary economists’ reliance on mechanical rules to understand – and influence – economic outcomes extends to macroeconomic policy as well, and often draws on an authority, John Maynard Keynes, who would have rejected their approach. Keynes understood early on the fallacy of applying such mechanical rules. “We have involved ourselves in a colossal muddle,” he warned, “having blundered in the control of a delicate machine, the working of which we do not understand.”

To put it bluntly, the belief that an economist can fully specify in advance how aggregate outcomes – and thus the potential level of economic activity – unfold over time is bogus …

Roman Frydman & Michael Goldberg

The real macroeconomic challenge is to face and accept uncertainty and still try to explain why economic transactions take place — instead of simply conjuring the problem away by assuming rational expectations and treating uncertainty as if it were possible to reduce it to stochastic risk. That is scientific cheating. And it has been going on for far too long now.

‘Some are like water, some are like the heat’

11 April, 2016 at 09:41 | Posted in Varia | Comments Off on ‘Some are like water, some are like the heat’

 

When science becomes dogmatism

9 April, 2016 at 12:48 | Posted in Economics | 1 Comment

Abstraction is the most valuable ladder of any science. In the social sciences, as Marx forcefully argued, it is all the more indispensable since there ‘the force of abstraction’ must compensate for the impossibility of using microscopes or chemical reactions. However, the task of science is not to climb up the easiest ladder and remain there forever distilling and redistilling the same pure stuff. Standard economics, by opposing any suggestions that the economic process may consist of something more than a jigsaw puzzle with all its elements given, has identified itself with dogmatism. And this is a privilegium odiosum that has dwarfed the understanding of the economic process wherever it has been exercised.

Modern economics — an abstract monstrosity

8 April, 2016 at 16:46 | Posted in Economics | 1 Comment

The paradox of modern economics is that while the computers are churning out more and more figures, giving more and more spurious precision to economic pronouncements, the assumptions behind this fiesta of quantification are looking less and less safe. Economic model making was never easier to undertake and never more disconnected from reality.

Somewhere along the way economics took a wrong turn. What has occurred, and what has been vastly accentuated by the information revolution and its impact, is that economists have drained economic analysis both out of philosophy and out of real life, and have produced an abstract monstrosity, a world of models and assumptions increasingly disconnected from everyday experience and from discernible patterns of human behaviour, whether at the individual or the institutional level.

As a result, economists have not only failed to discern, explain or predict most of the ills which beset the world economy and society, but they have actively encouraged a deformity of perception amongst policy makers and communicators …

This misleading `black box’ view of the world purveyed by the economics profession (with heroic exceptions), at all levels from the most intimate micro workings of markets to the macro level of nation states and their jurisdictions, has been vastly reinforced by compliant statisticians who have brought a spurious precision and quantification to entities and concepts which may not in fact have any existence outside economic theory …

Yours truly on economics rules and deductivism

7 April, 2016 at 20:45 | Posted in Economics | Comments Off on Yours truly on economics rules and deductivism

Yours truly has two (!) articles in the latest issue (74) of Real-World Economics Review.

One is a review of Dani Rodrik’s Economics Rules (Oxford University Press 2015) — When the model becomes the message — a critique of Rodrik — on which I ran a series of blog posts here in December last year.

Rodrik’s book is one of those rare examples where a mainstream economist — instead of just looking the other way — takes his time to ponder on the tough and deep science-theoretic and methodological questions that underpin the economics discipline.

There’s much in the book to like and appreciate, but there is also a very disturbing apologetic tendency to blame all the shortcomings on individual economists while depicting economics itself as a problem-free smorgasbord of models: just choose the appropriate model from the immense and varied smorgasbord and there’s no problem. I sure wish it were that simple, but having written more than ten books on the history and methodology of economics, and having spent almost forty years ‘among them econs,’ I have to confess I don’t quite recognize the picture …

The other article — Deductivism — the fundamental flaw of mainstream economics — argues that the more mainstream economics aspires to the ‘rigour’ and ‘precision’ of formal logic, the less it has to say about the real world. Although the formal-logic focus may deepen our insights into the notion of validity, the rigour and precision come at a devastatingly important trade-off: the higher the level of rigour and precision, the smaller the range of real-world application.

To read the other articles — by e.g. Thomas Palley, Alejandro Nadal, and Robert Locke — make sure to subscribe to RWER (click here).

Keynes’ critique of scientific atomism

7 April, 2016 at 19:12 | Posted in Theory of Science & Methodology | Comments Off on Keynes’ critique of scientific atomism

The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be much less simple than the bare principle of uniformity. They appear to assume something much more like what mathematicians call the principle of the superposition of small effects, or, as I prefer to call it, in this connection, the atomic character of natural law. The system of the material universe must consist, if this kind of assumption is warranted, of bodies which we may term (without any implication as to their size being conveyed thereby) legal atoms, such that each of them exercises its own separate, independent, and invariable effect, a change of the total state being compounded of a number of separate changes each of which is solely due to a separate portion of the preceding state. We do not have an invariable relation between particular bodies, but nevertheless each has on the others its own separate and invariable effect, which does not change with changing circumstances, although, of course, the total effect may be changed to almost any extent if all the other accompanying causes are different. Each atom can, according to this theory, be treated as a separate cause and does not enter into different organic combinations in each of which it is regulated by different laws …

The scientist wishes, in fact, to assume that the occurrence of a phenomenon which has appeared as part of a more complex phenomenon, may be some reason for expecting it to be associated on another occasion with part of the same complex. Yet if different wholes were subject to laws qua wholes and not simply on account of and in proportion to the differences of their parts, knowledge of a part could not lead, it would seem, even to presumptive or probable knowledge as to its association with other parts. Given, on the other hand, a number of legally atomic units and the laws connecting them, it would be possible to deduce their effects pro tanto without an exhaustive knowledge of all the coexisting circumstances.

Keynes’ incisive critique is of course of interest in general for all sciences, but I think it is also of special interest in economics as a background to much of Keynes’ doubts about inferential statistics and econometrics.

Since econometrics doesn’t content itself with only making ‘optimal predictions’ but also aspires to explain things in terms of causes and effects, econometricians need loads of assumptions. Most important of these are the ‘atomistic’ assumptions of additivity and linearity.

These assumptions — as underlined by Keynes — are of paramount importance and ought to be much more argued for — on both epistemological and ontological grounds — if they are to be used at all.
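A stylised example of what the additivity worry amounts to: if the true data-generating process is non-additive — below, the outcome depends on the product of two variables — then the ‘effect’ of one variable estimated by a linear model is not a stable atom that can be exported between contexts. The data are entirely invented.

```python
# Stylised data: the true process is non-additive (y = x1 * x2), yet we fit
# a linear 'law' y = b * x1 in two regimes that differ only in the level of
# the background variable x2.

def ols_slope(xs, ys):
    """Slope of an ordinary least-squares fit of ys on xs (with intercept)."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
y_regime_a = [x * 1.0 for x in x1]  # background variable x2 = 1
y_regime_b = [x * 3.0 for x in x1]  # background variable x2 = 3

slope_a = ols_slope(x1, y_regime_a)  # estimated 'effect' of x1 in regime A
slope_b = ols_slope(x1, y_regime_b)  # estimated 'effect' of x1 in regime B
print(slope_a, slope_b)  # 1.0 3.0 -- same 'cause', no stable atomic 'law'
```

Each regression fits its own regime perfectly, yet the estimated ‘effect’ of x1 triples when the background changes — exactly the kind of non-atomic, context-dependent causation that additive-linear econometric models assume away.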

Limiting model assumptions in economic science always have to be closely examined. If we are to show that the mechanisms or causes we isolate and handle in our models are stable — in the sense that they do not change when we ‘export’ them to our ‘target systems’ — we have to show that they hold under more than ceteris paribus conditions. Otherwise they are a fortiori of only limited value for our understanding, explanation, or prediction of real economic systems.

Econometrics may be an informative tool for research. But if its practitioners do not investigate and make an effort to provide a justification for the credibility of the assumptions on which they erect their building, it will not fulfil its tasks. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics like Keynes — and yours truly — will continue to consider its ultimate argument a mixture of rather unhelpful metaphors and metaphysics.

The marginal return on its ever-higher technical sophistication in no way makes up for the lack of serious under-labouring of its deeper philosophical and methodological foundations that Keynes already complained about. Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality, and a rigorous application of econometric methods in economics really presupposes that the phenomena of our real-world economies are ruled by stable causal relations.

But real-world social systems are not governed by stable causal mechanisms or capacities. The kinds of ‘laws’ and relations that econometrics has established are laws and relations about entities in models that presuppose causal mechanisms to be atomistic and additive. As Keynes argued, when causal mechanisms operate in the real world, they do so in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so as a rule only because we engineered them for that purpose. Outside man-made ‘nomological machines’ they are rare, or even non-existent.
