Economics – a science with amnesia

29 Dec, 2012 at 12:31 | Posted in Economics | Comments Off on Economics – a science with amnesia

A couple of days ago yours truly had a piece on why mainstream economists have tended to go astray in their tool-sheds, and thereby have actually contributed to causing today’s economic crisis rather than to solving it.

J Bradford DeLong – professor of economics at Berkeley – writes on a related theme on Project Syndicate:

It is the scale of the catastrophe that astonishes me. But what astonishes me even more is the apparent failure of academic economics to take steps to prepare itself for the future. “We need to change our hiring patterns,” I expected to hear economics departments around the world say in the wake of the crisis.

The fact is that we need fewer efficient-markets theorists and more people who work on microstructure, limits to arbitrage, and cognitive biases. We need fewer equilibrium business-cycle theorists and more old-fashioned Keynesians and monetarists. We need more monetary historians and historians of economic thought and fewer model-builders …


Yet that is not what economics departments are saying nowadays.

Perhaps I am missing what is really going on. Perhaps economics departments are reorienting themselves after the Great Recession in a way similar to how they reoriented themselves in a monetarist direction after the inflation of the 1970’s. But if I am missing some big change that is taking place, I would like somebody to show it to me.

Perhaps academic economics departments will lose mindshare and influence to others – from business schools and public-policy programs to political science, psychology, and sociology departments. As university chancellors and students demand relevance and utility, perhaps these colleagues will take over teaching how the economy works and leave academic economists in a rump discipline that merely teaches the theory of logical choice.

Or perhaps economics will remain a discipline that forgets most of what it once knew and allows itself to be continually distracted, confused, and in denial. If that were to happen, we would all be worse off.

Resisting intuition

28 Dec, 2012 at 13:40 | Posted in Varia | 6 Comments

One of the main functions of System 2 is to monitor and control thoughts and actions “suggested” by System 1 … For an example, here is a simple puzzle. Do not try to solve it but listen to your intuition:

     A bat and ball cost $1.10.

     The bat costs one dollar more than the ball.

     How much does the ball cost?

A number came to your mind. The number, of course, is 10: 10 cents. The distinctive mark of this easy puzzle is that it evokes an answer that is intuitive, appealing, and wrong … The right answer is 5 cents.
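The algebra behind the correct answer is a one-liner: if the ball costs b, then b + (b + 1.00) = 1.10, so b = 0.05. A quick, purely illustrative check in Python (using Decimal to avoid floating-point noise):

```python
from decimal import Decimal

# bat + ball = 1.10 and bat = ball + 1.00
# => ball + (ball + 1.00) = 1.10  =>  ball = (1.10 - 1.00) / 2
total, diff = Decimal("1.10"), Decimal("1.00")
ball = (total - diff) / 2
bat = ball + diff
print(ball, bat)  # 0.05 1.05 -- not the intuitive 10 cents
```

With a 10-cent ball the bat would cost $1.10 and the pair $1.20, which is exactly the check System 2 fails to run.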

Annie “Margaret Thatcher” Lööf in a well-deserved downhill slide

27 Dec, 2012 at 12:09 | Posted in Politics & Society | 5 Comments

According to a Sifo survey commissioned by Aftonbladet, confidence in Centre Party leader Annie Lööf is now plummeting.

Only one in five voters now has great or very great confidence in her.

A year ago it was almost one in three.

Surprising? Hardly – few Swedes warm to what Mrs Lööf has argued and moved for in recent years:

Introduce a flat tax (lower taxes for high-income earners)
Abolish employment protection legislation
Restrict the right to strike
Introduce market-rate rents
Sell off SvT and SR
Take Sweden into NATO
Expand nuclear power

With a political agenda like that, it is only natural that the Centre Party’s entire electorate will soon fit on Stureplan.

Reality is now catching up with our very own Margaret Thatcher. An awakening is drawing near from the neoliberal nightmare into which this political broiler and cliché-monger has managed to drag the once so proud Centre Party …


Juloratoriet

25 Dec, 2012 at 14:00 | Posted in Varia | Comments Off on Juloratoriet

Numbingly beautiful and almost unbearably, painfully moving.
Stefan Nilsson wrote the music for the film adaptation of Göran Tunström’s epic masterpiece Juloratoriet (The Christmas Oratorio).

The Angels’ Share

25 Dec, 2012 at 10:55 | Posted in Varia | 3 Comments


Little Drummer Boy

24 Dec, 2012 at 11:53 | Posted in Varia | Comments Off on Little Drummer Boy


Jussi Björling – O Holy Night

23 Dec, 2012 at 18:45 | Posted in Varia | 1 Comment


Why econometrics still hasn’t delivered (wonkish)

23 Dec, 2012 at 12:04 | Posted in Statistics & Econometrics | 3 Comments

In the article The Scientific Model of Causality renowned econometrician and Nobel laureate James Heckman writes (emphasis added):

 A model is a set of possible counterfactual worlds constructed under some rules. The rules may be laws of physics, the consequences of utility maximization, or the rules governing social interactions … A model is in the mind. As a consequence, causality is in the mind.

Even though this is a standard view among econometricians, it’s – at least from a realist point of view – rather untenable. The reason we as scientists are interested in causality is that it’s a part of the way the world works. We represent the workings of causality in the real world by means of models, but that doesn’t mean that causality isn’t a fact pertaining to relations and structures that exist in the real world. If it were only “in the mind,” most of us couldn’t care less.

The reason behind Heckman’s and most other econometricians’ nominalist-positivist view of science and models is the belief that science can only deal with observable regularity patterns of a more or less lawlike kind. Only data matter, and trying to (ontologically) go beyond observed data in search of the underlying real factors and relations that generate the data is not admissible. Everything has to take place in the econometric mind’s model, since according to the econometric (epistemologically based) methodology the real factors and relations are beyond reach, allegedly being both unobservable and unmeasurable. This also means that instead of treating model-based findings as interesting clues for digging deeper into real structures and mechanisms, they are treated as the end points of the investigation. Or as Asad Zaman puts it in Methodological Mistakes and Econometric Consequences:

Instead of taking it as a first step, as a clue to explore, conventional econometric methodology terminates at the discovery of a good fit … Conventional econometric methodology is a failure because it is merely an attempt to find patterns in the data, without any tools to assess whether or not the given pattern reflects some real forces which shape the data.

The critique put forward here is in line with what mathematical statistician David Freedman writes in Statistical Models and Causal Inference (2010):

In my view, regression models are not a particularly good way of doing empirical work in the social sciences today, because the technique depends on knowledge that we do not have. Investigators who use the technique are not paying adequate attention to the connection – if any – between the models and the phenomena they are studying. Their conclusions may be valid for the computer code they have created, but the claims are hard to transfer from that microcosm to the larger world …

Given the limits to present knowledge, I doubt that models can be rescued by technical fixes. Arguments about the theoretical merit of regression or the asymptotic behavior of specification tests for picking one version of a model over another seem like the arguments about how to build desalination plants with cold fusion as the energy source. The concept may be admirable, the technical details may be fascinating, but thirsty people should look elsewhere …

Causal inference from observational data presents many difficulties, especially when underlying mechanisms are poorly understood. There is a natural desire to substitute intellectual capital for labor, and an equally natural preference for system and rigor over methods that seem more haphazard. These are possible explanations for the current popularity of statistical models.

Indeed, far-reaching claims have been made for the superiority of a quantitative template that depends on modeling – by those who manage to ignore the far-reaching assumptions behind the models. However, the assumptions often turn out to be unsupported by the data. If so, the rigor of advanced quantitative methods is a matter of appearance rather than substance.

Econometrics is basically a deductive method. Given the assumptions (such as manipulability, transitivity, Reichenbach probability principles, separability, additivity, linearity, etc.) it delivers deductive inferences. The problem, of course, is that we will never completely know when the assumptions are right. Real target systems are seldom epistemically isomorphic to axiomatic-deductive models/systems, and even if they were, we would still have to argue for the external validity of the conclusions reached from within these epistemically convenient models/systems. Causal evidence generated by statistical/econometric procedures like regression analysis may be valid in “closed” models, but what we usually are interested in is causal evidence in the real target system we happen to live in.

Most advocates of econometrics and regression analysis want to have deductively automated answers to fundamental causal questions. Econometricians think – as David Hendry expressed it in Econometrics – alchemy or science? (1980) – that they “have found their Philosophers’ Stone; it is called regression analysis and is used for transforming data into ‘significant results!’” But as David Freedman poignantly notes in Statistical Models: “Taking assumptions for granted is what makes statistical techniques into philosophers’ stones.” To apply “thin” methods we have to have “thick” background knowledge of what’s going on in the real world, and not in idealized models. Conclusions can only be as certain as their premises – and that also applies to the quest for causality in econometrics and regression analysis.
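Freedman’s and Zaman’s point can be made concrete with the classic spurious-regression experiment: regress one random walk on another, entirely independent, random walk, and conventional t-tests will report a “significant” relationship most of the time – a perfect fit-finding machine with no real force behind it. A purely illustrative simulation sketch, assuming numpy is available (the helper `t_of_slope` is my own naming):

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 500

def t_of_slope(x, y):
    """t-statistic of the OLS slope of y on x (with an intercept)."""
    X = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(x) - 2)
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1] / se

# Regress one independent random walk on another, many times over.
rejections = 0
for _ in range(reps):
    x = np.cumsum(rng.standard_normal(n))  # random walk 1
    y = np.cumsum(rng.standard_normal(n))  # random walk 2, independent of x
    if abs(t_of_slope(x, y)) > 1.96:       # "significant at the 5% level"
        rejections += 1

print(f"spuriously 'significant' in {rejections / reps:.0%} of runs")
# typically well over half the runs, despite zero real connection
```

The regression machinery happily finds patterns; nothing in it can tell us whether any real force shapes the data.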

Without requirements of depth, explanations most often do not have practical significance. Only if we search for and find fundamental structural causes, can we hopefully also take effective measures to remedy problems like e.g. unemployment, poverty, discrimination and underdevelopment. A social science must try to establish what relations exist between different phenomena and the systematic forces that operate within the different realms of reality. If econometrics is to progress, it has to abandon its outdated nominalist-positivist view of science and the belief that science can only deal with observable regularity patterns of a more or less law-like kind. Scientific theories ought to do more than just describe event-regularities and patterns – they also have to analyze and describe the mechanisms, structures, and processes that give birth to these patterns and eventual regularities.

Sporadic blogging

22 Dec, 2012 at 12:35 | Posted in Varia | Comments Off on Sporadic blogging

Christmas is here again – and with five kids in the family, blogging can’t have top priority. Regular blogging will be resumed late next week.

Winter is not my season, so I’m already longing for when the view from my library once again looks like this:

Dutch books and money pumps (wonkish)

22 Dec, 2012 at 10:43 | Posted in Statistics & Econometrics | 4 Comments

Neoclassical economics nowadays usually assumes that agents that have to make choices under conditions of uncertainty behave according to Bayesian rules (preferably the ones axiomatized by Ramsey (1931), de Finetti (1937) or Savage (1954)) – that is, they maximize expected utility with respect to some subjective probability measure that is continually updated according to Bayes’ theorem. If not, they are supposed to be irrational, and ultimately – via some “Dutch book” or “money pump” argument – susceptible to being ruined by some clever “bookie”.

Bayesianism reduces questions of rationality to questions of internal consistency (coherence) of beliefs, but – even granted this questionable reductionism – do rational agents really have to be Bayesian? As I have argued elsewhere (e.g. here and here), there is no strong warrant for believing so, but in this post I want to make a point about the informational requirement that the economic ilk of Bayesianism presupposes.
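The “Dutch book” threat is easy to make concrete. Suppose an agent’s degrees of belief are incoherent – say they price a bet on A at 0.6 and a bet on not-A also at 0.6, summing to 1.2. A bookie who sells the agent a $1-payoff bet on each outcome at the agent’s own prices locks in a sure profit. A minimal, purely illustrative sketch:

```python
# Incoherent degrees of belief: the betting prices sum to more than 1.
p_A, p_not_A = 0.6, 0.6

# The bookie sells a $1-payoff bet on each outcome at the agent's own prices.
cost = p_A + p_not_A  # the agent pays 1.20 up front

for a_occurs in (True, False):
    payoff = 1.0      # exactly one of the two bets pays off, whatever happens
    net = payoff - cost
    print(f"A occurs: {a_occurs} -> agent's net: {net:+.2f}")  # -0.20 either way
```

Coherent prices (summing to exactly 1) make the sure loss disappear – which is precisely the content of the Dutch book argument.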

In many of the situations that are relevant to economics one could argue that there is simply not enough adequate and relevant information to ground beliefs of a probabilistic kind, and that in those situations it is not really possible, in any relevant way, to represent an individual’s beliefs in a single probability measure.

Say you have come to learn (based on your own experience and tons of data) that the probability of your becoming unemployed in Sweden is 10%. Having moved to another country (where you have no experience of your own and no data) you have no information on unemployment and a fortiori nothing on which to build any probability estimate. A Bayesian would, however, argue that you would have to assign probabilities to the mutually exclusive alternative outcomes and that these have to add up to 1 if you are rational. That is, in this case – and based on symmetry – a rational individual would have to assign a probability of 50% to becoming unemployed and 50% to becoming employed.

That feels intuitively wrong, though, and I guess most people would agree. Bayesianism cannot distinguish between symmetry-based probabilities grounded in information and symmetry-based probabilities grounded in an absence of information. In these kinds of situations most of us would rather say that it is simply irrational to be a Bayesian, and better instead to admit that we “simply do not know” or that we feel ambiguous and undecided. Arbitrary and ungrounded probability claims are more irrational than being undecided in the face of genuine uncertainty, so if there is not sufficient information to ground a probability distribution it is better to acknowledge that simpliciter, rather than pretending to possess a certitude that we simply do not possess.

I think this critique of Bayesianism is in accordance with the views of John Maynard Keynes in A Treatise on Probability (1921) and the General Theory (1937). According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but rational expectations. Sometimes we “simply do not know.” Keynes would not have accepted the view of Bayesian economists, according to whom expectations “tend to be distributed, for the same information set, about the prediction of the theory.” Keynes, rather, thinks that we base our expectations on the confidence or “weight” we put on different events and alternatives. To Keynes expectations are a question of weighing probabilities by “degrees of belief”, beliefs that have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents modeled by Bayesian economists.

Money and finance

21 Dec, 2012 at 17:44 | Posted in Economics | 9 Comments


Oh dear, oh dear, Krugman gets it so wrong, so wrong, on the state of macroeconomics

21 Dec, 2012 at 11:30 | Posted in Economics, Theory of Science & Methodology | 2 Comments

Back in 1938 Keynes wrote in a letter to Harrod:

Economics is a science of thinking in terms of models joined to the art of choosing models which are relevant to the contemporary world. It is compelled to be this, because, unlike the typical natural science, the material to which it is applied is, in too many respects, not homogeneous through time. The object of a model is to segregate the semi-permanent or relatively constant factors from those which are transitory or fluctuating so as to develop a logical way of thinking about the latter, and of understanding the time sequences to which they give rise in particular cases … Good economists are scarce because the gift for using “vigilant observation” to choose good models, although it does not require a highly specialised intellectual technique, appears to be a very rare one.

I came to think of this passage when I read “sort of New Keynesian” economist Paul Krugman’s blog yesterday. Krugman weighs in on the ongoing discussion on the state of macro, arguing that even though he and other “sort of New Keynesian” macroeconomists use the same “equipment” as RBC-New-Classical-freshwater macroeconomists, he resents the allegation that they are a fortiori sharing the same endeavour. Krugman writes:

The real test came when the financial crisis struck, and pretty much to a man freshwater economists not only argued against fiscal stimulus — which is a defensible position — but insisted that there was no possible way to justify stimulus, that such ideas had been refuted and that “nobody” believed in them anymore  … I’m not saying that the [“New Keynesian”] NK approach is necessarily right; but it’s a serious intellectual effort, undertaken by people who thought they were part of an open professional dialogue. Oh, and there’s a lot of evidence for the price stickiness that is central to NK models; again, maybe it doesn’t mean what the theorists think, but surely that evidence ought to be part of any discussion.

Here we get a view that all macroeconomists more or less share the same (mainstream neoclassical) basic theory and “techniques”, so that when we discuss and argue it is only about which special assumptions we choose to make (sticky wages or not). But people like Hyman Minsky, Michal Kalecki, Sidney Weintraub, Johan Åkerman, Gunnar Myrdal, Paul Davidson, Axel Leijonhufvud – and yours truly – do not share any theory or models with Real Business Cycle theorists and “sort of New Keynesians” like Greg Mankiw or Paul Krugman.

It’s nice to see that Krugman explicitly acknowledges what I have argued for many years now – “New Keynesian” macroeconomic models are at heart based on the modelling strategy of RBC and DSGE: representative agents, rational expectations, equilibrium and all that. And yes, they do have some minor idiosyncrasies like “menu costs,” “price rigidities” and “sticky wages.” But the differences are not really that fundamental. The basic model assumptions are the same.

Talking of Krugman, this really shouldn’t come as a surprise. In 1996 Krugman was invited to speak to the European Association for Evolutionary Political Economy. So here – right from the horse’s mouth – I quote from the speech (emphasis added):

I like to think that I am more open-minded about alternative approaches to economics than most, but I am basically a maximization-and-equilibrium kind of guy. Indeed, I am quite fanatical about defending the relevance of standard economic models in many situations …

I may have more sympathy for standard economics than most of you. My criticisms are those of someone who loves the field and has seen that affection repaid. I don’t know if that makes me morally better or worse than someone who criticizes from outside, but anyway it makes me different …

To me, it seems that what we know as economics is the study of those phenomena that can be understood as emerging from the interactions among intelligent, self-interested individuals … Personally, I consider myself a proud neoclassicist. By this I clearly don’t mean that I believe in perfect competition all the way. What I mean is that I prefer, when I can, to make sense of the world using models in which individuals maximize and the interaction of these individuals can be summarized by some concept of equilibrium. The reason I like that kind of model is not that I believe it to be literally true, but that I am intensely aware of the power of maximization-and-equilibrium to organize one’s thinking … 

If anything, this shows what a gross misnomer “New Keynesianism” is. The macroeconomic modelling strategy of people like Greg Mankiw and Paul Krugman has a lot to do with Robert Lucas and Thomas Sargent – and very little, or next to nothing, to do with the founder of macroeconomics, John Maynard Keynes. “New Keynesian” macroeconomic models build on Real Business Cycle foundations, regularly assuming representative actors, rational expectations, market clearing and equilibrium. But if we know that real people and markets cannot be expected to obey these assumptions, the warrant for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be carried over to the real world is obviously lacking. Macroeconomic theorists – regardless of whether they are New Monetarist, New Classical or “New Keynesian” – ought to do some ontological reflection and heed Keynes’s warnings on using thought-models in economics:

The object of our analysis is, not to provide a machine, or method of blind manipulation, which will furnish an infallible answer, but to provide ourselves with an organized and orderly method of thinking out particular problems; and, after we have reached a provisional conclusion by isolating the complicating factors one by one, we then have to go back on ourselves and allow, as well as we can, for the probable interactions of the factors amongst themselves. This is the nature of economic thinking. Any other way of applying our formal principles of thought (without which, however, we shall be lost in the wood) will lead us into error.

Those of us who have not forgotten the history of our discipline, and have not bought the freshwater nursery tale of Lucas et consortes that Keynes was not “serious thinking,” can easily see that there exists a macroeconomic tradition inspired by Keynes – a tradition that has absolutely nothing to do with any New Synthesis or “New Keynesianism.” Its ultimate building-block is the perception of genuine uncertainty and of the fact that people often “simply do not know.” Real actors cannot know everything, and their acts and decisions cannot simply be summed or aggregated without the economist risking succumbing to “the fallacy of composition.”

Instead of basing macroeconomics on unreal and unwarranted generalizations of microeconomic behaviour and relations, it is far better to accept the ontological fact that the future is to a large extent uncertain, and to conduct macroeconomics on the basis of this fact of reality.

The real macroeconomic challenge is to accept uncertainty and still try to explain why economic transactions take place – instead of simply conjuring the problem away by assuming uncertainty to be reducible to stochastic risk. That is scientific cheating. And it has been going on for too long now.

The Keynes-inspired building-blocks are there, but admittedly there is a long way to go before the whole construction is in place. The sooner we are intellectually honest and ready to admit that “modern” neoclassical macroeconomics – “New Keynesian” or not – has come to way’s end, the sooner we can redirect our aspirations and knowledge into more fruitful endeavours.

Building models on “equipment” that assumes the equivalent of portraying people as green creatures from Mars is not a sound foundation. There have to be better ways to optimize our time and scientific endeavours than spending hours and hours working through or constructing irrelevant “New Keynesian” RBC and DSGE macroeconomic models. I would rather recommend that macroeconomists reallocate their time and efforts to constructing better, real and relevant macroeconomic models – models that really help us explain and understand reality.

Macroeconomics wars and rational expectations

20 Dec, 2012 at 14:36 | Posted in Economics | 3 Comments

In the latest issue of Real-World Economics Review (December 2012) yours truly has a paper on the Rational Expectations Hypothesis – Rational expectations – a fallacious foundation for macroeconomics in a non-ergodic world. Similar critique of REH is put forward in a recent book by Roman Frydman and Michael Goldberg:

The belief in the scientific stature of fully predetermined models, and in the adequacy of the Rational Expectations Hypothesis to portray how rational individuals think about the future, extends well beyond asset markets. Some economists go as far as to argue that the logical consistency that obtains when this hypothesis is imposed in fully predetermined models is a precondition of the ability of economic analysis to portray rationality and truth.

For example, in a well-known article published in The New York Times Magazine in September 2009, Paul Krugman (2009, p. 36) argued that Chicago-school free-market theorists “mistook beauty . . . for truth.” One of the leading Chicago economists, John Cochrane (2009, p. 4), responded that “logical consistency and plausible foundations are indeed ‘beautiful’ but to me they are also basic preconditions for ‘truth.’” Of course, what Cochrane meant by plausible foundations were fully predetermined Rational Expectations models. But, given the fundamental flaws of fully predetermined models, focusing on their logical consistency or inconsistency, let alone that of the Rational Expectations Hypothesis itself, can hardly be considered relevant to a discussion of the basic preconditions for truth in economic analysis, whatever “truth” might mean.

There is an irony in the debate between Krugman and Cochrane. Although the New Keynesian and behavioral models, which Krugman favors, differ in terms of their specific assumptions, they are every bit as mechanical as those of the Chicago orthodoxy. Moreover, these approaches presume that the Rational Expectations Hypothesis provides the standard by which to define rationality and irrationality.

In fact, the Rational Expectations Hypothesis requires no assumptions about the intelligence of market participants whatsoever … Rather than imputing superhuman cognitive and computational abilities to individuals, the hypothesis presumes just the opposite: market participants forgo using whatever cognitive abilities they do have. The Rational Expectations Hypothesis supposes that individuals do not engage actively and creatively in revising the way they think about the future. Instead, they are presumed to adhere steadfastly to a single mechanical forecasting strategy at all times and in all circumstances. Thus, contrary to widespread belief, in the context of real-world markets, the Rational Expectations Hypothesis has no connection to how even minimally reasonable profit-seeking individuals forecast the future in real-world markets. When new relationships begin driving asset prices, they supposedly look the other way, and thus either abjure profit-seeking behavior altogether or forgo profit opportunities that are in plain sight.

Roman Frydman & Michael Goldberg: Beyond Mechanical Markets

The Centre Party – a party for neoliberal village idiots on Stureplan

19 Dec, 2012 at 11:22 | Posted in Politics & Society | 3 Comments

Today the Centre Party presented a proposal for a new party programme. It has been drawn up by a working group and will be considered by both the party board and the party congress.

The programme is well written, engaging and tries to be ideologically pure. The Centre Party calls itself liberal, but the question is whether libertarian would not be a better label.

One example is family policy:

“Politics should not determine how many people one may live with or marry, or who is to inherit one’s assets.”

In concrete terms, then, the party wants to introduce polygamy and abolish the statutory share that gives children the right to inherit half of their parents’ estate.

The Centre Party also opens the door to abolishing compulsory schooling, scrapping employment protection, introducing free immigration, and having Sweden and the EU governed federally. Like the USA.

The programme moves in a world of political ideas in which taxation is fundamentally theft and the state exists because citizens have ceded power in exchange for protection.

Once upon a time the Centre Party’s ideology extended no further than which road should be repaired or how to get higher milk subsidies. Now it has swung to the other extreme.

The draft programme is political theory at a high level – the great philosophers speak between the lines – but as practical politics it is utterly divorced from reality.

Look at the political youth leagues and you will often find similar programmes: well written, engaging and ideologically pure.

Then the young club members grow up and learn that politics does not actually work the way it does in study circles run by ABF or Medborgarskolan.

Except in the Centre Party, where they become party leader.

Anders Lindberg (Aftonbladet)

Markov’s Inequality

18 Dec, 2012 at 20:02 | Posted in Statistics & Econometrics | 2 Comments

One of the most beautiful results of probability theory is Markov’s inequality (after the Russian mathematician Andrei Markov (1856-1922)):

If X is a non-negative stochastic variable (X ≥ 0) with a finite expectation value E(X), then for every a > 0

P{X ≥ a} ≤ E(X)/a

If the production of cars in a factory during a week is assumed to be a stochastic variable with an expectation value (mean) of 50 units, we can – based on nothing but the inequality – conclude that the probability that the production for a week is greater than 100 units cannot exceed 50% [P(X ≥ 100) ≤ 50/100 = 0.5 = 50%].
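The bound is easy to check by simulation. As a purely illustrative assumption, let weekly production follow an exponential distribution with mean 50 (any non-negative distribution with that mean would do); the empirical tail probability then stays comfortably under the Markov bound E(X)/a = 0.5:

```python
import numpy as np

rng = np.random.default_rng(seed=2012)
mean, a = 50.0, 100.0

# Illustrative model only: weekly production ~ exponential with mean 50.
sample = rng.exponential(scale=mean, size=1_000_000)

empirical = (sample >= a).mean()  # simulated P(X >= a)
bound = mean / a                  # Markov: P(X >= a) <= E(X)/a = 0.5

print(f"empirical tail probability is about {empirical:.3f}")  # close to exp(-2), roughly 0.135
print(f"Markov bound E(X)/a = {bound}")
assert empirical <= bound
```

The bound is loose here – Markov uses nothing but the mean – but it holds for every non-negative distribution with E(X) = 50, which is exactly its power.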

I still feel a humble awe at this immensely powerful result. Without knowing anything but the expected value (mean) of a probability distribution we can deduce upper limits for probabilities. The result strikes me as equally surprising today as thirty years ago, when I first ran into it as a student of mathematical statistics.

[For a derivation of the inequality, see e.g. Sheldon Ross, Introduction to Probability and Statistics for Engineers and Scientists, Academic Press, 2009, p. 129]
