Keynes vs. Samuelson on models

31 January, 2016 at 14:44 | Posted in Economics | 1 Comment

To his credit, Keynes was not, in contrast to Samuelson, a formalist committed to mathematical economics. Keynes wanted models, but for him, building them required ‘a vigilant observation of the actual working of our system.’ Indeed, ‘to convert a model into a quantitative formula is to destroy its usefulness as an instrument of thought.’ That conclusion can be strongly endorsed!


Economics — still in the land of Mordor

30 January, 2016 at 12:00 | Posted in Economics | Comments Off on Economics — still in the land of Mordor

When it comes to my economics training, I’m a late bloomer. My primary training is in evolutionary theory, which I have used as a navigational guide to study many human-related topics, such as religion. But I didn’t tackle economics until 2008 …

At the time I had no way to answer this question. Economic jargon mystified me—an embarrassing confession, since I am fully at home with mathematical and computer simulation models. Economists were very smart, very powerful, and they spoke a language that I didn’t understand. They won Nobel Prizes.

Nevertheless, I had faith that evolution could say something important about the regulatory systems that economists preside over, even if I did not yet know the details …

Fortunately, I had a Fellowship of the Ring to rely upon … Some of my closest colleagues are highly respected economists, Herbert Gintis, Samuel Bowles, and Ernst Fehr …

I already knew from their work that the main body of modern economics, called neoclassical economics, was being challenged by a new school of thought called experimental and behavioural economics …

I was disappointed. My colleagues such as Herb, Sam, and Ernst confirmed my own impression: They appreciated the relevance of evolution but were a tiny minority among behavioral and experimental economists, who in turn were a tiny minority among neoclassical economists …

The more I learned about economics, the more I discovered a landscape that is surpassingly strange. Like the land of Mordor, it is dominated by a single theoretical edifice that arose like a volcano early in the 20th century and still dominates the landscape. The edifice is based upon a conception of human nature that is profoundly false, defying the dictates of common sense, before we even get to the more refined dictates of psychology and evolutionary theory. Yet, efforts to move the theory in the direction of common sense are stubbornly resisted.

David Sloan Wilson

[h/t Tom Hickey]

Good advice

30 January, 2016 at 11:31 | Posted in Varia | Comments Off on Good advice

‘If you really want something, you have to be prepared to work very hard, take advantage of opportunity, and above all — never give up.’

[h/t Ulrika Hall]

At the age of thirty-seven

29 January, 2016 at 21:34 | Posted in Varia | Comments Off on At the age of thirty-seven


Still absolutely breathtakingly great!

LOGIC of science vs. METHODS of science

29 January, 2016 at 17:19 | Posted in Theory of Science & Methodology | Comments Off on LOGIC of science vs. METHODS of science


Manfred Mann

29 January, 2016 at 09:15 | Posted in Varia | Comments Off on Manfred Mann


Against multiple regression analysis

28 January, 2016 at 18:35 | Posted in Statistics & Econometrics | 2 Comments

Distinguished social psychologist Richard E. Nisbett has a somewhat atypical aversion to multiple regression analysis. In his Intelligence and How to Get It (Norton 2011) he wrote (p. 17):

Researchers often determine the individual’s contemporary IQ or IQ earlier in life, socioeconomic status of the family of origin, living circumstances when the individual was a child, number of siblings, whether the family had a library card, educational attainment of the individual, and other variables, and put all of them into a multiple-regression equation predicting adult socioeconomic status or income or social pathology or whatever. Researchers then report the magnitude of the contribution of each of the variables in the regression equation, net of all the others (that is, holding constant all the others). It always turns out that IQ, net of all the other variables, is important to outcomes. But … the independent variables pose a tangle of causality – with some causing others in goodness-knows-what ways and some being caused by unknown variables that have not even been measured. Higher socioeconomic status of parents is related to educational attainment of the child, but higher-socioeconomic-status parents have higher IQs, and this affects both the genes that the child has and the emphasis that the parents are likely to place on education and the quality of the parenting with respect to encouragement of intellectual skills and so on. So statements such as “IQ accounts for X percent of the variation in occupational attainment” are built on the shakiest of statistical foundations. What nature hath joined together, multiple regressions cannot put asunder.
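Nisbett’s ‘tangle of causality’ is easy to demonstrate with a toy simulation (my own sketch, not from Nisbett’s book; all variable names are hypothetical): when an unmeasured common cause drives both the ‘IQ’ regressor and the outcome, least squares reports a sizeable ‘net’ coefficient on IQ even though IQ has no causal effect at all in the data-generating process.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Unmeasured common cause (standing in for, e.g., parental
# socioeconomic status).
ses = rng.normal(size=n)

# 'IQ' is driven by the common cause plus noise; it has NO direct
# effect on the outcome in this data-generating process.
iq = ses + rng.normal(size=n)

# The outcome is driven only by the common cause.
outcome = 2.0 * ses + rng.normal(size=n)

# Regress the outcome on IQ alone (with an intercept), omitting ses.
X = np.column_stack([np.ones(n), iq])
beta = np.linalg.lstsq(X, outcome, rcond=None)[0]

# The fitted IQ coefficient comes out close to 1.0, pure confounding:
# the true causal effect of IQ here is 0.
print(beta[1])
```

The point is Nisbett’s: ‘holding constant’ the measured variables says nothing about the variables that were never measured at all.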

And now he is back with a half-hour lecture — The Crusade Against Multiple Regression Analysis — posted on The Edge website a week ago (watch the lecture here).

Now, I think that what Nisbett says is right as far as it goes, although it would certainly have strengthened his argumentation if he had elaborated more on the methodological questions around causality, or at least given some mathematical-statistical-econometric references. Unfortunately, his alternative approach is no more convincing than regression analysis. Like so many other contemporary social scientists, Nisbett seems to think that randomization may solve the empirical problem. By randomizing we get different “populations” that are homogeneous with regard to all variables except the one we think is a genuine cause. In this way we are supposedly able to do without actually knowing what all these other factors are.

If you succeed in performing an ideal randomization with treatment and control groups, that is attainable. But it presupposes that you really have established – and not just assumed – that all causes other than the putative one have the same probability distribution in the treatment and control groups, and that assignment to treatment or control is independent of all other possible causal variables.

Unfortunately, real experiments and real randomizations seldom or never achieve this. So, yes, we may do without knowing all causes, but it takes ideal experiments and ideal randomizations to do that, not real ones.
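The gap between ideal and real randomization can be made concrete with a small simulation (a hypothetical sketch of my own): random assignment balances an unmeasured covariate only in expectation, so a single small experiment can easily leave treatment and control groups badly imbalanced on a variable nobody measured.

```python
import numpy as np

rng = np.random.default_rng(1)

def covariate_imbalance(n):
    """Randomly assign n units to treatment/control and return the
    difference in group means of an unmeasured covariate."""
    covariate = rng.normal(size=n)
    treated = rng.permutation(n) < n // 2
    return covariate[treated].mean() - covariate[~treated].mean()

# Across 1000 small 'real' experiments (20 subjects each), the worst
# imbalance on the unmeasured covariate is substantial ...
worst_small = max(abs(covariate_imbalance(20)) for _ in range(1000))

# ... while a single huge, 'ideal-like' randomization washes it out.
large = abs(covariate_imbalance(1_000_000))

print(worst_small, large)
```

Any one of those small experiments might be the one you actually ran, which is why a single real randomization gives no guarantee of homogeneous groups.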

As I have argued — e.g. here — this means that in practice we have to have sufficient background knowledge to deduce causal knowledge. Without old knowledge we can’t get new knowledge – and, no causes in, no causes out.

Nisbett is well worth reading and listening to, but on the issue of the shortcomings of multiple regression analysis, no one sums it up better than eminent mathematical statistician David Freedman in his Statistical Models and Causal Inference:

If the assumptions of a model are not derived from theory, and if predictions are not tested against reality, then deductions from the model must be quite shaky. However, without the model, the data cannot be used to answer the research question …

In my view, regression models are not a particularly good way of doing empirical work in the social sciences today, because the technique depends on knowledge that we do not have. Investigators who use the technique are not paying adequate attention to the connection – if any – between the models and the phenomena they are studying. Their conclusions may be valid for the computer code they have created, but the claims are hard to transfer from that microcosm to the larger world …

Regression models often seem to be used to compensate for problems in measurement, data collection, and study design. By the time the models are deployed, the scientific position is nearly hopeless. Reliance on models in such cases is Panglossian …

Given the limits to present knowledge, I doubt that models can be rescued by technical fixes. Arguments about the theoretical merit of regression or the asymptotic behavior of specification tests for picking one version of a model over another seem like the arguments about how to build desalination plants with cold fusion as the energy source. The concept may be admirable, the technical details may be fascinating, but thirsty people should look elsewhere …

Causal inference from observational data presents many difficulties, especially when underlying mechanisms are poorly understood. There is a natural desire to substitute intellectual capital for labor, and an equally natural preference for system and rigor over methods that seem more haphazard. These are possible explanations for the current popularity of statistical models.

Indeed, far-reaching claims have been made for the superiority of a quantitative template that depends on modeling – by those who manage to ignore the far-reaching assumptions behind the models. However, the assumptions often turn out to be unsupported by the data. If so, the rigor of advanced quantitative methods is a matter of appearance rather than substance.

The force from above cleaning my soul

28 January, 2016 at 09:30 | Posted in Varia | Comments Off on The force from above cleaning my soul


Krugman — a Vichy Left coward?

27 January, 2016 at 23:53 | Posted in Politics & Society | Comments Off on Krugman — a Vichy Left coward?

Paul Krugman’s recent posts have been most peculiar. Several have looked uncomfortably like special pleading for political figures he likes, notably Hillary Clinton. He has, in my judgement, stooped rather far down in attacking people well below him in the public relations food chain …

Perhaps the most egregious and clearest cut case is his refusal to address the substance of a completely legitimate, well-documented article by David Dayen outing Krugman, and to a lesser degree, his fellow traveler Mike Konczal, in abjectly misrepresenting Sanders’ financial reform proposals …

The Krugman that was early to stand up to the Iraq War, who was incisive before and during the crisis has been very much in absence since Obama took office. It’s hard to understand the loss of intellectual independence. That may not make Krugman any worse than other Democratic party apparatchiks, but he continues to believe he is other than that, and the lashing out at Dayen looks like a wounded denial of his current role. Krugman and Konczal need to be seen as what they are: part of the Vichy Left brand cover for the Democratic party messaging apparatus. Krugman, sadly, has chosen to diminish himself for a not very worthy cause.

Yves Smith/Naked Capitalism

Thatcher policies for dummies

27 January, 2016 at 15:47 | Posted in Politics & Society | Comments Off on Thatcher policies for dummies



What is Post Keynesian economics?

27 January, 2016 at 11:05 | Posted in Economics | Comments Off on What is Post Keynesian economics?


Deduction — induction — abduction

25 January, 2016 at 14:42 | Posted in Theory of Science & Methodology | 3 Comments



In science – and economics – one could argue that there basically are three kinds of argumentation patterns/schemes/methods/strategies available:


Premise 1: All Chicago economists believe in REH
Premise 2: Robert Lucas is a Chicago economist
Conclusion: Robert Lucas believes in REH

Here we have an example of a logically valid deductive inference (and, following Quine, whenever logic is used in this essay, ‘logic’ refers to deductive/analytical logic).

In a hypothetico-deductive reasoning — hypothetico-deductive confirmation in this case — we would use the conclusion to test the law-like hypothesis in premise 1 (according to the hypothetico-deductive model, a hypothesis is confirmed by evidence if the evidence is deducible from the hypothesis). If Robert Lucas does not believe in REH we have gained some warranted reason for non-acceptance of the hypothesis (an obvious shortcoming here being that further information beyond that given in the explicit premises might have given another conclusion).

The hypothetico-deductive method (in case we treat the hypothesis as absolutely sure/true, we rather talk of an axiomatic-deductive method) basically means that we

• Posit a hypothesis
• Infer empirically testable propositions (consequences) from it
• Test the propositions through observation or experiment
• Depending on the testing results, either find the hypothesis corroborated or falsified

However, in science we regularly use a kind of ‘practical’ argumentation where there is little room for applying the restricted logical ‘formal transformations’ view of validity and inference. Most people would probably accept the following argument as ‘valid’ reasoning even though, from a strictly logical point of view, it is invalid:

Premise 1: Robert Lucas is a Chicago economist
Premise 2: The recorded proportion of Keynesian Chicago economists is zero
Conclusion: So, certainly, Robert Lucas is not a Keynesian economist

How come? Well, I guess one reason is that in science, contrary to what you find in most logic textbooks, not very many argumentations are settled by showing that ‘All Xs are Ys.’ In scientific practice we instead present other-than-analytical explicit warrants and backings — data, experience, evidence, theories, models — for our inferences. As long as we can show that our ‘deductions’ or ‘inferences’ are justifiable and have well-backed warrants, our colleagues listen to us. That our scientific ‘deductions’ or ‘inferences’ are logical non-entailments is simply not a problem. To think otherwise is to commit the fallacy of misapplying formal-analytical logic categories to areas where they are pretty much irrelevant or simply beside the point.

Scientific arguments are not analytical arguments, where validity is solely a question of formal properties. Scientific arguments are substantial arguments. Whether Robert Lucas is a Keynesian or not is nothing we can decide on the formal properties of statements/propositions. We have to check what the guy has actually been writing and saying to find out whether the hypothesis that he is a Keynesian is true or not.

In a deductive-nomological explanation — also known as a covering law explanation — we would try to explain why Robert Lucas believes in REH with the help of the two premises (in this case actually giving an explanation with very little explanatory value). These kinds of explanations — both in their deterministic and statistical/probabilistic versions — rely heavily on deductive entailment from premises assumed to be true. But they have preciously little to say on where those premises come from.

Deductive logics of confirmation and explanation may work well — given that they are used in deterministic closed models! In mathematics, the deductive-axiomatic method has worked just fine. But science is not mathematics. Conflating those two domains of knowledge has been one of the most fundamental mistakes made in the science of economics. Applied to real-world systems, the method immediately proves excessively narrow and hopelessly irrelevant. Both the confirmatory and the explanatory ilk of hypothetico-deductive reasoning fail, since there is no way you can relevantly analyze confirmation or explanation as a purely logical relation between hypothesis and evidence or between law-like rules and explananda. In science we argue and try to substantiate our beliefs and hypotheses with reliable evidence — propositional and predicate deductive logic, on the other hand, is not about reliability, but about the validity of the conclusions given that the premises are true.

Deduction — and the inferences that go with it — is an example of ‘explicative reasoning,’ where the conclusions we draw are already included in the premises. Deductive inferences are purely analytical, and it is this truth-preserving nature of deduction that makes it different from all other kinds of reasoning. But it is also its limitation, since truth in the deductive context does not refer to a real-world ontology (it only relates propositions as true or false within a formal-logic system) and as an argument scheme it is totally non-ampliative — the output of the analysis is nothing other than the input.
Continue Reading Deduction — induction — abduction…

‘New Keynesian’ DSGE models

24 January, 2016 at 10:28 | Posted in Economics | 2 Comments

In the model [Gali, Smets and Wouters, Unemployment in an Estimated New Keynesian Model (2011)] there is perfect consumption insurance among the members of the household. Because of separability in utility, this implies that consumption is equalized across all workers, whether they are employed or not … Workers who find that they do not have to work are unemployed or out of the labor force, and they have cause to rejoice as a result. Unemployed workers enjoy higher utility than the employed because they receive the same level of consumption, but without having to work.

There is much evidence that in practice unemployment is not the happy experience it is for workers in the model.  For example, Chetty and Looney (2006) and Gruber (1997) find that US households suffer roughly a 10 percent drop in consumption when they lose their job. According to Couch and Placzek (2010), workers displaced through mass layoffs suffer substantial and extended reductions in earnings. Moreover, Oreopoulos, Page and Stevens (2008) present evidence that the children of displaced workers also suffer reduced earnings. Additional evidence that unemployed workers suffer a reduction in utility include the results of direct interviews, as well as findings that unemployed workers experience poor health outcomes. Clark and Oswald (1994), Oswald (1997) and Schimmack, Schupp and Wagner (2008) describe evidence that suggests unemployment has a negative impact on a worker’s self-assessment of well being. Sullivan and von Wachter (2009) report that the mortality rates of high-seniority workers jump 50-100% more than would have been expected otherwise in the year after displacement. Cox and Koo (2006) report a significant positive correlation between male suicide and unemployment in Japan and the United States. For additional evidence that unemployment is associated with poor health outcomes, see Fergusson, Horwood and Lynskey (1997) and Karsten and Moser (2009) …

Suppose the CPS [Current Population Survey] employee encountered one of the people designated as “unemployed” … and asked if she were “available for work”. What would her answer be? She knows with certainty that she will not be employed in the current period. Privately, she is delighted about this because the non-employed enjoy higher utility than the employed … Not only is she happy about not having to work, but the labor union also does not want her to work. From the perspective of the union, her non-employment is a fundamental component of the union’s strategy for promoting the welfare of its membership.

Lawrence J. Christiano

To me these kinds of “New Keynesian” DSGE models, where unemployment is portrayed as bliss, are a sign of a momentous failure to model real-world unemployment. It’s not only adding insult to injury — it’s also sad gibberish that shamelessly tries to whitewash neoliberal economic policies that put people out of work.

All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

Theories are difficult to directly confront with reality. Economists therefore build models of their theories. Those models are representations that are directly examined and manipulated to indirectly say something about the target systems.

But models do not only face theory. They also have to look to the world. Being able to model a “credible” DSGE world — how credible that world is, when it depicts unemployment as a “happy experience” and predicts the wage markup to increase with unemployment, I leave to the reader to decide — a world that somehow could be considered real or similar to the real world, is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still possibly serve our pursuit of truth. But then they cannot be unrealistic or false in just any way. The falsehood or unrealisticness has to be qualified.

The final court of appeal for macroeconomic models is the real world — and as long as no convincing justification is put forward for how the inferential bridging from models to reality de facto is made, macroeconomic model building is little more than “hand waving” that gives us rather little warrant for making inductive inferences from models to real world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

On the non-neutrality of money

23 January, 2016 at 14:29 | Posted in Economics | 1 Comment

Paul Krugman has repeatedly over the years argued that we should continue to use neoclassical hobby horses like IS-LM and Aggregate Supply-Aggregate Demand models. Here’s one example:

So why do AS-AD? … We do want, somewhere along the way, to get across the notion of the self-correcting economy, the notion that in the long run, we may all be dead, but that we also have a tendency to return to full employment via price flexibility. Or to put it differently, you do want somehow to make clear the notion (which even fairly Keynesian guys like me share) that money is neutral in the long run.

I doubt that Keynes would have been impressed by having his theory characterized with catchwords like “tendency to return to full employment” and “money is neutral in the long run.”


One of Keynes’s central tenets — in clear contradistinction to the beliefs of neoclassical economists — is that there is no strong automatic tendency for economies to move toward full employment levels in monetary economies.

Money doesn’t matter in neoclassical macroeconomic models. That’s true. But in the real world in which we happen to live, money does certainly matter. Money is not neutral and money matters in both the short run and the long run:

The theory which I desiderate would deal … with an economy in which money plays a part of its own and affects motives and decisions, and is, in short, one of the operative factors in the situation, so that the course of events cannot be predicted in either the long period or in the short, without a knowledge of the behaviour of money between the first state and the last. And it is this which we ought to mean when we speak of a monetary economy.

J. M. Keynes A monetary theory of production (1933)

Additivity — a dangerous assumption

21 January, 2016 at 20:28 | Posted in Economics | 1 Comment

The unpopularity of the principle of organic unities shows very clearly how great is the danger of the assumption of unproved additive formulas. The fallacy, of which ignorance of organic unity is a particular instance, may perhaps be mathematically represented thus: suppose f(x) is the goodness of x and f(y) is the goodness of y. It is then assumed that the goodness of x and y together is f(x) + f(y) when it is clearly f(x + y) and only in special cases will it be true that f(x + y) = f(x) + f(y). It is plain that it is never legitimate to assume this property in the case of any given function without proof.

J. M. Keynes “Ethics in Relation to Conduct” (1903) 

Since econometrics doesn’t content itself with only making optimal predictions, but also aspires to explain things in terms of causes and effects, econometricians need loads of assumptions — the most important of which are additivity and linearity. Important, simply because if they are not true, your model is invalid and descriptively incorrect. It’s like calling your house a bicycle. No matter how you try, it won’t move you an inch. When the model is wrong — well, then it’s wrong.
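Keynes’s point is trivial to check numerically (my own illustration, with an arbitrary toy function): unless f happens to be linear, f(x + y) and f(x) + f(y) come apart, so assuming additivity without proof really is illegitimate.

```python
def f(x):
    """A toy 'goodness' function; any nonlinear choice makes the point."""
    return x ** 2

x, y = 2.0, 3.0

joint = f(x + y)        # goodness of x and y together: f(5.0) = 25.0
additive = f(x) + f(y)  # the unproved additive formula: 4.0 + 9.0 = 13.0

# Only in special cases (e.g. linear f) does f(x + y) == f(x) + f(y).
print(joint, additive)
```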

Fadime Sahindal

21 January, 2016 at 09:02 | Posted in Politics & Society | Comments Off on Fadime Sahindal
To Fadime Sahindal, born 2 April 1975 in Turkey, murdered 21 January 2002 in Sweden

In Sweden we have long uncritically embraced an unspecified and undefined multiculturalism. If by multiculturalism we mean that cultural belonging and identity also carry specific moral, ethical and political rights and obligations, we are talking about normative multiculturalism. To accept normative multiculturalism is also to tolerate unacceptable intolerance, since normative multiculturalism implies that the rights of specific cultural groups may be given higher dignity than the universal human rights of the individual citizen – and thereby indirectly becomes a defence of those groups’ intolerance.

Normative multiculturalism means that individuals are unacceptably reduced to passive members of culture-bearing or identity-bearing groups. But those who in our society show that they do not respect other people’s rights cannot expect us to be tolerant towards them.

Towards those in our society who want to force others to live according to their own religious, cultural or ideological beliefs and taboos, society must be intolerant. Towards those who want to force society to adapt its laws and rules to their own religion’s, culture’s or group’s interpretations, society must be intolerant.


The dead shall not be silent but speak.
Scattered torment shall find its voice,
and when the rats of the cells and the rifle-butts of the murderers
have turned to ashes and age-old dust,
the comet’s parabola and the gamble of the stars
shall still bear witness to those who fell against their wall:
washed in fire but not burned down to embers,
trampled, beaten, but without a wound on their bodies,
and eyes that stared in terror shall open in peace,
and the dead shall not be silent but speak.

Of the dead there shall not be silence but speech.
Though maimed and strangled in the cell of power,
glassy-eyed, derided in cynical waiting rooms
where death has pasted up its propaganda of peace,
they shall rest long in the showcases of conscience,
embalmed by truth and washed in fire,
and those who have already fallen shall not be broken,
and he who begged for mercy in a moment of oblivion
shall rise and bear witness to that which cannot be broken,
for the dead shall not be silent but speak.

No, the dead shall not be silent but speak.
Those who felt triumph on their necks shall lift their heads,
and those who were choked by smoke shall see clearly,
those who were tortured into madness shall flow like springs,
those who fell by their opposite shall themselves fell,
those who were slain with lead shall slay with fire,
those who were hurled down by waves shall themselves become storm.
And the dead shall not be silent but speak.

                                           Erik Lindegren


Sacrifice

20 January, 2016 at 17:27 | Posted in Varia | Comments Off on Sacrifice


‘Deep parameters’ and microfoundations

20 January, 2016 at 10:31 | Posted in Economics | Comments Off on ‘Deep parameters’ and microfoundations

In a post last week, Simon Wren-Lewis discussed whether modern academic macroeconomics is eclectic or not. When it comes to methodology, his conclusion seems to be that it is not:

The New Classical Counter Revolution of the 1970s and 1980s … was primarily a revolution about methodology, about arguing that all models should be microfounded, and in terms of mainstream macro it was completely successful. It also tried to link this to a revolution about policy, about overthrowing Keynesian economics, and this ultimately failed. But perhaps as a result, methodology and policy get confused. Mainstream academic macro is very eclectic in the range of policy questions it can address, and conclusions it can arrive at, but in terms of methodology it is quite the opposite.

In an earlier post he elaborated on why the New Classical Counterrevolution was so successful in replacing older theories, despite the fact that the New Classical models weren’t able to explain what happened to output and inflation in the 1970s and 1980s:

The new theoretical ideas New Classical economists brought to the table were impressive, particularly to those just schooled in graduate micro. Rational expectations is the clearest example …

However, once the basics of New Keynesian theory had been established, it was quite possible to incorporate concepts like rational expectations or Ricardian Equivalence into a traditional structural econometric model (SEM) …

The real problem with any attempt at synthesis is that a SEM is always going to be vulnerable to the key criticism in Lucas and Sargent, 1979: without a completely consistent microfounded theoretical base, there was the near certainty of inconsistency brought about by inappropriate identification restrictions …

So why does this matter? … If mainstream academic macroeconomists were seduced by anything, it was a methodology – a way of doing the subject which appeared closer to what at least some of their microeconomic colleagues were doing at the time, and which was very different to the methodology of macroeconomics before the New Classical Counterrevolution. The old methodology was eclectic and messy, juggling the competing claims of data and theory. The new methodology was rigorous!

Wren-Lewis seems to be impressed by the ‘rigour’ brought to macroeconomics by the New Classical counterrevolution and its rational expectations, microfoundations and ‘Lucas Critique’.

I fail to see why.

Wren-Lewis’s ‘portrayal’ of rational expectations is not as innocent as it may look. Rational expectations in the neoclassical economists’ world implies that relevant distributions have to be time independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality it strains one’s beliefs to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons. In the world of the rational expectations hypothesis we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that is also the reason why allowing for cases where agents ‘make predictable errors’ in the New Keynesian models doesn’t take us any closer to a relevant and realist depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say about real economies, financial crises and the decisions and choices real people make, we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations than childish roulette and urn analogies.

The predominant strategy in mainstream macroeconomics today is to build models and make things happen in these ‘analogue-economy models.’ But although macro-econometrics may have supplied economists with rigorous replicas of real economies, if the goal of theory is to be able to make accurate forecasts or explain what happens in real economies, this ability to — ad nauseam — construct toy models does not give much leverage.

‘Rigorous’ and ‘precise’ New Classical models — and that goes for the ‘New Keynesian’ variety too — cannot be considered anything other than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence of any kind has been presented.

And applying a ‘Lucas critique’ to New Classical and ‘New Keynesian’ models, it is obvious that they too fail.

Changing ‘policy rules’ cannot simply be presumed not to influence investment and consumption behaviour, and a fortiori technology, thereby contradicting the invariance assumption. Technology and tastes cannot live up to the status of an economy’s deep and structurally stable Holy Grail. They too are part and parcel of an ever-changing and open economy. Lucas’s hope of being able to model the economy as ‘a FORTRAN program’ and ‘gain some confidence that the component parts of the program are in some sense reliable prior to running it’ therefore seems — from an ontological point of view — totally misdirected. The failure of the attempt to anchor the analysis in the allegedly stable deep parameters ‘tastes’ and ‘technology’ shows that if you neglect ontological considerations pertaining to the target system, reality ultimately gets its revenge when questions of bridging and exporting model exercises are at last laid on the table.


People like Dani Rodrik and Simon Wren-Lewis are proud of having an ever-growing smorgasbord of models to cherry-pick from (as long as, of course, the models do not question the standard modelling strategy) when performing their analyses. The ‘rigorous’ and ‘precise’ deductions made in these closed models, however, are not in any way matched by a similar stringency or precision when it comes to what ought to be the most important stage of any research — making statements about, and explaining things in, real economies. Although almost every mainstream economist holds the view that thought-experimental modelling has to be followed by confronting the models with reality — which is what they indirectly want to predict, explain or understand using their models — at that stage they suddenly become exceedingly vague and imprecise. It is as if all the intellectual force has been invested in the modelling stage and nothing is left for what really matters: what exactly these models teach us about real economies.

No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, it does not push economic science forward a single millimetre if it does not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not per se say anything about real-world economies.

Proving things ‘rigorously’ in mathematical models is at most a starting-point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.

Axiomatics — the economics fetish

18 January, 2016 at 20:40 | Posted in Theory of Science & Methodology | 4 Comments

Mainstream — neoclassical — economics has become increasingly irrelevant to the understanding of the real world. The main reason for this irrelevance is the failure of economists to match their deductive-axiomatic methods with their subject.

The idea that a good scientific theory must be derived from a formal axiomatic system has little if any foundation in the methodology or history of science. Nevertheless, it has become almost an article of faith in modern economics. I am not aware, but would be interested to know, whether, and if so how widely, this misunderstanding has been propagated in other (purportedly) empirical disciplines. The requirement of the axiomatic method in economics betrays a kind of snobbishness and (I use this word advisedly, see below) pedantry, resulting, it seems, from a misunderstanding of good scientific practice …

This doesn’t mean that trying to achieve a reduction of a higher-level discipline to another, deeper discipline is not a worthy objective, but it certainly does mean that one cannot just dismiss, out of hand, a discipline simply because all of its propositions are not deducible from some set of fundamental propositions. Insisting on reduction as a prerequisite for scientific legitimacy is not a scientific attitude; it is merely a form of obscurantism …

The fetish for axiomatization in economics can largely be traced to Gerard Debreu’s great work, The Theory of Value: An Axiomatic Analysis of Economic Equilibrium … The subsequent work was then brilliantly summarized and extended in another great work, General Competitive Analysis by Arrow and Frank Hahn. Unfortunately, those two books, paragons of the axiomatic method, set a bad example for the future development of economic theory, which embarked on a needless and counterproductive quest for increasing logical rigor instead of empirical relevance …

I think that it is important to understand that there is simply no scientific justification for the highly formalistic manner in which much modern economics is now carried out. Of course, other far more authoritative critics than I, like Mark Blaug and Richard Lipsey have complained about the insistence of modern macroeconomics on microfounded, axiomatized models regardless of whether those models generate better predictions than competing models. Their complaints have regrettably been ignored for the most part. I simply want to point out that a recent, and in many ways admirable, introduction to modern macroeconomics failed to provide a coherent justification for insisting on axiomatized models. It really wasn’t the author’s fault; a coherent justification doesn’t exist.

David Glasner

It is — sad to say — a fact that within mainstream economics internal validity is everything and external validity nothing. Why anyone should be interested in that kind of theories and models — as long as mainstream economists do not come up with any export licenses for their theories and models to the real world in which we live — is beyond my imagination. Sure, the simplicity that axiomatics and analytical arguments bring to economics is attractive to many economists. But …

Simplicity, however, has its perils. It is one thing to choose as one’s first object of theoretical study the type of arguments open to analysis in the simplest terms. But it is quite another to treat this type of argument as a paradigm and to demand that arguments in other fields should conform to its standards regardless, or build up from a study of the simplest forms of argument alone a set of categories intended for application to arguments of all sorts: one must at any rate begin by inquiring carefully how far the artificial simplicity of one’s chosen model results in these logical categories also being artificially simple. The sorts of risks one runs otherwise are obvious enough. Distinctions which all happen to cut along the same line for the simplest arguments may need to be handled quite separately in the general case; if we forget this, and our new-found logical categories yield paradoxical results when applied to more complex arguments, we may be tempted to put these down to defects in the arguments instead of in our categories; and we may end up by thinking that, for some regrettable reason hidden deep in the nature of things, only our original, peculiarly simple arguments are capable of attaining to the ideal of validity.

The gussied up economics of Tweedledum and Tweedledee

17 January, 2016 at 19:09 | Posted in Economics | 2 Comments

“Of course, there were exceptions to these trends: a few economists challenged the assumption of rational behavior, questioned the belief that financial markets can be trusted and pointed to the long history of financial crises that had devastating economic consequences. But they were swimming against the tide, unable to make much headway against a pervasive and, in retrospect, foolish complacency.” —Paul Krugman, New York Times Magazine, September 6, 2009

While normal ecclesiastic practice places this word at the end of the prayer, on this occasion it seems right to put it up front. In two sentences, Professor Paul Krugman … has summed up the failure of an entire era in economic thought, practice, and policy discussion.

And yet, there is something odd about the role of this short paragraph in an essay of over 6,500 words. It’s a throwaway. It leads nowhere …


Krugman’s entire essay is about two groups, both deeply entrenched at (what they believe to be) the top of academic economics. Both are deeply preoccupied with their status and with a struggle for influence and for academic power and prestige — against the other group. Krugman calls them “saltwater” and “freshwater” economists; they tend to call themselves “new classicals” and the “new Keynesians” — although one is not classical and the other is not Keynesian. One might speak of a “Chicago School” and an “MIT School” — after the graduate programs through which so many passed. In truth, there are no precise labels, because the differences between them are both secondary and obscure.

The two groups share a common perspective, a preference for thinking along similar lines. Krugman describes this well, as a “desire for an all-encompassing, intellectually elegant approach that also gave economists a chance to show off their mathematical prowess.” Exactly so. It was in part about elegance — and in part about showing off. It was not about … the economy. It was not a discussion of problems, risks, dangers, and policies. In consequence, the failure was shared by both groups. This is the extraordinary thing. Economics was not riven by a feud between Pangloss and Cassandra. It was all a chummy conversation between Tweedledum and Tweedledee. And if you didn’t think either Tweedle was worth much — well then, you weren’t really an economist, were you?

Professor Krugman contends that Tweedledum and Tweedledee “mistook beauty for truth.” The beauty in question was the “vision of capitalism as a perfect or nearly perfect system.” To be sure, the accusation that a scientist — let alone an entire science — was seduced by beauty over truth is fairly damaging. But it’s worth asking, what exactly was beautiful about this idea? Krugman doesn’t quite say. He does note that the mathematics used to describe the alleged perfection was “impressive-looking” — ”gussied up” as he says, “with fancy equations.” It’s a telling choice of words. “Impressive-looking”? “Gussied up”? These are not terms normally used to describe the Venus de Milo.

James K. Galbraith
