Keynes vs. Samuelson on models

31 January, 2016 at 14:44 | Posted in Economics | 1 Comment

To his credit Keynes was not, in contrast to Samuelson, a formalist who was committed to mathematical economics. Keynes wanted models, but for him, building them required 'a vigilant observation of the actual working of our system.' Indeed, 'to convert a model into a quantitative formula is to destroy its usefulness as an instrument of thought.' That conclusion can be strongly endorsed!
 


Economics — still in the land of Mordor

30 January, 2016 at 12:00 | Posted in Economics | Comments Off on Economics — still in the land of Mordor

When it comes to my economics training, I’m a late bloomer. My primary training is in evolutionary theory, which I have used as a navigational guide to study many human-related topics, such as religion. But I didn’t tackle economics until 2008 …

At the time I had no way to answer this question. Economic jargon mystified me—an embarrassing confession, since I am fully at home with mathematical and computer simulation models. Economists were very smart, very powerful, and they spoke a language that I didn’t understand. They won Nobel Prizes.

Nevertheless, I had faith that evolution could say something important about the regulatory systems that economists preside over, even if I did not yet know the details …

Fortunately, I had a Fellowship of the Ring to rely upon … Some of my closest colleagues are highly respected economists, Herbert Gintis, Samuel Bowles, and Ernst Fehr …

I already knew from their work that the main body of modern economics, called neoclassical economics, was being challenged by a new school of thought called experimental and behavioural economics …

I was disappointed. My colleagues such as Herb, Sam, and Ernst confirmed my own impression: They appreciated the relevance of evolution but were a tiny minority among behavioral and experimental economists, who in turn were a tiny minority among neoclassical economists …

The more I learned about economics, the more I discovered a landscape that is surpassingly strange. Like the land of Mordor, it is dominated by a single theoretical edifice that arose like a volcano early in the 20th century and still dominates the landscape. The edifice is based upon a conception of human nature that is profoundly false, defying the dictates of common sense, before we even get to the more refined dictates of psychology and evolutionary theory. Yet, efforts to move the theory in the direction of common sense are stubbornly resisted.

David Sloan Wilson

[h/t Tom Hickey]

Good advice

30 January, 2016 at 11:31 | Posted in Varia | Comments Off on Good advice

‘If you really want something, you have to be prepared to work very hard, take advantage of opportunity, and above all — never give up.’
 

[h/t Ulrika Hall]

At the age of thirty-seven

29 January, 2016 at 21:34 | Posted in Varia | Comments Off on At the age of thirty-seven

 

Still absolutely breathtakingly great!

LOGIC of science vs. METHODS of science

29 January, 2016 at 17:19 | Posted in Theory of Science & Methodology | Comments Off on LOGIC of science vs. METHODS of science

 

Manfred Mann

29 January, 2016 at 09:15 | Posted in Varia | Comments Off on Manfred Mann

 

Against multiple regression analysis

28 January, 2016 at 18:35 | Posted in Statistics & Econometrics | 2 Comments

Distinguished social psychologist Richard E. Nisbett has a somewhat atypical aversion to multiple regression analysis. In his Intelligence and How to Get It (Norton 2011) he wrote (p. 17):

Researchers often determine the individual’s contemporary IQ or IQ earlier in life, socioeconomic status of the family of origin, living circumstances when the individual was a child, number of siblings, whether the family had a library card, educational attainment of the individual, and other variables, and put all of them into a multiple-regression equation predicting adult socioeconomic status or income or social pathology or whatever. Researchers then report the magnitude of the contribution of each of the variables in the regression equation, net of all the others (that is, holding constant all the others). It always turns out that IQ, net of all the other variables, is important to outcomes. But … the independent variables pose a tangle of causality – with some causing others in goodness-knows-what ways and some being caused by unknown variables that have not even been measured. Higher socioeconomic status of parents is related to educational attainment of the child, but higher-socioeconomic-status parents have higher IQs, and this affects both the genes that the child has and the emphasis that the parents are likely to place on education and the quality of the parenting with respect to encouragement of intellectual skills and so on. So statements such as “IQ accounts for X percent of the variation in occupational attainment” are built on the shakiest of statistical foundations. What nature hath joined together, multiple regressions cannot put asunder.

And now he is back with a half-hour lecture — The Crusade Against Multiple Regression Analysis — posted on The Edge website a week ago (watch the lecture here).

Now, I think that what Nisbett says is right as far as it goes, although his argument would certainly have been strengthened had he elaborated more on the methodological questions surrounding causality, or at least given some mathematical-statistical-econometric references. Unfortunately, his alternative approach is no more convincing than regression analysis. Like so many other contemporary social scientists, Nisbett seems to think that randomization may solve the empirical problem. By randomizing we get different “populations” that are homogeneous with regard to all variables except the one we think is a genuine cause. In this way we are supposed to be able to do without actually knowing what all these other factors are.

If you succeed in performing an ideal randomization with different treatment and control groups, that may be attainable. But it presupposes that you really have been able to establish – and not just assume – that all causes other than the putative one have the same probability distribution in the treatment and control groups, and that assignment to treatment or control is independent of all other possible causal variables.

Unfortunately, real experiments and real randomizations seldom or never achieve this. So, yes, we may be able to do without knowing all causes — but it takes ideal experiments and ideal randomizations to do that, not real ones.

As I have argued — e.g. here — that means that in practice we do have to have sufficient background knowledge to deduce causal knowledge. Without old knowledge, we can’t get new knowledge – and, no causes in, no causes out.
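Nisbett’s ‘tangle of causality’ point is easy to reproduce in a small simulation. The following sketch is my own illustration (variable names and numbers are invented, not taken from his book): an unmeasured common cause drives both the regressor and the outcome, and the regression dutifully reports a sizeable ‘net contribution’ for a variable that, by construction, has no direct effect at all.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical data-generating process: an unmeasured factor (say,
# parental resources) drives BOTH the regressor of interest ('iq')
# and the outcome ('income'); 'iq' itself has no direct effect.
unmeasured = rng.normal(size=n)
iq = 0.8 * unmeasured + rng.normal(scale=0.6, size=n)
income = 1.0 * unmeasured + rng.normal(scale=0.6, size=n)

# Regress income on iq alone: a sizeable 'effect' appears even though
# the true direct effect of iq is zero by construction.
X = np.column_stack([np.ones(n), iq])
beta, *_ = np.linalg.lstsq(X, income, rcond=None)
print(f"estimated 'net effect' of iq: {beta[1]:.2f}")  # ~0.8, not 0

# Only by conditioning on the normally unobserved confounder does the
# spurious coefficient vanish -- but that is exactly the knowledge we
# usually do not have.
X2 = np.column_stack([np.ones(n), iq, unmeasured])
beta2, *_ = np.linalg.lstsq(X2, income, rcond=None)
print(f"iq coefficient given the confounder: {beta2[1]:.2f}")  # ~0.0
```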

Nisbett is well worth reading and listening to, but on the issue of the shortcomings of multiple regression analysis, no one sums it up better than eminent mathematical statistician David Freedman in his Statistical Models and Causal Inference:

If the assumptions of a model are not derived from theory, and if predictions are not tested against reality, then deductions from the model must be quite shaky. However, without the model, the data cannot be used to answer the research question …

In my view, regression models are not a particularly good way of doing empirical work in the social sciences today, because the technique depends on knowledge that we do not have. Investigators who use the technique are not paying adequate attention to the connection – if any – between the models and the phenomena they are studying. Their conclusions may be valid for the computer code they have created, but the claims are hard to transfer from that microcosm to the larger world …

Regression models often seem to be used to compensate for problems in measurement, data collection, and study design. By the time the models are deployed, the scientific position is nearly hopeless. Reliance on models in such cases is Panglossian …

Given the limits to present knowledge, I doubt that models can be rescued by technical fixes. Arguments about the theoretical merit of regression or the asymptotic behavior of specification tests for picking one version of a model over another seem like arguments about how to build desalination plants with cold fusion as the energy source. The concept may be admirable, the technical details may be fascinating, but thirsty people should look elsewhere …

Causal inference from observational data presents many difficulties, especially when underlying mechanisms are poorly understood. There is a natural desire to substitute intellectual capital for labor, and an equally natural preference for system and rigor over methods that seem more haphazard. These are possible explanations for the current popularity of statistical models.

Indeed, far-reaching claims have been made for the superiority of a quantitative template that depends on modeling – by those who manage to ignore the far-reaching assumptions behind the models. However, the assumptions often turn out to be unsupported by the data. If so, the rigor of advanced quantitative methods is a matter of appearance rather than substance.

The force from above cleaning my soul

28 January, 2016 at 09:30 | Posted in Varia | Comments Off on The force from above cleaning my soul

 

Krugman — a Vichy Left coward?

27 January, 2016 at 23:53 | Posted in Politics & Society | Comments Off on Krugman — a Vichy Left coward?

Paul Krugman’s recent posts have been most peculiar. Several have looked uncomfortably like special pleading for political figures he likes, notably Hillary Clinton. He has, in my judgement, stooped rather far down in attacking people well below him in the public relations food chain …

Perhaps the most egregious and clearest cut case is his refusal to address the substance of a completely legitimate, well-documented article by David Dayen outing Krugman, and to a lesser degree, his fellow traveler Mike Konczal, in abjectly misrepresenting Sanders’ financial reform proposals …

The Krugman that was early to stand up to the Iraq War, who was incisive before and during the crisis, has been very much absent since Obama took office. It’s hard to understand the loss of intellectual independence. That may not make Krugman any worse than other Democratic party apparatchiks, but he continues to believe he is other than that, and the lashing out at Dayen looks like a wounded denial of his current role. Krugman and Konczal need to be seen as what they are: part of the Vichy Left brand cover for the Democratic party messaging apparatus. Krugman, sadly, has chosen to diminish himself for a not very worthy cause.

Yves Smith/Naked Capitalism

Thatcher policies for dummies

27 January, 2016 at 15:47 | Posted in Politics & Society | Comments Off on Thatcher policies for dummies


What is Post Keynesian economics?

27 January, 2016 at 11:05 | Posted in Economics | Comments Off on What is Post Keynesian economics?

 

Deduction — induction — abduction

25 January, 2016 at 14:42 | Posted in Theory of Science & Methodology | 3 Comments

 


In science – and economics – one could argue that there are basically three kinds of argumentation patterns/schemes/methods/strategies available:

Deduction

Premise 1: All Chicago economists believe in REH
Premise 2: Robert Lucas is a Chicago economist
—————————————————————–
Conclusion: Robert Lucas believes in REH

Here we have an example of a logically valid deductive inference (and, following Quine, whenever logic is used in this essay, ‘logic’ refers to deductive/analytical logic).

In hypothetico-deductive reasoning — hypothetico-deductive confirmation in this case — we would use the conclusion to test the law-like hypothesis in premise 1 (according to the hypothetico-deductive model, a hypothesis is confirmed by evidence if the evidence is deducible from the hypothesis). If Robert Lucas does not believe in REH we have gained some warranted reason for non-acceptance of the hypothesis (an obvious shortcoming here being that further information beyond that given in the explicit premises might have given another conclusion).

The hypothetico-deductive method (in case we treat the hypothesis as absolutely sure/true, we rather talk of an axiomatic-deductive method) basically means that we

•Posit a hypothesis
•Infer empirically testable propositions (consequences) from it
•Test the propositions through observation or experiment
•Depending on the test results, either find the hypothesis corroborated or falsified (a minimal sketch of this loop follows below).
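A minimal sketch of that loop, using the Lucas/REH example above (the ‘observations’ are invented purely for illustration):

```python
# Hypothetico-deductive testing in miniature. The hypothesis H is the
# law-like premise from the deduction example; the data are made up.

# Step 1: posit a law-like hypothesis.
# H: all Chicago economists believe in REH.

# Step 2: deduce a testable consequence: every observed Chicago
# economist should turn out to be an REH believer.

# Step 3: confront the consequence with observation.
observed_chicago_economists = {
    "Robert Lucas": True,   # True = recorded as believing in REH
    "Economist B": True,    # hypothetical records
    "Economist C": True,
}

# Step 4: corroborate or falsify. Note the asymmetry: a single
# counterexample falsifies H, while no number of confirmations
# ever proves it.
counterexamples = [name for name, believes_reh
                   in observed_chicago_economists.items() if not believes_reh]

if counterexamples:
    print(f"H falsified by: {', '.join(counterexamples)}")
else:
    print("H corroborated so far, but not proven.")
```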

However, in science we regularly use a kind of ‘practical’ argumentation where there is little room for applying the restricted logical ‘formal transformations’ view of validity and inference. Most people would probably accept the following argument as ‘valid’ reasoning even though, from a strictly logical point of view, it is non-valid:

Premise 1: Robert Lucas is a Chicago economist
Premise 2: The recorded proportion of Keynesian Chicago economists is zero
————————————————————————–
Conclusion: So, certainly, Robert Lucas is not a Keynesian economist

How come? Well, I guess one reason is that in science, contrary to what you find in most logic textbooks, not very many arguments are settled by showing that ‘All Xs are Ys.’ In scientific practice we instead present other-than-analytical explicit warrants and backings — data, experience, evidence, theories, models — for our inferences. As long as we can show that our ‘deductions’ or ‘inferences’ are justifiable and have well-backed warrants, our colleagues listen to us. That our scientific ‘deductions’ or ‘inferences’ are logical non-entailments simply is not a problem. To think otherwise is to commit the fallacy of misapplying formal-analytical logic categories to areas where they are pretty much irrelevant or simply beside the point.

Scientific arguments are not analytical arguments, where validity is solely a question of formal properties. Scientific arguments are substantial arguments. Whether Robert Lucas is a Keynesian or not is nothing we can decide on formal properties of statements/propositions. We have to examine what the guy has actually been writing and saying to find out whether the hypothesis that he is a Keynesian is true or not.

In a deductive-nomological explanation — also known as a covering-law explanation — we would try to explain why Robert Lucas believes in REH with the help of the two premises (in this case actually giving an explanation with very little explanatory value). These kinds of explanations — both in their deterministic and statistical/probabilistic versions — rely heavily on deductive entailment from premises assumed to be true. But they have precious little to say on where those assumed-to-be-true premises come from.

Deductive logic of confirmation and explanation may work well — given that it is used in deterministic closed models. In mathematics, the deductive-axiomatic method has worked just fine. But science is not mathematics, and conflating those two domains of knowledge has been one of the most fundamental mistakes made in the science of economics. Applied to real-world systems, the method immediately proves excessively narrow and hopelessly irrelevant. Both the confirmatory and the explanatory ilk of hypothetico-deductive reasoning fail, since there is no way you can relevantly analyze confirmation or explanation as a purely logical relation between hypothesis and evidence, or between law-like rules and explananda. In science we argue and try to substantiate our beliefs and hypotheses with reliable evidence; propositional and predicate deductive logic, on the other hand, is not about reliability, but about the validity of the conclusions given that the premises are true.

Deduction — and the inferences that go with it — is an example of ‘explicative reasoning,’ where the conclusions we make are already included in the premises. Deductive inferences are purely analytical, and it is this truth-preserving nature of deduction that makes it different from all other kinds of reasoning. But it is also its limitation, since truth in the deductive context does not refer to a real-world ontology (it only relates propositions as true or false within a formal-logic system) and, as an argument scheme, deduction is totally non-ampliative — the output of the analysis is nothing other than the input.

‘New Keynesian’ DSGE models

24 January, 2016 at 10:28 | Posted in Economics | 2 Comments

In the model [Gali, Smets and Wouters, Unemployment in an Estimated New Keynesian Model (2011)] there is perfect consumption insurance among the members of the household. Because of separability in utility, this implies that consumption is equalized across all workers, whether they are employed or not … Workers who find that they do not have to work are unemployed or out of the labor force, and they have cause to rejoice as a result. Unemployed workers enjoy higher utility than the employed because they receive the same level of consumption, but without having to work.

There is much evidence that in practice unemployment is not the happy experience it is for workers in the model.  For example, Chetty and Looney (2006) and Gruber (1997) find that US households suffer roughly a 10 percent drop in consumption when they lose their job. According to Couch and Placzek (2010), workers displaced through mass layoffs suffer substantial and extended reductions in earnings. Moreover, Oreopoulos, Page and Stevens (2008) present evidence that the children of displaced workers also suffer reduced earnings. Additional evidence that unemployed workers suffer a reduction in utility includes the results of direct interviews, as well as findings that unemployed workers experience poor health outcomes. Clark and Oswald (1994), Oswald (1997) and Schimmack, Schupp and Wagner (2008) describe evidence that suggests unemployment has a negative impact on a worker’s self-assessment of well being. Sullivan and von Wachter (2009) report that the mortality rates of high-seniority workers jump 50-100% more than would have been expected otherwise in the year after displacement. Cox and Koo (2006) report a significant positive correlation between male suicide and unemployment in Japan and the United States. For additional evidence that unemployment is associated with poor health outcomes, see Fergusson, Horwood and Lynskey (1997) and Karsten and Moser (2009) …

Suppose the CPS [Current Population Survey] employee encountered one of the people designated as “unemployed” … and asked if she were “available for work”. What would her answer be? She knows with certainty that she will not be employed in the current period. Privately, she is delighted about this because the non-employed enjoy higher utility than the employed … Not only is she happy about not having to work, but the labor union also does not want her to work. From the perspective of the union, her non-employment is a fundamental component of the union’s strategy for promoting the welfare of its membership.

Lawrence J. Christiano

To me these kinds of “New Keynesian” DSGE models, where unemployment is portrayed as bliss, are a sign of a momentous failure to model real-world unemployment. It’s not only adding insult to injury — it’s also sad gibberish that shamelessly tries to whitewash neoliberal economic policies that put people out of work.
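The mechanism Christiano describes can be reproduced in a few lines. A minimal numerical sketch, with functional form and parameter values of my own invention (not those of the actual Gali-Smets-Wouters model): under perfect consumption insurance and separable utility, the unemployed consume just as much as the employed but skip the disutility of work, so the model mechanically awards them higher utility.

```python
import math

# Illustrative separable period utility: u(c, h) = log(c) - chi * h.
# All parameter values are invented for this sketch.
chi = 0.7         # disutility weight on hours worked
c_insured = 1.0   # perfect insurance: everyone gets the same consumption
hours_employed = 1.0

def utility(consumption: float, hours: float) -> float:
    return math.log(consumption) - chi * hours

u_employed = utility(c_insured, hours_employed)
u_unemployed = utility(c_insured, 0.0)

print(f"employed:   {u_employed:.2f}")    # -0.70
print(f"unemployed: {u_unemployed:.2f}")  #  0.00 -> cause 'to rejoice'
```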

All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

Theories are difficult to directly confront with reality. Economists therefore build models of their theories. Those models are representations that are directly examined and manipulated to indirectly say something about the target systems.

But models do not only face theory. They also have to look to the world. Being able to model a “credible” DSGE world — how credible that world is, when it depicts unemployment as a “happy experience” and predicts the wage markup to increase with unemployment, I leave to the reader to decide — a world that somehow could be considered real or similar to the real world, is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still possibly serve our pursuit of truth. But then they cannot be unrealistic or false in just any way. The falsehood or unrealisticness has to be qualified.

The final court of appeal for macroeconomic models is the real world — and as long as no convincing justification is put forward for how the inferential bridging from models to reality de facto is made, macroeconomic model building is little more than “hand waving” that gives us rather little warrant for making inductive inferences from models to real world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

On the non-neutrality of money

23 January, 2016 at 14:29 | Posted in Economics | 1 Comment

Paul Krugman has repeatedly over the years argued that we should continue to use neoclassical hobby horses like IS-LM and Aggregate Supply-Aggregate Demand models. Here’s one example:

So why do AS-AD? … We do want, somewhere along the way, to get across the notion of the self-correcting economy, the notion that in the long run, we may all be dead, but that we also have a tendency to return to full employment via price flexibility. Or to put it differently, you do want somehow to make clear the notion (which even fairly Keynesian guys like me share) that money is neutral in the long run.

I doubt that Keynes would have been impressed by having his theory characterized with catchwords like “tendency to return to full employment” and “money is neutral in the long run.”


One of Keynes’s central tenets — in clear contradistinction to the beliefs of neoclassical economists — is that there is no strong automatic tendency for economies to move toward full employment levels in monetary economies.

Money doesn’t matter in neoclassical macroeconomic models. That’s true. But in the real world in which we happen to live, money does certainly matter. Money is not neutral and money matters in both the short run and the long run:

The theory which I desiderate would deal … with an economy in which money plays a part of its own and affects motives and decisions, and is, in short, one of the operative factors in the situation, so that the course of events cannot be predicted in either the long period or in the short, without a knowledge of the behaviour of money between the first state and the last. And it is this which we ought to mean when we speak of a monetary economy.

J. M. Keynes A monetary theory of production (1933)

Additivity — a dangerous assumption

21 January, 2016 at 20:28 | Posted in Economics | 1 Comment

The unpopularity of the principle of organic unities shows very clearly how great is the danger of the assumption of unproved additive formulas. The fallacy, of which ignorance of organic unity is a particular instance, may perhaps be mathematically represented thus: suppose f(x) is the goodness of x and f(y) is the goodness of y. It is then assumed that the goodness of x and y together is f(x) + f(y) when it is clearly f(x + y) and only in special cases will it be true that f(x + y) = f(x) + f(y). It is plain that it is never legitimate to assume this property in the case of any given function without proof.

J. M. Keynes “Ethics in Relation to Conduct” (1903) 

Since econometrics doesn’t content itself with only making optimal predictions, but also aspires to explain things in terms of causes and effects, econometricians need loads of assumptions — the most important of which are additivity and linearity. Important, simply because if they are not true, your model is invalid and descriptively incorrect. It’s like calling your house a bicycle. No matter how you try, it won’t move you an inch. When the model is wrong — well, then it’s wrong.
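Keynes’s point is easy to check numerically. A minimal sketch with an invented nonlinear ‘goodness’ function: additivity f(x + y) = f(x) + f(y) holds only in the special, linear case.

```python
# For a nonlinear f, f(x + y) != f(x) + f(y); the quadratic below is
# just an invented example of Keynes's 'goodness' function.

def f(x: float) -> float:
    return x ** 2

x, y = 2.0, 3.0
print(f"f(x) + f(y) = {f(x) + f(y)}")  # 13.0
print(f"f(x + y)    = {f(x + y)}")     # 25.0 -> additivity fails

# Only for linear functions do the two coincide:
g = lambda z: 4.0 * z
assert g(x + y) == g(x) + g(y)
```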

Fadime Sahindal

21 January, 2016 at 09:02 | Posted in Politics & Society | Comments Off on Fadime Sahindal
To Fadime Sahindal, born 2 April 1975 in Turkey, murdered 21 January 2002 in Sweden

In Sweden we have long uncritically embraced an unspecified and undefined multiculturalism. If by multiculturalism we mean that specific moral, ethical and political rights and obligations follow from cultural belonging and identity, we are talking about normative multiculturalism. To accept normative multiculturalism is also to tolerate unacceptable intolerance, since normative multiculturalism implies that the rights of specific cultural groups may be accorded higher standing than the universal human rights of the individual citizen — and thereby indirectly becomes a defence of those groups’ intolerance.

Normative multiculturalism unacceptably reduces individuals to passive members of culture- or identity-bearing groups. But those in our society who show that they do not respect other people’s rights cannot expect us to be tolerant towards them.

Towards those in our society who want to force others to live according to their own religious, cultural or ideological beliefs and taboos, society must be intolerant. Towards those who want to force society to adapt its laws and rules to their own religion’s, culture’s or group’s interpretations, society must be intolerant.


THE DEAD

The dead shall not be silent but speak.
Scattered torment shall find its voice,
and when the rats of the cells and the murderers’ rifle butts
have turned to ashes and age-old dust,
the comet’s parabola and the gamble of the stars
shall still bear witness to those who fell against their wall:
washed in fire but not burnt down to embers,
trampled and beaten but without a wound on their bodies,
and eyes that stared in horror shall open in peace,
and the dead shall not be silent but speak.

Of the dead there shall be no silence but speech.
Though maimed and strangled in the cell of power,
glassy-eyed and mocked in cynical waiting rooms
where death has pasted up its peace propaganda,
they shall rest long in the display cases of conscience,
embalmed by truth and washed in fire,
and those who have already fallen shall not be broken,
and he who begged for mercy in a moment of forgetfulness
shall rise and bear witness to that which cannot be broken,
for the dead shall not be silent but speak.

No, the dead shall not be silent but speak.
Those who felt triumph at their neck shall lift their heads,
and those who were smothered by smoke shall see clearly,
those who were tortured into madness shall flow like springs,
those who fell by their opposite shall themselves bring down,
those who were slain with lead shall slay with fire,
those who were hurled down by waves shall themselves become storm.
And the dead shall not be silent but speak.

                                           Erik Lindegren

Sacrifice

20 January, 2016 at 17:27 | Posted in Varia | Comments Off on Sacrifice

 

‘Deep parameters’ and microfoundations

20 January, 2016 at 10:31 | Posted in Economics | Comments Off on ‘Deep parameters’ and microfoundations

In a post last week, Simon Wren-Lewis discussed whether modern academic macroeconomics is eclectic or not. When it comes to methodology, his conclusion seems to be that it is not:

The New Classical Counter Revolution of the 1970s and 1980s … was primarily a revolution about methodology, about arguing that all models should be microfounded, and in terms of mainstream macro it was completely successful. It also tried to link this to a revolution about policy, about overthrowing Keynesian economics, and this ultimately failed. But perhaps as a result, methodology and policy get confused. Mainstream academic macro is very eclectic in the range of policy questions it can address, and conclusions it can arrive at, but in terms of methodology it is quite the opposite.

In an earlier post he elaborated on why the New Classical Counterrevolution was so successful in replacing older theories, despite the fact that the New Classical models weren’t able to explain what happened to output and inflation in the 1970s and 1980s:

The new theoretical ideas New Classical economists brought to the table were impressive, particularly to those just schooled in graduate micro. Rational expectations is the clearest example …

However, once the basics of New Keynesian theory had been established, it was quite possible to incorporate concepts like rational expectations or Ricardian Equivalence into a traditional structural econometric model (SEM) …

The real problem with any attempt at synthesis is that a SEM is always going to be vulnerable to the key criticism in Lucas and Sargent, 1979: without a completely consistent microfounded theoretical base, there was the near certainty of inconsistency brought about by inappropriate identification restrictions …

So why does this matter? … If mainstream academic macroeconomists were seduced by anything, it was a methodology – a way of doing the subject which appeared closer to what at least some of their microeconomic colleagues were doing at the time, and which was very different to the methodology of macroeconomics before the New Classical Counterrevolution. The old methodology was eclectic and messy, juggling the competing claims of data and theory. The new methodology was rigorous!

Wren-Lewis seems to be impressed by the ‘rigour’ brought to macroeconomics by the New Classical counterrevolution and its rational expectations, microfoundations and ‘Lucas Critique’.

I fail to see why.

Wren-Lewis’s ‘portrayal’ of rational expectations is not as innocent as it may look. Rational expectations in the neoclassical economists’ world implies that relevant distributions have to be time-independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality, it is straining one’s beliefs to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons. In the world of the rational expectations hypothesis, we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that is also the reason why allowing for cases where agents ‘make predictable errors’ in the New Keynesian models doesn’t take us any closer to a relevant and realistic depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say about real economies, financial crises, and the decisions and choices real people make, we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations than childish roulette and urn analogies.
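The ‘single realization’ point can be made concrete with a toy simulation (entirely my own construction, for illustration only): when the process generating outcomes is not the same across imagined ‘economy-worlds,’ the ensemble distribution tells us next to nothing about the one path we actually live through.

```python
import numpy as np

rng = np.random.default_rng(7)
T, n_worlds = 400, 10_000

# Toy 'economies': random walks whose drifts differ across the
# imagined ensemble of economy-worlds. Purely illustrative numbers.
drifts = rng.normal(0.0, 0.05, size=n_worlds)
shocks = rng.normal(0.0, 1.0, size=(n_worlds, T))
paths = np.cumsum(drifts[:, None] + shocks, axis=1)

ensemble_mean = paths[:, -1].mean()   # average across economy-worlds
our_world = paths[0, -1]              # the single realization we get

print(f"ensemble average at T: {ensemble_mean:8.2f}")  # close to 0
print(f"our single path at T:  {our_world:8.2f}")      # could be anything
print(f"cross-world std at T:  {paths[:, -1].std():8.2f}")

# The cross-world dispersion dwarfs the ensemble mean: knowing the
# ensemble distribution is of little help for our one realization.
```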

The predominant strategy in mainstream macroeconomics today is to build models and make things happen in these ‘analogue-economy models.’ But although macro-econometrics may have supplied economists with rigorous replicas of real economies, if the goal of theory is to be able to make accurate forecasts or explain what happens in real economies, this ability to — ad nauseam — construct toy models does not give much leverage.

‘Rigorous’ and ‘precise’ New Classical models — and that goes for the ‘New Keynesian’ variety too — cannot be considered anything other than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence of any kind has been presented.

And applying a ‘Lucas critique’ to New Classical and ‘New Keynesian’ models makes it obvious that they too fail.

Changing ‘policy rules’ cannot just be presumed not to influence investment and consumption behavior and, a fortiori, technology, thereby contradicting the invariance assumption. Technology and tastes cannot live up to the status of an economy’s deep and structurally stable Holy Grail. They too are part and parcel of an ever-changing and open economy. Lucas’s hope of being able to model the economy as ‘a FORTRAN program’ and ‘gain some confidence that the component parts of the program are in some sense reliable prior to running it’ therefore seems – from an ontological point of view – totally misdirected. The failure of the attempt to anchor the analysis in the allegedly stable deep parameters ‘tastes’ and ‘technology’ shows that if you neglect ontological considerations pertaining to the target system, reality ultimately gets its revenge when questions of bridging and exportation of model exercises are at last laid on the table.


People like Dani Rodrik and Simon Wren-Lewis are proud of having an ever-growing smorgasbord of models to cherry-pick from (as long as, of course, the models do not question the standard modeling strategy) when performing their analyses. The ‘rigorous’ and ‘precise’ deductions made in these closed models, however, are not in any way matched by a similar stringency or precision when it comes to what ought to be the most important stage of any research — making statements about and explaining things in real economies. Although almost every mainstream economist holds the view that thought-experimental modeling has to be followed by confronting the models with reality — which is what they indirectly want to predict/explain/understand using their models — they all of a sudden become exceedingly vague and imprecise. It is as if all the intellectual force has been invested in the modeling stage and nothing is left for what really matters — what exactly these models teach us about real economies.

No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, it does not push economic science forward a single millimeter if it does not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not per se say anything about real-world economies.

Proving things ‘rigorously’ in mathematical models is at most a starting-point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.

Axiomatics — the economics fetish

18 January, 2016 at 20:40 | Posted in Theory of Science & Methodology | 4 Comments

Mainstream — neoclassical — economics has become increasingly irrelevant to the understanding of the real world. The main reason for this irrelevance is the failure of economists to match their deductive-axiomatic methods with their subject.

The idea that a good scientific theory must be derived from a formal axiomatic system has little if any foundation in the methodology or history of science. Nevertheless, it has become almost an article of faith in modern economics. I am not aware, but would be interested to know, whether, and if so how widely, this misunderstanding has been propagated in other (purportedly) empirical disciplines. The requirement of the axiomatic method in economics betrays a kind of snobbishness and (I use this word advisedly, see below) pedantry, resulting, it seems, from a misunderstanding of good scientific practice …

This doesn’t mean that trying to achieve a reduction of a higher-level discipline to another, deeper discipline is not a worthy objective, but it certainly does mean that one cannot just dismiss, out of hand, a discipline simply because all of its propositions are not deducible from some set of fundamental propositions. Insisting on reduction as a prerequisite for scientific legitimacy is not a scientific attitude; it is merely a form of obscurantism …

The fetish for axiomatization in economics can largely be traced to Gerard Debreu’s great work, The Theory of Value: An Axiomatic Analysis of Economic Equilibrium … The subsequent work was then brilliantly summarized and extended in another great work, General Competitive Analysis by Arrow and Frank Hahn. Unfortunately, those two books, paragons of the axiomatic method, set a bad example for the future development of economic theory, which embarked on a needless and counterproductive quest for increasing logical rigor instead of empirical relevance …

I think that it is important to understand that there is simply no scientific justification for the highly formalistic manner in which much modern economics is now carried out. Of course, other far more authoritative critics than I, like Mark Blaug and Richard Lipsey, have complained about the insistence of modern macroeconomics on microfounded, axiomatized models regardless of whether those models generate better predictions than competing models. Their complaints have regrettably been ignored for the most part. I simply want to point out that a recent, and in many ways admirable, introduction to modern macroeconomics failed to provide a coherent justification for insisting on axiomatized models. It really wasn’t the author’s fault; a coherent justification doesn’t exist.

David Glasner

 
It is — sad to say — a fact that within mainstream economics internal validity is everything and external validity nothing. Why anyone should be interested in those kinds of theories and models — as long as mainstream economists do not come up with any export licenses for their theories and models to the real world in which we live — is beyond my imagination. Sure, the simplicity that axiomatics and analytical arguments bring to economics is attractive to many economists. But …

Simplicity, however, has its perils. It is one thing to choose as one’s first object of theoretical study the type of arguments open to analysis in the simplest terms. But it is quite another to treat this type of argument as a paradigm and to demand that arguments in other fields should conform to its standards regardless, or to build up from a study of the simplest forms of argument alone a set of categories intended for application to arguments of all sorts: one must at any rate begin by inquiring carefully how far the artificial simplicity of one’s chosen model results in these logical categories also being artificially simple. The sorts of risks one runs otherwise are obvious enough. Distinctions which all happen to cut along the same line for the simplest arguments may need to be handled quite separately in the general case; if we forget this, and our new-found logical categories yield paradoxical results when applied to more complex arguments, we may be tempted to put these paradoxical results down to defects in the arguments instead of in our categories; and we may end up by thinking that, for some regrettable reason hidden deep in the nature of things, only our original, peculiarly simple arguments are capable of attaining to the ideal of validity.

Stephen Toulmin, The Uses of Argument

The gussied up economics of Tweedledum and Tweedledee

17 January, 2016 at 19:09 | Posted in Economics | 2 Comments

“Of course, there were exceptions to these trends: a few economists challenged the assumption of rational behavior, questioned the belief that financial markets can be trusted and pointed to the long history of financial crises that had devastating economic consequences. But they were swimming against the tide, unable to make much headway against a pervasive and, in retrospect, foolish complacency.” —Paul Krugman, New York Times Magazine, September 6, 2009

While normal ecclesiastic practice places this word [‘Amen’] at the end of the prayer, on this occasion it seems right to put it up front. In two sentences, Professor Paul Krugman … has summed up the failure of an entire era in economic thought, practice, and policy discussion.

And yet, there is something odd about the role of this short paragraph in an essay of over 6,500 words. It’s a throwaway. It leads nowhere …


Krugman’s entire essay is about two groups, both deeply entrenched at (what they believe to be) the top of academic economics. Both are deeply preoccupied with their status and with a struggle for influence and for academic power and prestige—against the other group. Krugman calls them “saltwater” and “freshwater” economists; they tend to call themselves “new classicals” and the “new Keynesians” — although one is not classical and the other is not Keynesian. One might speak of a “Chicago School” and an “MIT School”—after the graduate programs through which so many passed. In truth, there are no precise labels, because the differences between them are both secondary and obscure.

The two groups share a common perspective, a preference for thinking along similar lines. Krugman describes this well, as a “desire for an all-encompassing, intellectually elegant approach that also gave economists a chance to show off their mathematical prowess.” Exactly so. It was in part about elegance — and in part about showing off. It was not about … the economy. It was not a discussion of problems, risks, dangers, and policies. In consequence, the failure was shared by both groups. This is the extraordinary thing. Economics was not riven by a feud between Pangloss and Cassandra. It was all a chummy conversation between Tweedledum and Tweedledee. And if you didn’t think either Tweedle was worth much — well then, you weren’t really an economist, were you?

Professor Krugman contends that Tweedledum and Tweedledee “mistook beauty for truth.” The beauty in question was the “vision of capitalism as a perfect or nearly perfect system.” To be sure, the accusation that a scientist — let alone an entire science — was seduced by beauty over truth is fairly damaging. But it’s worth asking, what exactly was beautiful about this idea? Krugman doesn’t quite say. He does note that the mathematics used to describe the alleged perfection was “impressive-looking” — ”gussied up” as he says, “with fancy equations.” It’s a telling choice of words. “Impressive-looking”? “Gussied up”? These are not terms normally used to describe the Venus de Milo.

James K. Galbraith

Wren-Lewis and the Rodrik smorgasbord view of economic models

17 January, 2016 at 16:20 | Posted in Economics | 13 Comments

In December 2015 yours truly ran a series of eight posts on this blog discussing Dani Rodrik‘s Economics Rules (Oxford University Press, 2015).

There sure is much in the book I like and appreciate. It is one of those rare examples where a mainstream economist — instead of just looking the other way — takes his time to ponder on the tough and deep science-theoretic and methodological questions that underpin the economics discipline.

But (as I argue at length in a forthcoming article in this journal) there is also a very disturbing apologetic tendency in the book to blame all of the shortcomings on the economists while depicting economics itself as a problem-free smorgasbord collection of models. If you just choose the appropriate model from the immense and varied smorgasbord there’s no problem. It is as if all problems in economics were conjured away if only we could make the proper model selection.

Today, Oxford macroeconomist Simon Wren-Lewis has a post up on his blog on Rodrik’s book — and is totally überjoyed:

The first and most important thing to say is this is a great book … because it had a way of putting things which was illuminating and eminently sensible. Illuminating is I think the right word: seeing my own subject in a new light, which is something that has not happened to me for a long time. There was nothing I could think of where I disagreed …

The key idea is that there are many valid models, and the goal is to know when they are applicable to the problem in hand …

Lots of people get hung up on the assumptions behind models: are they true or false, etc. An analogy I had not seen before but which I think is very illuminating is with experiments. Models are like experiments. Experiments are designed to abstract from all kinds of features of the real world, to focus on a particular process or mechanism (or set of the same). The assumptions of models are designed to do the same thing.

“Models are like experiments.” I’ve run into that view many times over the years in discussions with mainstream economists about their ‘thought experimental’ obsession — and I still think it’s too vague and elusive to be helpful. Just repeating the view doesn’t provide the slightest reason to believe it.

Although perhaps thought-provoking to some, I find the view of experiments offered here too simplistic. And for several reasons — but mostly because the kind of experimental empiricism it favours is largely untenable.

Experiments are very similar to theoretical models in many ways — on that Wren-Lewis and yours truly are in total agreement. Experiments share the basic problem that they are built on rather artificial conditions and have difficulties with the “trade-off” between internal and external validity: with more artificial conditions and internal validity comes less external validity. The more we rig experiments/models to avoid the “confounding factors”, the less the conditions are reminiscent of the real “target system”. The nodal issue is how economists using different isolation strategies in different “nomological machines” attempt to learn about causal relationships.

Assume that you have examined how the work performance of Swedish workers (A) is affected by some “treatment” (B). How can we extrapolate/generalize to new samples outside the original population (e.g. to the UK)? How do we know that any replication attempt “succeeds”? How do we know when such replicated experimental results can be said to justify inferences made about samples from the original population? If, for example, P(A|B) is the conditional density function for the original sample, and we are interested in making an extrapolative prediction of E[P(A|B)], how can we know that the new sample’s density function is identical with the original’s? Unless we can give some really good argument for this being the case, inferences built on P(A|B) do not really say anything about the target system’s P′(A|B).

As I see it, this is the heart of the matter. External validity/extrapolation/generalization is founded on the assumption that we can make inferences based on P(A|B) that are exportable to other populations for which P′(A|B) applies. Sure, if one can convincingly show that P and P′ are similar enough, the problems are perhaps not insurmountable. But arbitrarily introducing functional specification restrictions of the invariance/stability/homogeneity type is, at least for an epistemological realist, far from satisfactory. And unfortunately this is often exactly what we see when we examine neoclassical economists’ models/experiments.
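A toy simulation makes the worry concrete (populations, shares and effect sizes are all invented for this sketch): a treatment effect cleanly estimated on one population transfers to another only if the relevant conditional distributions actually coincide, which is precisely what has to be argued rather than assumed.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000

def estimated_effect(share_high_skill: float) -> float:
    """Difference in mean performance A between treated and untreated.

    Hypothetical setup: treatment B raises performance by 2 units,
    but only for 'high-skill' workers -- a variable the experimenter
    never measures.
    """
    high = rng.random(n) < share_high_skill
    treated = rng.random(n) < 0.5              # randomized treatment
    effect = np.where(high, 2.0, 0.0)          # heterogeneous effect
    a = effect * treated + rng.normal(size=n)  # performance outcome
    return a[treated].mean() - a[~treated].mean()

ate_original = estimated_effect(share_high_skill=0.8)  # e.g. Sweden
ate_target = estimated_effect(share_high_skill=0.2)    # e.g. the UK

print(f"effect in the original population:      {ate_original:.2f}")  # ~1.6
print(f"same 'effect' in the target population: {ate_target:.2f}")    # ~0.4
# P(A|B) != P'(A|B): the exported inference fails because the two
# populations differ in an unmeasured respect.
```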

By this I do not mean to say that empirical methods per se are so problematic that they can never be used. On the contrary, I am basically — though not without reservations — in favour of the increased use of experiments within economics as an alternative to completely barren “bridge-less” axiomatic-deductive theory models. My criticism is more about aspiration levels and what we believe we can achieve with our mediational epistemological tools and methods in social sciences.

Just like traditional neoclassical thought-experimental modeling, real experimentation is basically a deductive method. Given the assumptions (such as manipulability, transitivity, separability, additivity, linearity, etc.), these methods deliver deductive inferences. The problem, of course, is that we will never completely know when the assumptions are right. Real target systems are seldom epistemically isomorphic to our axiomatic-deductive models/systems, and even if they were, we would still have to argue for the external validity of the conclusions reached from within these epistemically convenient models/systems.

Limiting model assumptions in economic science always have to be closely examined. If the mechanisms or causes that we isolate and handle in our models are to be stable in the sense that they do not change when we ‘export’ them to our ‘target systems,’ we have to be able to show that they hold under more than ceteris paribus conditions; otherwise they are, a fortiori, of only limited value for our understanding, explanation or prediction of real economic systems.

Wren-Lewis obviously doesn’t want to get ‘hung up on the assumptions behind models.’ Maybe so, but it is still an undeniable fact that theoretical models built on piles of assumptions known to be false do not come anywhere close to being scientific explanations. On the contrary. They are untestable and a fortiori totally worthless from the point of view of scientific relevance.

Blue In Green

17 January, 2016 at 14:41 | Posted in Economics, Varia | Comments Off on Blue In Green

 

Lennart Schön (1946-2016) In Memoriam

17 January, 2016 at 10:02 | Posted in Varia | Comments Off on Lennart Schön (1946-2016) In Memoriam

Swedish economic historians were deeply saddened to learn of the death earlier this month of professor Lennart Schön.

Schön contributed to many areas of Swedish economic history. Most important was his development of a long-run cycles perspective on Swedish industrial society’s history (on which we worked together in a couple of research projects in the 1990s, carrying further the structural analytical tradition of Johan Åkerman, Erik Dahmén and Ingvar Svennilson).

The greatest Swedish economic historian since Eli Heckscher has passed away.

He is held in great esteem and we all truly miss this open-minded, good-hearted and passionate researcher.

Rest in peace my friend.

Rule of law

16 January, 2016 at 17:30 | Posted in Politics & Society | Comments Off on Rule of law


On 21 January it will be fourteen years since Fadime Sahindal was murdered; her father refused to let her choose her own life. At the time, speaking of honour-related violence — of milieus where the family’s reputation hung on the virtue of its women — was a charged matter. Earlier still, cultural differences had even been regarded as mitigating circumstances.

Such cultural relativism is fortunately history. But tens of thousands of young Swedes still live in milieus marked by honour culture, and testimony from the suburbs reveals self-appointed guardian councils circumscribing women’s freedom.

In Afghanistan many women are treated badly. In Saudi Arabia women are not allowed to drive. In Iran the mullahs curtail women’s sphere of life. It must be possible to say that there are cultural differences in views on women and sexuality. It must be possible to say that many boys and men who have grown up in patriarchal cultures carry this with them — to Sweden as well. Merely hissing “racist” makes it hopeless to get at the problems.

Women and men are of equal worth. Everyone living in Sweden must respect this.

Heidi Avellan/Sydsvenskan

Sweden shall be an open country. A part of the international community.

But it shall also be a country that makes clear that the achievements in equality, openness and tolerance that we have fought for over centuries are not negotiable.

People who come to our country shall enjoy these rights and freedoms.

But with these rights and freedoms comes an obligation. Everyone — without exception — must also accept that in our country one law applies, the same for all.

Rule of law.

In Cologne, Hamburg and Stockholm, the legal order of city and state is being challenged by groups who, whatever their varying ethnicity, appear to embrace precisely the norms of tribal thinking. Knowing that they risk no retribution from the tribal culture, and lacking loyalty to the surrounding society, they consider themselves free to treat unprotected women as prey. Perhaps the most troubling aspect of this development is the timidity of the liberal civic community. A far-reaching cultural relativism has produced a kind of acquired stupidity, whereby one would rather hush up culture-related problems and pretend they do not exist than do something about them. Or else one blames oneself, so as to avoid the awkward conflict with the Other.

Per Bauhn

When will Krugman catch up with Keynes?

16 January, 2016 at 10:44 | Posted in Economics | 2 Comments

In his column this morning, Paul Krugman had this to say about issues that have mattered a lot to me over many years now. I admire Krugman, of course, but this is bullshit, pure and simple. Not the Harry Frankfurt kind, which requires willful ignorance of the facts, but the everyday kind, which requires mere ignorance of the historical record.

“Don’t say that redistribution is inherently wrong. Even if high incomes perfectly reflected productivity, market outcomes aren’t the same as moral justification. And given the reality that wealth often reflects either luck or power, there’s a strong case to be made for collecting some of that wealth in taxes and using it to make society as a whole stronger, as long as it doesn’t destroy the incentive to keep creating more wealth.”

The “incentive to keep creating more wealth”? …

Keynes, bless his heart, called it “a somewhat disgusting morbidity.”

He was right … Here’s how he put it [in ‘Economic Possibilities for Our Grandchildren’ (1930)]:

“When the accumulation of wealth is no longer of high social importance, there will be a great change in the code of morals. We shall be able to rid ourselves of many of the pseudo-moral principles which have hag-ridden us for two hundred years, by which we have exalted some of the most distasteful of human qualities into the position of the highest virtues. We shall be able to afford to dare to assess the money-motive at its true value. The love of money as a possession … will be recognised for what it is, a somewhat disgusting morbidity, one of those semi-criminal, semi-pathological propensities which one hands over with a shudder to the specialist in mental disease.” …

Paul Krugman is a hero to many of us because he fights the good fight against the idiocies of economic theory and practice in our time. But he is now behind the times because he hasn’t yet caught up with Keynes.

James Livingston

Wren-Lewis on macroeconomic eclecticism

13 January, 2016 at 20:39 | Posted in Economics | 8 Comments

Oxford macroeconomist Simon Wren-Lewis has a post up today on his blog discussing whether mainstream macroeconomics is eclectic or not. His answer is — both yes and no:

Does this mean academic macroeconomics is fragmented into lots of cliques, some big and some small? Not really … This is because these models (unlike those of 40+ years ago) use a common language …


It means that the range of assumptions that models (DSGE models if you like) can make is huge. There is nothing formally that says every model must contain perfectly competitive labour markets where the simple marginal product theory of distribution holds, or even where there is no involuntary unemployment, as some heterodox economists sometimes assert. Most of the time individuals in these models are optimising, but I know of papers in the top journals that incorporate some non-optimising agents into DSGE models. So there is no reason in principle why behavioural economics could not be incorporated …

It also means that the range of issues that models (DSGE models) can address is also huge …

Mainstream academic macro is very eclectic in the range of policy questions it can address, and conclusions it can arrive at, but in terms of methodology it is quite the opposite.

Wren-Lewis tries to give a picture of modern macroeconomics as a pluralist enterprise. But the change and diversity that get Wren-Lewis’s approval only take place within the analytic-formalistic modeling strategy that makes up the core of mainstream economics. You’re free to take your analytical formalist models and apply them to whatever you want — as long as you do it with a modeling methodology that is acceptable to the mainstream. If you do not follow this particular mathematical-deductive analytical formalism you’re not even considered to be doing economics. If you haven’t modeled your thoughts, you’re not in the economics business. But this isn’t pluralism. It’s a methodological reductionist straitjacket.

To most mainstream economists you only have knowledge of something when you can prove it, and so ‘proving’ theories with their models via deductions is considered the only certain way to acquire new knowledge. This is, however, a view for which there is no warranted epistemological foundation. Outside mathematics and logic, all human knowledge is conjectural and fallible.

Validly deducing things in closed analytical-formalist-mathematical models — built on atomistic-reductionist assumptions — doesn't much help us understand or explain what is taking place in the real world we happen to live in. Validly deducing things from patently unreal assumptions — that we all know are purely fictional — makes most of the modeling exercises pursued by mainstream macroeconomists rather pointless. It's simply not the stuff that real understanding and explanation in science is made of. Had mainstream economists like Wren-Lewis not been so in love with their smorgasbord of models, they would have perceived this too. Telling us that the plethora of models that make up modern macroeconomics ‘are not right or wrong,’ but ‘just more or less applicable to different situations,’ is nothing short of hand-waving.

So, yes, there is a proliferation of macromodels nowadays — but it almost exclusively takes place as a kind of axiomatic variation within the standard DSGE modeling framework. And — no matter how many thousands of models mainstream economists come up with, as long as they are just axiomatic variations of the same old mathematical-deductive ilk, they will not take us one single inch closer to giving us relevant and usable means to further our understanding and explanation of real economies.

Wren-Lewis seems to have no problem with the lack of fundamental diversity — as opposed to mere path-dependent elaborations of the mainstream canon — or with the vanishingly small real-world relevance that characterizes modern mainstream macroeconomics.

Wren-Lewis obviously shares the view of his mainstream colleagues Paul Krugman and Greg Mankiw that there is nothing basically wrong with ‘standard theory.’ As long as policy makers and economists stick to ‘standard economic analysis’ — DSGE — everything is fine. Economics is just a common language and method that makes us think straight and reach correct answers.

And just like his colleagues, when it really counts, Wren-Lewis shows what he is — a mainstream neoclassical economist fanatically defending the insistence on using an axiomatic-deductive economic modeling strategy. To yours truly, this attitude is nothing but a late confirmation of Alfred North Whitehead's complaint that ‘the self-confidence of learned people is the comic tragedy of civilization.’
 

Added January 15: Wren-Lewis has a post up today commenting on some of the critique put forward here. Writes Wren-Lewis:

My post ended with the following sentence:

‘Mainstream academic macro is very eclectic in the range of policy questions it can address, and conclusions it can arrive at, but in terms of methodology it is quite the opposite.’

I argue in the post that “this non-eclecticism in terms of excluding non-microfounded work is deeply problematic.” I then link to my many earlier posts where I have expanded on this theme. So how I can be a fanatic defender of insisting that this modelling strategy be used escapes me. Unless I have misunderstood what an ‘axiomatic-deductive’ strategy is.

Yes indeed, that is a total misunderstanding!

I have two PhDs. I am a professor. I can read.

So, of course, the issue is not about microfoundations or not (on which I have written plenty elsewhere, e.g. here). What I criticize Wren-Lewis and other mainstream economists for is their insistence on using axiomatic-deductive modeling (with or without microfoundations). And I am — in case anyone thought otherwise — not alone in that critique:

The fundamental problem of modern economics is that methods are repeatedly applied in conditions for which they are not appropriate … Specifically, modern academic economics is dominated by a mainstream tradition whose defining characteristic is an insistence that certain methods of mathematical modelling be more or less always employed in the analysis of economic phenomena, and are so in conditions for which they are not suitable.

Fundamental to my argument is an assessment that the application of mathematics involves more than merely the introduction of a formal language. Of relevance here is recognition that mathematical methods and techniques are essentially tools. And as with any other tools (pencils, hammers, drills, scissors), so the sorts of mathematical methods which economists wield (functional relations, forms of calculus, etc.) are useful under some sets of conditions and not others …

Clearly if social phenomena are highly internally related they do not each exist in isolation. And if they are processual in nature, being continually transformed through practice, they are not atomistic. So the emphasis on the sorts of mathematical modelling methods that economists employ necessarily entails the construction of economic narratives – including the sorts of axioms and assumptions made and hypotheses entertained – that, at best, are always but highly distorted accounts of the complex phenomena of the real open social system … It is thus not at all surprising that mainstream contributions are found continually to be so unrealistic and explanatorily limited.

Employing the term deductivism to denote the thesis that closed systems are essential to social scientific explanation (whether the event regularities, correlations, uniformities, laws, etc., are either a priori constructions or a posteriori observations), I conclude that the fundamental source of the discipline's numerous, widespread and long-lived problems and failings is precisely the emphasis placed upon forms of mathematical deductivist reasoning.

Tony Lawson

Heroes

13 January, 2016 at 09:27 | Posted in Economics, Varia | Comments Off on Heroes

 

Life on Mars

11 January, 2016 at 23:08 | Posted in Varia | Comments Off on Life on Mars

 

Forecasting econometrics

10 January, 2016 at 11:23 | Posted in Statistics & Econometrics | 1 Comment

There have been over four decades of econometric research on business cycles … The formalization has undeniably improved the scientific strength of business cycle measures …

But the significance of the formalization becomes more difficult to identify when it is assessed from the applied perspective, especially when the success rate in ex-ante forecasts of recessions is used as a key criterion. The fact that the onset of the 2008 financial-crisis-triggered recession was predicted by only a few ‘Wise Owls’ … while missed by regular forecasters armed with various models serves us as the latest warning that the efficiency of the formalization might be far from optimal. Remarkably, not only has the performance of time-series data-driven econometric models been off the track this time, so has that of the whole bunch of theory-rich macro dynamic models developed in the wake of the rational expectations movement, which derived its fame mainly from exploiting the forecast failures of the macro-econometric models of the mid-1970s recession.

The limits of econometric forecasting have, as Qin notes, been critically pointed out many times before.

Trygve Haavelmo — with the completion (in 1958) of the twenty-fifth volume of Econometrica — assessed the role of econometrics in the advancement of economics, and although mainly positive about the “repair work” and “clearing-up work” done, Haavelmo also found some grounds for despair:

We have found certain general principles which would seem to make good sense. Essentially, these principles are based on the reasonable idea that, if an economic model is in fact “correct” or “true,” we can say something a priori about the way in which the data emerging from it must behave. We can say something, a priori, about whether it is theoretically possible to estimate the parameters involved. And we can decide, a priori, what the proper estimation procedure should be … But the concrete results of these efforts have often been a seemingly lower degree of accuracy of the would-be economic laws (i.e., larger residuals), or coefficients that seem a priori less reasonable than those obtained by using cruder or clearly inconsistent methods.

There is the possibility that the more stringent methods we have been striving to develop have actually opened our eyes to recognize a plain fact: viz., that the “laws” of economics are not very accurate in the sense of a close fit, and that we have been living in a dream-world of large but somewhat superficial or spurious correlations.
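Haavelmo's ‘dream-world of large but somewhat superficial or spurious correlations’ is easily illustrated. Below is a minimal simulation sketch (in Python; my illustration, not anything from Haavelmo or the post): regressing one independent random walk on another routinely yields a ‘highly significant’ coefficient and a sizeable R², despite there being no connection at all between the series.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
T = 200

# Two completely independent random walks: by construction there is
# no relation whatsoever between x and y.
x = np.cumsum(rng.standard_normal(T))
y = np.cumsum(rng.standard_normal(T))

# Naive regression of y on x, as if the data came from a stable 'law'.
res = sm.OLS(y, sm.add_constant(x)).fit()
print(f"slope = {res.params[1]:.2f}, t = {res.tvalues[1]:.1f}, R2 = {res.rsquared:.2f}")
# Typically prints a large t-value and R2 despite the total absence
# of any real connection between the two series.
```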

And as the quote below shows, even Ragnar Frisch shared some of Haavelmo's — and Keynes's — doubts about the applicability of econometrics:

I have personally always been skeptical of the possibility of making macroeconomic predictions about the development that will follow on the basis of given initial conditions … I have believed that the analytical work will give higher yields – now and in the near future – if they become applied in macroeconomic decision models where the line of thought is the following: “If this or that policy is made, and these conditions are met in the period under consideration, probably a tendency to go in this or that direction is created”.

Ragnar Frisch

Maintaining that economics is a science in the “true knowledge” business, I remain a skeptic of the pretences and aspirations of econometrics. So far, I cannot really see that it has yielded very much in terms of relevant, interesting economic knowledge. And, more specifically, when it comes to forecasting activities, the results have been bleak indeed.

Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that there are other variables — of vital importance and, although perhaps unobservable and non-additive, not necessarily epistemologically inaccessible — that were not considered for the model. The variables that were included can hence never be guaranteed to be more than potential causes, not real causes.
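The omitted-variables point can be made concrete with a small sketch (in Python; the setup and variable names are my own hypothetical illustration): an unobserved confounder z drives both x and y, and a regression that leaves z out confidently attributes to x an effect it does not have.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000

# An unobserved variable z drives both x and y; x itself has NO
# direct effect on y.
z = rng.standard_normal(n)
x = z + 0.5 * rng.standard_normal(n)
y = 2.0 * z + 0.5 * rng.standard_normal(n)

# Omitting the confounder, the regression 'finds' a strong effect of x ...
short_reg = sm.OLS(y, sm.add_constant(x)).fit()
# ... which all but vanishes once z is included.
full_reg = sm.OLS(y, sm.add_constant(np.column_stack([x, z]))).fit()
print(f"omitting z:  beta_x = {short_reg.params[1]:.2f}")
print(f"including z: beta_x = {full_reg.params[1]:.2f}")
```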

A perusal of the leading econom(etr)ic journals shows that most econometricians still concentrate on fixed-parameter models, and that parameter values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption one would, however, have to convincingly establish that the targeted acting causes are stable and invariant, so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no ground other than hope itself.
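Why exporting parameter estimates across contexts is hazardous can likewise be illustrated with a minimal sketch (in Python; a stylized example of my own, not from the post): when the ‘parameter’ linking x to y drifts over time, a fixed-parameter model fitted on the first half of the sample looks fine in-sample and then goes badly wrong out-of-sample.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 400
x = rng.standard_normal(T)

# The acting 'cause' is not stable: the true coefficient drifts
# steadily from 0.5 to 2.0 over the sample.
beta = np.linspace(0.5, 2.0, T)
y = beta * x + 0.5 * rng.standard_normal(T)

# Estimate a fixed-parameter model on the first half of the data ...
slope, intercept = np.polyfit(x[:200], y[:200], 1)

# ... then 'export' it to the second half, as if the estimate travelled.
pred_in = slope * x[:200] + intercept
pred_out = slope * x[200:] + intercept
rmse_in = np.sqrt(np.mean((y[:200] - pred_in) ** 2))
rmse_out = np.sqrt(np.mean((y[200:] - pred_out) ** 2))
print(f"estimated slope = {slope:.2f}")
print(f"in-sample RMSE  = {rmse_in:.2f}, out-of-sample RMSE = {rmse_out:.2f}")
```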

When causal mechanisms operate in real-world social target systems, they do so only in ever-changing and unstable combinations, where the whole is more than a mechanical sum of its parts. If economic regularities obtain, they do so (as a rule) only because we have engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately, that also makes most of the achievements of econometric forecasting rather useless.

Je suis Charlie encore aujourd’hui

9 January, 2016 at 11:18 | Posted in Politics & Society | Comments Off on Je suis Charlie encore aujourd’hui

If we are to be able to live together – people from different countries, cultures and religions – there have to be clear rules of the game:

This is what applies in Sweden. For everyone.

Respect for the law, for democracy and human rights, for equality between people and between the sexes, for the equal worth of all, and for the individual's right to choose his or her own life. Once that is in place, everyone is free to worship their god, wear the veil and celebrate Christmas, Pesach or Eid al-Fitr. Or not to.

This is especially important now, when the stream of refugees has brought so many new people here … Worrying reports are already coming in from areas with large immigrant populations: bearded men curtailing women's freedom, holding opinions on their dress and conduct, here just as in the old home country.

‘Unacceptable’ doesn't even begin to describe it … No one may compromise women's human rights, regardless of religion, culture or family circumstances. No one may deprive young people of the right to sex education. No one may look away when a girl does not come back after the holidays in the old home country – or returns betrothed against her will. No one may shut their eyes to the fact that not all young people in Sweden are allowed to choose their own lives. No one may bow to patriarchal cultures' age-old demands to subjugate women, to require obedience and chastity, to define the honour of the family in terms of the virtue of its women. Nor may Swedish authorities that want to avoid trouble and call it ‘respect’.

This requires clear rules of the game and great courage. But it is absolutely necessary for our way of life, together in freedom. And a good way to celebrate 250 years of free speech.

Heidi Avellan/SDS

Culture, identity, ethnicity, gender and religiosity must never be accepted as grounds for intolerance in political and civic matters. In a modern democratic society, people who belong to these different groups must be able to count on society also protecting them against the abuses of intolerance. All citizens must have the freedom and the right to question and to leave their own group. Towards those who do not accept that tolerance, we must be intolerant.

In Sweden we have long uncritically embraced an unspecified and undefined multiculturalism. If by multiculturalism we mean that several different cultures exist in our society, this poses no problem. Then we are all multiculturalists.

But if by multiculturalism we mean that cultural belonging and identity also bring with them specific moral, ethical and political rights and obligations, we are talking about something entirely different. Then we are talking about normative multiculturalism. And to accept normative multiculturalism is also to tolerate unacceptable intolerance, since normative multiculturalism implies that the rights of specific cultural groups may come to be accorded higher standing than the universal human rights of the individual citizen – and thereby indirectly serve as a defence of those groups' (possible) intolerance. In a normatively multiculturalist society, institutions and rules can be used to restrict people's freedom on the basis of unacceptable and intolerant cultural values.

Normative multiculturalism, just like xenophobia and racism, unacceptably reduces individuals to passive members of culture- or identity-bearing groups. But tolerance does not mean that we must take a value-relativist attitude towards identity and culture. Those in our society who show by their actions that they do not respect other people's rights cannot expect us to be tolerant towards them. Those who want to use violence to force other people to submit to a particular group's religion, ideology or ‘culture’ are themselves responsible for the intolerance with which they have to be met.

If we are to safeguard the achievements of modern democratic society, society must be intolerant of intolerant normative multiculturalism. And then society cannot itself embrace a normative multiculturalism. In a modern democratic society the rule of law must apply – and apply to everyone!

Towards those in our society who want to force others to live according to their own religious, cultural or ideological beliefs and taboos, society must be intolerant. Towards those who want to force society to adapt its laws and rules to the interpretations of their own religion, culture or group, society must be intolerant. Towards those who are intolerant in deed, we shall not be tolerant.
