Keynes vs. Samuelson on models

31 Jan, 2016 at 14:44 | Posted in Economics | 1 Comment

To his credit Keynes was not, in contrast to Samuelson, a formalist who was committed to mathematical economics. Keynes wanted models, but for him, building them required ‘a vigilant observation of the actual working of our system.’ Indeed, ‘to convert a model into a quantitative formula is to destroy its usefulness as an instrument of thought.’ That conclusion can be strongly endorsed!
 

Economics — still in the land of Mordor

30 Jan, 2016 at 12:00 | Posted in Economics | Comments Off on Economics — still in the land of Mordor

When it comes to my economics training, I’m a late bloomer. My primary training is in evolutionary theory, which I have used as a navigational guide to study many human-related topics, such as religion. But I didn’t tackle economics until 2008 …

At the time I had no way to answer this question. Economic jargon mystified me—an embarrassing confession, since I am fully at home with mathematical and computer simulation models. Economists were very smart, very powerful, and they spoke a language that I didn’t understand. They won Nobel Prizes.

Nevertheless, I had faith that evolution could say something important about the regulatory systems that economists preside over, even if I did not yet know the details …

Fortunately, I had a Fellowship of the Ring to rely upon … Some of my closest colleagues are highly respected economists, Herbert Gintis, Samuel Bowles, and Ernst Fehr …

I already knew from their work that the main body of modern economics, called neoclassical economics, was being challenged by a new school of thought called experimental and behavioural economics …

I was disappointed. My colleagues such as Herb, Sam, and Ernst confirmed my own impression: They appreciated the relevance of evolution but were a tiny minority among behavioral and experimental economists, who in turn were a tiny minority among neoclassical economists …

The more I learned about economics, the more I discovered a landscape that is surpassingly strange. Like the land of Mordor, it is dominated by a single theoretical edifice that arose like a volcano early in the 20th century and still dominates the landscape. The edifice is based upon a conception of human nature that is profoundly false, defying the dictates of common sense, before we even get to the more refined dictates of psychology and evolutionary theory. Yet, efforts to move the theory in the direction of common sense are stubbornly resisted.

David Sloan Wilson

[h/t Tom Hickey]

Good advice

30 Jan, 2016 at 11:31 | Posted in Varia | Comments Off on Good advice

‘If you really want something, you have to be prepared to work very hard, take advantage of opportunity, and above all — never give up.’
 

[h/t Ulrika Hall]

At the age of thirty-seven

29 Jan, 2016 at 21:34 | Posted in Varia | Comments Off on At the age of thirty-seven

 

Still absolutely breathtakingly great!

LOGIC of science vs. METHODS of science

29 Jan, 2016 at 17:19 | Posted in Theory of Science & Methodology | Comments Off on LOGIC of science vs. METHODS of science

 

Manfred Mann

29 Jan, 2016 at 09:15 | Posted in Varia | Comments Off on Manfred Mann

 

Against multiple regression analysis

28 Jan, 2016 at 18:35 | Posted in Statistics & Econometrics | 2 Comments

Distinguished social psychologist Richard E. Nisbett has a somewhat atypical aversion to multiple regression analysis. In his Intelligence and How to Get It (Norton 2011) he wrote (p. 17):

Researchers often determine the individual’s contemporary IQ or IQ earlier in life, socioeconomic status of the family of origin, living circumstances when the individual was a child, number of siblings, whether the family had a library card, educational attainment of the individual, and other variables, and put all of them into a multiple-regression equation predicting adult socioeconomic status or income or social pathology or whatever. Researchers then report the magnitude of the contribution of each of the variables in the regression equation, net of all the others (that is, holding constant all the others). It always turns out that IQ, net of all the other variables, is important to outcomes. But … the independent variables pose a tangle of causality – with some causing others in goodness-knows-what ways and some being caused by unknown variables that have not even been measured. Higher socioeconomic status of parents is related to educational attainment of the child, but higher-socioeconomic-status parents have higher IQs, and this affects both the genes that the child has and the emphasis that the parents are likely to place on education and the quality of the parenting with respect to encouragement of intellectual skills and so on. So statements such as “IQ accounts for X percent of the variation in occupational attainment” are built on the shakiest of statistical foundations. What nature hath joined together, multiple regressions cannot put asunder.

And now he is back with a half-hour lecture — The Crusade Against Multiple Regression Analysis — posted on The Edge website a week ago (watch the lecture here).

Now, I think that what Nisbett says is right as far as it goes, although his argument would certainly have been strengthened had he elaborated more on the methodological questions surrounding causality, or at least given some mathematical-statistical-econometric references. Unfortunately, his alternative approach is no more convincing than regression analysis. Like so many other contemporary social scientists, Nisbett seems to think that randomization may solve the empirical problem. By randomizing we supposedly get different “populations” that are homogeneous with regard to all variables except the one we think is a genuine cause. In this way we are supposedly relieved of having to know what all these other factors actually are.

If you succeed in performing an ideal randomization with different treatment and control groups, that is indeed attainable. But it presupposes that you really have been able to establish – and not just assume – that all causes other than the putative one have the same probability distribution in the treatment and control groups, and that assignment to the treatment or control group is independent of all other possible causal variables.

Unfortunately, real experiments and real randomizations seldom or never achieve this. So, yes, we may do without knowing all causes, but it takes ideal experiments and ideal randomizations to do that, not real ones.
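A minimal simulation of my own may help to see the point (it is purely illustrative and not taken from Nisbett): in a real, finite randomization, chance alone typically leaves the treatment and control groups unbalanced on unmeasured background causes, and only in the unattainable ideal limit of ever larger randomizations does that imbalance vanish.

```python
import numpy as np

rng = np.random.default_rng(1234)

def covariate_imbalance(n):
    """Randomize n subjects into treatment/control and return the difference
    in group means of an unmeasured background cause."""
    background = rng.normal(size=n)        # e.g. an unmeasured confounder
    treated = rng.permutation(n) < n // 2  # a real, finite randomization
    return background[treated].mean() - background[~treated].mean()

for n in (20, 200, 2000, 20_000):
    gaps = [abs(covariate_imbalance(n)) for _ in range(500)]
    print(f"n = {n:>6}: mean |imbalance| in the unmeasured cause ≈ {np.mean(gaps):.3f}")

# Small, real experiments leave sizeable chance imbalances in unmeasured causes;
# only the ideal limit of indefinitely large randomizations drives them to zero.
```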

As I have argued — e.g. here — this means that in practice we have to have sufficient background knowledge to deduce causal knowledge. Without old knowledge we can’t get new knowledge – and, no causes in, no causes out.
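To see concretely how such a causal tangle defeats the ‘net of all the others’ reading, here is a small simulation of my own (all coefficients are made-up illustrations, not estimates from any study): parental SES partly causes the child’s IQ and also an unmeasured ‘quality of schooling/parenting’ variable that itself affects income. A multiple regression of income on IQ and parental SES then loads part of that unmeasured pathway onto the IQ coefficient.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# A made-up causal tangle (all coefficients are illustrative assumptions):
ses_parent = rng.normal(size=n)                                # parental socioeconomic status
iq         = 0.5 * ses_parent + rng.normal(size=n)             # partly caused by parental SES
schooling  = 0.7 * ses_parent + 0.3 * iq + rng.normal(size=n)  # unmeasured in the regression
income     = 0.2 * iq + 0.6 * schooling + rng.normal(size=n)   # true direct effect of IQ = 0.2

# Multiple regression of income on IQ and parental SES, omitting schooling:
X = np.column_stack([np.ones(n), iq, ses_parent])
beta = np.linalg.lstsq(X, income, rcond=None)[0]
print(f"estimated 'effect' of IQ net of parental SES: {beta[1]:.2f} (true direct effect: 0.20)")

# The regression attributes part of the schooling pathway to IQ (roughly 0.38 here),
# so 'IQ accounts for X percent' statements inherit the whole tangle.
```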

Nisbett is well worth reading and listening to, but on the issue of the shortcomings of multiple regression analysis, no one sums it up better than eminent mathematical statistician David Freedman in his Statistical Models and Causal Inference:

If the assumptions of a model are not derived from theory, and if predictions are not tested against reality, then deductions from the model must be quite shaky. However, without the model, the data cannot be used to answer the research question …

In my view, regression models are not a particularly good way of doing empirical work in the social sciences today, because the technique depends on knowledge that we do not have. Investigators who use the technique are not paying adequate attention to the connection – if any – between the models and the phenomena they are studying. Their conclusions may be valid for the computer code they have created, but the claims are hard to transfer from that microcosm to the larger world …

Regression models often seem to be used to compensate for problems in measurement, data collection, and study design. By the time the models are deployed, the scientific position is nearly hopeless. Reliance on models in such cases is Panglossian …

Given the limits to present knowledge, I doubt that models can be rescued by technical fixes. Arguments about the theoretical merit of regression or the asymptotic behavior of specification tests for picking one version of a model over another seem like the arguments about how to build desalination plants with cold fusion as the energy source. The concept may be admirable, the technical details may be fascinating, but thirsty people should look elsewhere …

Causal inference from observational data presents many difficulties, especially when underlying mechanisms are poorly understood. There is a natural desire to substitute intellectual capital for labor, and an equally natural preference for system and rigor over methods that seem more haphazard. These are possible explanations for the current popularity of statistical models.

Indeed, far-reaching claims have been made for the superiority of a quantitative template that depends on modeling – by those who manage to ignore the far-reaching assumptions behind the models. However, the assumptions often turn out to be unsupported by the data. If so, the rigor of advanced quantitative methods is a matter of appearance rather than substance.

The force from above cleaning my soul

28 Jan, 2016 at 09:30 | Posted in Varia | Comments Off on The force from above cleaning my soul

 

Krugman — a Vichy Left coward?

27 Jan, 2016 at 23:53 | Posted in Politics & Society | Comments Off on Krugman — a Vichy Left coward?

Paul Krugman’s recent posts have been most peculiar. Several have looked uncomfortably like special pleading for political figures he likes, notably Hillary Clinton. He has, in my judgement, stooped rather far down in attacking people well below him in the public relations food chain …

Perhaps the most egregious and clearest cut case is his refusal to address the substance of a completely legitimate, well-documented article by David Dayen outing Krugman, and to a lesser degree, his fellow traveler Mike Konczal, in abjectly misrepresenting Sanders’ financial reform proposals …

The Krugman that was early to stand up to the Iraq War, who was incisive before and during the crisis has been very much in absence since Obama took office. It’s hard to understand the loss of intellectual independence. That may not make Krugman any worse than other Democratic party apparatchiks, but he continues to believe he is other than that, and the lashing out at Dayen looks like a wounded denial of his current role. Krugman and Konczal need to be seen as what they are: part of the Vichy Left brand cover for the Democratic party messaging apparatus. Krugman, sadly, has chosen to diminish himself for a not very worthy cause.

Yves Smith/Naked Capitalism

Thatcher policies for dummies

27 Jan, 2016 at 15:47 | Posted in Politics & Society | Comments Off on Thatcher policies for dummies


What is Post Keynesian economics?

27 Jan, 2016 at 11:05 | Posted in Economics | Comments Off on What is Post Keynesian economics?

 

Deduction — induction — abduction

25 Jan, 2016 at 14:42 | Posted in Theory of Science & Methodology | 3 Comments

 


In science – and economics – one could argue that there basically are three kinds of argumentation patterns/schemes/methods/strategies available:

Deduction

Premise 1: All Chicago economists believe in REH
Premise 2: Robert Lucas is a Chicago economist
—————————————————————–
Conclusion: Robert Lucas believes in REH

Here we have an example of a logically valid deductive inference (and, following Quine, whenever logic is used in this essay, ‘logic’ refers to deductive/analytical logic).
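In standard first-order notation the argument is just universal instantiation followed by modus ponens (my own rendering, added only to fix ideas):

$$\forall x\,\bigl(\mathrm{Chicago}(x) \rightarrow \mathrm{REH}(x)\bigr),\quad \mathrm{Chicago}(\mathrm{Lucas}) \;\vdash\; \mathrm{REH}(\mathrm{Lucas})$$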

In a hypothetico-deductive reasoning — hypothetico-deductive confirmation in this case — we would use the conclusion to test the law-like hypothesis in premise 1 (according to the hypothetico-deductive model, a hypothesis is confirmed by evidence if the evidence is deducible from the hypothesis). If Robert Lucas does not believe in REH we have gained some warranted reason for non-acceptance of the hypothesis (an obvious shortcoming here being that further information beyond that given in the explicit premises might have given another conclusion).

The hypothetico-deductive method (if we treat the hypothesis as absolutely certain/true, we rather speak of an axiomatic-deductive method) basically means that we

• Posit a hypothesis
• Infer empirically testable propositions (consequences) from it
• Test the propositions through observation or experiment
• Depending on the test results, find the hypothesis either corroborated or falsified (a schematic sketch of this testing loop follows below).
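A minimal sketch of that testing loop (my own toy example, echoing the Lucas/REH case above rather than anything in the methodological literature):

```python
# Hypothetico-deductive testing, schematically:
# 1. posit a law-like hypothesis H,
# 2. deduce a testable consequence from H,
# 3. confront that consequence with observations,
# 4. report H as corroborated or falsified.

def consequence_holds(economist: dict) -> bool:
    """Deduced consequence of H ('all Chicago economists believe in REH')
    for a single observed economist."""
    return (not economist["chicago"]) or economist["believes_reh"]

# Illustrative observations (made up for the sketch):
observations = [
    {"name": "A", "chicago": True,  "believes_reh": True},
    {"name": "B", "chicago": False, "believes_reh": False},
    {"name": "C", "chicago": True,  "believes_reh": True},
]

if all(consequence_holds(e) for e in observations):
    print("H corroborated by the observations so far (never proved).")
else:
    print("H falsified: at least one counterexample observed.")
```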

However, in science we regularly use a kind of ‘practical’ argumentation where there is little room for applying the restricted logical ‘formal transformations’ view of validity and inference. Most people would probably accept the following argument as ‘valid’ reasoning even though, from a strictly logical point of view, it is non-valid:

Premise 1: Robert Lucas is a Chicago economist
Premise 2: The recorded proportion of Keynesian Chicago economists is zero
————————————————————————–
Conclusion: So, certainly, Robert Lucas is not a Keynesian economist

How come? Well, I guess one reason is that in science, contrary to what you find in most logic textbooks, not very many arguments are settled by showing that ‘All Xs are Ys.’ In scientific practice we instead present other-than-analytical explicit warrants and backings — data, experience, evidence, theories, models — for our inferences. As long as we can show that our ‘deductions’ or ‘inferences’ are justifiable and have well-backed warrants, our colleagues listen to us. That our scientific ‘deductions’ or ‘inferences’ are logical non-entailments simply is not a problem. To think otherwise is to commit the fallacy of misapplying formal-analytical logic categories to areas where they are pretty much irrelevant or simply beside the point.
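Put schematically in probabilistic terms (my own rendering of the warrant at work here), the inference is statistical rather than logical:

$$\Pr(\text{Keynesian}\mid\text{Chicago}) \approx \frac{\#\{\text{observed Keynesian Chicago economists}\}}{\#\{\text{observed Chicago economists}\}} = 0 \;\Longrightarrow\; \text{Lucas is almost certainly not a Keynesian},$$

a conclusion the evidence makes highly credible without logically entailing it.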

Scientific arguments are not analytical arguments, where validity is solely a question of formal properties. Scientific arguments are substantial arguments. Whether Robert Lucas is a Keynesian or not is not something we can decide on the formal properties of statements/propositions. We have to check what the guy has actually been writing and saying to see whether the hypothesis that he is a Keynesian is true or not.

In a deductive-nomological explanation — also known as a covering-law explanation — we would try to explain why Robert Lucas believes in REH with the help of the two premises (in this case actually giving an explanation with very little explanatory value). These kinds of explanations — both in their deterministic and statistical/probabilistic versions — rely heavily on deductive entailment from premises assumed to be true. But they have precious little to say about where these assumed-to-be-true premises come from.

The deductive logic of confirmation and explanation may work well — given that it is used in deterministic, closed models. In mathematics, the deductive-axiomatic method has worked just fine. But science is not mathematics, and conflating those two domains of knowledge has been one of the most fundamental mistakes made in the science of economics. Applied to real-world systems, the method immediately proves to be excessively narrow and hopelessly irrelevant. Both the confirmatory and the explanatory ilk of hypothetico-deductive reasoning fail, since there is no way you can relevantly analyze confirmation or explanation as a purely logical relation between hypothesis and evidence, or between law-like rules and explananda. In science we argue and try to substantiate our beliefs and hypotheses with reliable evidence — propositional and predicate deductive logic, on the other hand, is not about reliability, but about the validity of the conclusions given that the premises are true.

Deduction — and the inferences that go with it — is an example of ‘explicative reasoning,’ where the conclusions we make are already included in the premises. Deductive inferences are purely analytical, and it is this truth-preserving nature of deduction that makes it different from all other kinds of reasoning. But that is also its limitation, since truth in the deductive context does not refer to a real-world ontology (it only relates propositions as true or false within a formal-logic system), and as an argument scheme deduction is totally non-ampliative — the output of the analysis is nothing other than the input.

‘New Keynesian’ DSGE models

24 Jan, 2016 at 10:28 | Posted in Economics | 2 Comments

In the model [Gali, Smets and Wouters, Unemployment in an Estimated New Keynesian Model (2011)] there is perfect consumption insurance among the members of the household. Because of separability in utility, this implies that consumption is equalized across all workers, whether they are employed or not … Workers who find that they do not have to work are unemployed or out of the labor force, and they have cause to rejoice as a result. Unemployed workers enjoy higher utility than the employed because they receive the same level of consumption, but without having to work.

There is much evidence that in practice unemployment is not the happy experience it is for workers in the model.  For example, Chetty and Looney (2006) and Gruber (1997) find that US households suffer roughly a 10 percent drop in consumption when they lose their job. According to Couch and Placzek (2010), workers displaced through mass layoffs suffer substantial and extended reductions in earnings. Moreover, Oreopoulos, Page and Stevens (2008) present evidence that the children of displaced workers also suffer reduced earnings. Additional evidence that unemployed workers suffer a reduction in utility include the results of direct interviews, as well as findings that unemployed workers experience poor health outcomes. Clark and Oswald (1994), Oswald (1997) and Schimmack, Schupp and Wagner (2008) describe evidence that suggests unemployment has a negative impact on a worker’s self-assessment of well being. Sullivan and von Wachter (2009) report that the mortality rates of high-seniority workers jump 50-100% more than would have been expected otherwise in the year after displacement. Cox and Koo (2006) report a significant positive correlation between male suicide and unemployment in Japan and the United States. For additional evidence that unemployment is associated with poor health outcomes, see Fergusson, Horwood and Lynskey (1997) and Karsten and Moser (2009) …

Suppose the CPS [Current Population Survey] employee encountered one of the people designated as “unemployed” … and asked if she were “available for work”. What would her answer be? She knows with certainty that she will not be employed in the current period. Privately, she is delighted about this because the non-employed enjoy higher utility than the employed … Not only is she happy about not having to work, but the labor union also does not want her to work. From the perspective of the union, her non-employment is a fundamental component of the union’s strategy for promoting the welfare of its membership.

Lawrence J. Christiano

To me these kinds of “New Keynesian” DSGE models, where unemployment is portrayed as bliss, are a sign of a momentous failure to model real-world unemployment. It’s not only adding insult to injury — it’s also sad gibberish that shamelessly tries to whitewash neoliberal economic policies that put people out of work.
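The mechanism behind the quoted result is easy to spell out (a stylized rendering of my own, not the exact utility specification of the Gali-Smets-Wouters model): with perfect consumption insurance every household member consumes the same amount c, and with utility separable in consumption and work the employed differ from the non-employed only by the disutility of working:

$$U_{\text{employed}} = u(c) - v(h), \qquad U_{\text{unemployed}} = u(c), \qquad v(h) > 0 \;\Rightarrow\; U_{\text{unemployed}} > U_{\text{employed}}.$$

The ‘happy unemployment’ Christiano criticizes is thus baked into the model’s assumptions rather than derived from any evidence about actual joblessness.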

All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

Theories are difficult to directly confront with reality. Economists therefore build models of their theories. Those models are representations that are directly examined and manipulated to indirectly say something about the target systems.

But models do not only face theory. They also have to look to the world. Being able to model a “credible” DSGE world — how credible that world is, when it depicts unemployment as a “happy experience” and predicts the wage markup to increase with unemployment, I leave to the reader to decide — a world that somehow could be considered real or similar to the real world, is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still serve our pursuit of truth. But then they cannot be unrealistic or false in just any way: the falsehood or unrealism has to be qualified.

The final court of appeal for macroeconomic models is the real world — and as long as no convincing justification is put forward for how the inferential bridging from models to reality de facto is made, macroeconomic model building is little more than “hand waving” that gives us rather little warrant for making inductive inferences from models to real world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

On the non-neutrality of money

23 Jan, 2016 at 14:29 | Posted in Economics | 1 Comment

Paul Krugman has repeatedly over the years argued that we should continue to use neoclassical hobby horses like IS-LM and Aggregate Supply-Aggregate Demand models. Here’s one example:

So why do AS-AD? … We do want, somewhere along the way, to get across the notion of the self-correcting economy, the notion that in the long run, we may all be dead, but that we also have a tendency to return to full employment via price flexibility. Or to put it differently, you do want somehow to make clear the notion (which even fairly Keynesian guys like me share) that money is neutral in the long run.

I doubt that Keynes would have been impressed by having his theory characterized with catchwords like “tendency to return to full employment” and “money is neutral in the long run.”


One of Keynes’s central tenets — in clear contradistinction to the beliefs of neoclassical economists — is that in monetary economies there is no strong automatic tendency for the economy to move toward full-employment levels.

Money doesn’t matter in neoclassical macroeconomic models. That’s true. But in the real world in which we happen to live, money certainly does matter. Money is not neutral, and money matters in both the short run and the long run:

The theory which I desiderate would deal … with an economy in which money plays a part of its own and affects motives and decisions, and is, in short, one of the operative factors in the situation, so that the course of events cannot be predicted in either the long period or in the short, without a knowledge of the behaviour of money between the first state and the last. And it is this which we ought to mean when we speak of a monetary economy.

J. M. Keynes A monetary theory of production (1933)

Fadime Sahindal

21 Jan, 2016 at 09:02 | Posted in Politics & Society | Comments Off on Fadime Sahindal
To Fadime Sahindal, born 2 April 1975 in Turkey, murdered 21 January 2002 in Sweden

In Sweden we have long uncritically embraced an unspecified and undefined multiculturalism. If by multiculturalism we mean that cultural belonging and identity also carry with them specific moral, ethical and political rights and obligations, we are talking about normative multiculturalism. Accepting normative multiculturalism means also tolerating unacceptable intolerance, since normative multiculturalism implies that the rights of specific cultural groups may come to be given higher standing than the universal human rights of the individual citizen – and thereby indirectly becomes a defence of those groups’ intolerance.

Normative multiculturalism means that individuals are unacceptably reduced to passive members of culture-bearing or identity-bearing groups. But those in our society who show that they do not respect other people’s rights cannot expect us to be tolerant towards them.

Towards those in our society who want to force others to live according to their own religious, cultural or ideological beliefs and taboos, society should be intolerant. Towards those who want to force society to adapt its laws and rules to their own religion’s, culture’s or group’s interpretations, society should be intolerant.


THE DEAD (De döda)

The dead shall not be silent but speak.
Scattered torment shall find its voice,
and when the rats of the cells and the butts of the murderers’ rifles
have turned to ash and age-old dust,
the comet’s parabola and the gamble of the stars
shall still bear witness to those who fell against their wall:
washed in fire but not burned down to embers,
trampled and beaten but without a wound on their bodies,
and eyes that stared in horror shall open in peace,
and the dead shall not be silent but speak.

Of the dead there shall be no silence but speech.
Though maimed and strangled in the cell of power,
glassy-eyed and derided in cynical waiting rooms
where death has pasted up its propaganda of peace,
they shall rest long in the display cases of conscience,
embalmed by truth and washed in fire,
and those who have already fallen shall not be broken,
and whoever begged for mercy in a moment’s forgetfulness
shall rise and bear witness to that which is not broken,
for the dead shall not be silent but speak.

No, the dead shall not be silent but speak.
Those who felt triumph on their neck shall lift their heads,
and those who were choked by smoke shall see clearly,
those who were tormented into madness shall flow like springs,
those who fell to their opposite shall themselves fell,
those who were slain with lead shall slay with fire,
those who were swept away by waves shall themselves become storm.
And the dead shall not be silent but speak.

                                           Erik Lindegren
