K. in memoriam (private)

16 May, 2013 at 05:39 | Posted in Varia | Comments Off on K. in memoriam (private)

Today, exactly twenty years ago, the unimaginable happened.

Some people say time heals all wounds. I know that’s not true. Some wounds never heal. You just learn to live with the scars.

No blogging today.
 

Top Economics Blogs

15 May, 2013 at 19:13 | Posted in Varia | 1 Comment

Objectively, blogs are subjective, so coming up with a list of the top 10, top 15, or top 100 economics blogs is no easy undertaking. Economics bloggers vary widely from individual students and professors sharing their thoughts on current events, new research or the state of the profession to the blogging superstars like Greg Mankiw, Paul Krugman and Tyler Cowen.

Instead of trying to rank the blogs, we are simply going to list some of our favourites. These are the blogs to which we turn when looking for interesting, informative, and offbeat articles to share. All of these blogs provide some insight into the economics profession and we at INOMICS enjoy going through them and sharing the most interesting articles each day with our readers, especially on Twitter.

Aguanomics
Angry Bear
Askblog
Becker-Posner Blog
Cafe Hayek
Calculated Risk
Carpe Diem
Cheap Talk
Confessions of a Supply Side Liberal
Conversable Economist
Core Economics
Curious Cat
Don’t worry, I’m an economist
Econbrowser
EconLog
Econometrics Beat
Economic Incentives
Economic Logic
Economist’s View
Economists Do It With Models
Economix
Economonitor
Econospeak
Ed Dolan’s Econ Blog
Evolving Economics
Ezra Klein’s Wonkblog
Felix Salmon
Freakonomics
Greg Mankiw’s Blog
Lars P. Syll
Macro and Other Market Musings
Mainly Macro
Marginal Revolution
Market Design
Modeled Behavior
Naked Capitalism
NEP-HIS Blog
New Economic Perspectives
Noahpinion
Overcoming Bias
Paul Krugman
Real Time Economics
Real World Economics Review
Steve Keen’s Debtwatch
The Market Monetarist
Thoughts on Economics
Tim Harford
Vox EU
Worthwhile Canadian Initiative

 

INOMICS BLOG

The Grossman-Stiglitz paradox

15 May, 2013 at 13:59 | Posted in Economics, Theory of Science & Methodology | Comments Off on The Grossman-Stiglitz paradox

In the economics of information, Hayek’s ”The Use of Knowledge in Society” (American Economic Review 1945) and Grossman & Stiglitz’s ”On the Impossibility of Informationally Efficient Markets” (American Economic Review 1980) are two classics. But while Hayek’s article is often invoked by economists of New Austrian persuasion, neoclassical economists rarely have anything to say about Grossman & Stiglitz’s article. I do not think that is a coincidence.

One of the most crucial assumptions made in orthodox economic theory is that economic agents costlessly possess complete information. This is an assumption that heterodox economists have long questioned.

Neoclassical economists are of course aware that the assumption of perfect information is unrealistic in most contexts. They nevertheless defend its use in their formal models by claiming that real economies, where information is not quite so perfect, do not differ in any decisive way from their models. What the economics of information has shown, however, is that the results obtained in models built on perfect information are not robust. Even a small degree of informational imperfection has a decisive impact on the equilibrium of the economy. This had, admittedly, been demonstrated earlier, for example by transaction cost theory. Its representatives, however, rather drew the conclusion that as long as these costs are taken into account in the analysis, the standard results remain largely intact. The economics of information shows convincingly that this is not the case. In field after field it has been shown that economic analysis becomes gravely misleading if one disregards asymmetries of information and the costs of obtaining it. The picture of markets (and of the need for possible public intervention) becomes substantially different from that in models built on the standard assumption of complete information.

The Grossman-Stiglitz paradox shows that if the market were efficient – if prices fully reflected available information – no agent would have an incentive to acquire the information that prices are supposed to be based on. If, on the other hand, no agent is informed, it would pay for an agent to acquire information. Consequently, market prices cannot incorporate all relevant information about the goods traded on the market. This is, of course, extremely disturbing to (as a rule market-apologetic) neoclassical economists. Hence the ”silence” surrounding the article and its paradox!

Grossman & Stiglitz – just as Frydman & Goldberg later do in Imperfect Knowledge Economics (Princeton University Press 2007) – take as their point of departure, and relate to, Lucas et consortes. Their assessment of the information paradigm on which rational expectations is built also, as far as I can judge, coincides completely with my own. From the point of view of relevance and realism it is nonsense on stilts. The Grossman-Stiglitz paradox is as powerful as an axe-blow at the neoclassical root. That is why neoclassical economists so readily ”forget” it.

Hayek argued – see e.g. chapters 1 and 3 of Kunskap, konkurrens och rättvisa (Ratio 2003) – that markets and their price mechanisms only have a decisive role to play when information is not costless. This was one of the main ingredients in his critique of the idea of central planning, since costless information would in principle make planning and markets equivalent (as so often, the New Austrian picture of markets is considerably more relevant and realistic than the various neoclassical Bourbaki constructions à la Debreu et consortes).

The crux of ”efficient markets” is – as Grossman & Stiglitz brilliantly show – that, strictly theoretically, they can only exist when information is costless. When information is not free, prices cannot perfectly reflect the information available (nota bene – this holds whether asymmetries are present or not).
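The logic of the paradox can be sketched numerically. The following toy model is my own illustrative parametrisation, not Grossman & Stiglitz’s actual 1980 specification: it simply assumes the gross gain from being informed falls linearly as the share of informed traders rises, since prices then reveal more of the information.

```python
def gain_from_information(informed_share, max_gain=1.0):
    """Gross trading gain of one informed trader when a fraction
    `informed_share` of the market is already informed (toy assumption:
    the gain falls linearly as prices become more revealing)."""
    return max_gain * (1.0 - informed_share)

def equilibrium_informed_share(cost, max_gain=1.0):
    """Share of informed traders at which the net gain of buying
    information is exactly zero."""
    if cost >= max_gain:          # information too expensive: nobody buys
        return 0.0
    return 1.0 - cost / max_gain

# With any positive information cost the equilibrium share stays below 1,
# so prices can never be fully revealing:
print(equilibrium_informed_share(cost=0.2))      # 0.8
# At full revelation (share = 1.0) an informed trader makes a strict loss:
print(gain_from_information(1.0) - 0.2)          # -0.2
```

At a cost of 0.2 the market settles at 80 per cent informed traders; full revelation is never an equilibrium, which is exactly the paradox.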

The most interesting question arising from Grossman & Stiglitz is, to my mind, what good theories built on ”noise-free” markets are to us, when they are not only hopelessly unrealistic but also turn out to be theoretically inconsistent.

Nonsense is nonsense, even in pretty packages.

Lacrimosa

12 May, 2013 at 19:54 | Posted in Varia | Comments Off on Lacrimosa

 

Van den Budenmayer

12 May, 2013 at 19:37 | Posted in Varia | Comments Off on Van den Budenmayer

 

Though I speak with the tongues of angels

11 May, 2013 at 23:15 | Posted in Varia | Comments Off on Though I speak with the tongues of angels

 

Though I speak with the tongues of angels,
If I have not love…
My words would resound with but a tinkling cymbal.
And though I have the gift of prophecy…
And understand all mysteries…
and all knowledge…
And though I have all faith
So that I could remove mountains,
If I have not love…
I am nothing.
Love is patient, full of goodness;
Love tolerates all things,
Aspires to all things,
Love never dies,
while the prophecies shall be done away,
tongues shall be silenced,
knowledge shall fade…
thus then shall linger only
faith, hope, and love…
but the greatest of these…
is love.

On tour

11 May, 2013 at 18:40 | Posted in Varia | Comments Off on On tour

Touring again. Conference in Stockholm and guest appearance in the parliament. Regular blogging will be resumed next week.


100% Wrong on 90%

10 May, 2013 at 16:42 | Posted in Economics | 1 Comment

 

(h/t Simsalablunder)

Ronald Coase – still making sense at 102 (!)

10 May, 2013 at 13:49 | Posted in Economics | Comments Off on Ronald Coase – still making sense at 102 (!)

In the 20th century, economics consolidated as a profession; economists could afford to write exclusively for one another. At the same time, the field experienced a paradigm shift, gradually identifying itself as a theoretical approach of economization and giving up the real-world economy as its subject matter.

But because it is no longer firmly grounded in systematic empirical investigation of the working of the economy, it is hardly up to the task … Today, a modern market economy with its ever-finer division of labor depends on a constantly expanding network of trade. It requires an intricate web of social institutions to coordinate the working of markets and firms across various boundaries. At a time when the modern economy is becoming increasingly institutions-intensive, the reduction of economics to price theory is troubling enough. It is suicidal for the field to slide into a hard science of choice, ignoring the influences of society, history, culture, and politics on the working of the economy.

Ronald Coase, Saving Economics from the Economists

Reinhart-Rogoff and the EU’s rule-based austerity

10 May, 2013 at 10:44 | Posted in Economics, Politics & Society | 1 Comment

It is usually called rule-based policy (‘normpolitik’): the notion that there are rules of thumb that help finance ministers and central bank governors make the right decisions. Economics is, after all, too difficult and complicated for ordinary politicians, not to mention ignorant citizens. Better, then, to let rules automatically make the right, the necessary, decisions. Rules are moreover insensitive to public opinion, unlike politicians, who all too often try to honour the election promises they have made to their voters.

I am joking, but the subject is serious: over the past twenty years, norms and rules have come to dominate the economic policy actually pursued. One of the clearest signs is the EU’s Maastricht Treaty of 1992, which laid the foundation for the common currency, the euro. To be admitted into the euro, countries must fulfil so-called convergence criteria: inflation must be below 2 per cent, the budget deficit no larger than 3 per cent of GDP, and government debt must not exceed 60 per cent of GDP. Note that nothing is said about how high unemployment may be …

It is in that context that one can understand the attention given to a short article by the economists Carmen Reinhart and Kenneth Rogoff – ”Growth in a Time of Debt”, published in the American Economic Review in 2010. Reinhart and Rogoff claim that the limit for responsible policy lies at a historically established ”debt threshold” of 90 per cent of GDP …

A few weeks ago, however, a review by Thomas Herndon at the University of Massachusetts Amherst revealed that Reinhart and Rogoff’s ”debt threshold” rests on two errors: first, they forgot to include several countries from their own database when summarizing their statistics – a mistake the authors have admitted – and second, they counted every expansion or downturn equally regardless of how long it lasted … This biases the whole result in a conservative direction, towards lower debt and more austerity …
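The second error is easy to see with a stylised example (the numbers below are made up purely for illustration; they are not Reinhart and Rogoff’s actual data). Averaging within each country first, regardless of how many years it spent above the debt threshold, lets one bad year outweigh many good ones:

```python
# Hypothetical growth episodes above the 90 per cent debt threshold:
episodes = {
    "Country A": [-2.0],          # one bad year above the threshold
    "Country B": [3.0] * 19,      # nineteen good years above it
}

# R&R-style: average within each country, then across countries, so each
# country counts equally regardless of episode length.
country_means = [sum(g) / len(g) for g in episodes.values()]
unweighted = sum(country_means) / len(country_means)

# Alternative: weight by the number of years in each episode.
all_years = [g for years in episodes.values() for g in years]
weighted = sum(all_years) / len(all_years)

print(unweighted)   # 0.5  -- one bad year counts as much as nineteen good
print(weighted)     # 2.75
```

The same twenty years of data yield average growth of 0.5 per cent under the equal-country weighting but 2.75 per cent when every year counts equally.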

The serious thing about Reinhart and Rogoff’s flawed article is not that it would lead to immediate draconian austerity across the board.

No, what is truly worrying about their now debunked number-crunching is that it legitimizes rule-based policy, as if it were actually possible to establish once and for all what constitutes a reasonable level of debt or inflation, for all countries, for all times. Reinhart and Rogoff thereby lend support to all those who want to reduce the room for manoeuvre of elected assemblies and elected politicians. The norms and rules of thumb thus turn out to be highly political, not objective, scientifically established truths.

That is something for the crisis-ridden euro countries to bear in mind.

Kenneth Hermele

Bob Pollin responds to Reinhart and Rogoff

9 May, 2013 at 09:27 | Posted in Economics | 1 Comment

 

(h/t Jan Milch)

Niall Ferguson – the peevish hole digger

8 May, 2013 at 22:27 | Posted in Varia | 2 Comments

You would think that a Harvard historian would know about the First Law of Holes: When in a hole, stop digging.

But Harvard historian Niall Ferguson dug his own hole of trouble a bit deeper, in “An Open Letter To The Harvard Community” posted at the Harvard Crimson’s website on Tuesday. In the letter, Ferguson apologizes profusely for recent dumb statements he made about the legendary economist John Maynard Keynes. In the process, Ferguson makes several more dumb statements.

In case you missed it, Ferguson last week declared that Keynes’ homosexuality had left him childless, making Keynes care nothing about the future and leading him to suggest that governments should spend their way out of economic downturns, which is why he is history’s greatest monster. Suck it, logic! At last conservatives had a Unified Theory Of Gay to explain all that has gone wrong with the world for the past 80 years or so.

Of course, most oxygen-breathing creatures immediately recoiled at the 100 or so varieties of stupid in Ferguson’s statement and reacted with fury and scorn. Like Ron Burgundy after he jumped into the Kodiak bear pit to save Veronica Corningstone, Ferguson immediately regretted his decision. In a statement on his website on Saturday, he offered an “Unqualified Apology,” admitting his comments were “doubly stupid” — not only do childless people care about the future, but Keynes’s wife had suffered a miscarriage, he pointed out. I would add that gay people can also have children, which makes Ferguson’s comments at least trebly stupid. But anyway, Ferguson’s apology was indeed appropriately unqualified.

But he just couldn’t shut up about it. He seems to have been baited into commenting further after Berkeley economist Brad DeLong and others noted that Ferguson had previously commented on Keynes’ sexuality, back in 1995. Ferguson’s “Open Letter” now addresses those claims. While purporting to be an apology, it is not unqualified at all. Instead, it turns into an exercise in peevishness and self-defensiveness.

Mark Gongloff

Niall Ferguson’s apology is an epic fail

8 May, 2013 at 13:21 | Posted in Varia | 17 Comments

Ferguson’s “unreserved” apology is nothing of the kind. He does not apologize for his past efforts to smear Keynes. He tries to make it appear that the latest smear was a one-off, unthinking quip. It was neither. He apologizes for being “insensitive.” What could that mean in this context where he is supposedly agreeing that what he said was false – not true but “insensitive?” Ferguson simply made up the part about Keynes and “poetry.” Ferguson’s spreading of homophobic tropes isn’t “insensitive” in this context – it’s false and it is nasty.

Ferguson apologizes for forgetting that Keynes’ wife suffered a miscarriage. But what is the relevance of that fact to Ferguson’s smear or apology? Is he saying that the pregnancy falsifies his implicit smear that Keynes wasn’t “man enough” to have sex with a woman? Did he think gay or bisexual males were sterile or impotent? Why did he emphasize his claim that Keynes married “a ballerina?”

Why didn’t Ferguson apologize for his substantive misstatements? As a historian who has read Keynes he knows that Keynes’ quip about “in the long run we are all dead” had absolutely nothing to do with claiming that the longer-term health of the economy was unimportant or a matter in which Keynes was uninterested.

William K. Black

Added 8/5: And as if this wasn’t enough, Ferguson now has an article in The Harvard Crimson where he accuses his critics of being “among the most insidious enemies of academic freedom.” Read it yourself, but it’s in my view even more pathetic than his original statements. Who can take this guy seriously anymore? I for one certainly can’t.

Modern econometrics – a critical realist critique (wonkish)

7 May, 2013 at 14:47 | Posted in Statistics & Econometrics | 2 Comments

Neoclassical economists often hold the view that criticisms of econometrics are the conclusions of sadly misinformed and misguided people who dislike and do not understand much of it. This is really a gross misapprehension. To be careful and cautious is not the same as to dislike. And as any perusal of the mathematical-statistical and philosophical works of people like for example Nancy Cartwright, Chris Chatfield, Hugo Keuzenkamp, John Maynard Keynes or Tony Lawson would show, the critique is put forward by respected authorities. I would argue, against “common knowledge”, that they do not misunderstand the crucial issues at stake in the development of econometrics. Quite the contrary. They know them all too well – and are not satisfied with the validity and philosophical underpinning of the assumptions made for applying its methods.

Let me try to do justice to the critical arguments on the logic of probabilistic induction and shortly elaborate – mostly from a philosophy of science vantage point – on some insights critical realism gives us on econometrics and its methodological foundations.

The methodological difference between an empiricist and a deductivist approach can also clearly be seen in econometrics. The ordinary deductivist “textbook approach” views the modeling process foremost as an estimation problem, since one (at least implicitly) assumes that the model provided by economic theory is a well-specified and “true” model. The more empiricist general-to-specific methodology (often identified as “the LSE approach”), on the other hand, views models as theoretically and empirically adequate representations (approximations) of a data generating process (DGP). Diagnostic tests (mostly some variant of the F-test) are used to ensure that the models are “true” – or at least “congruent” – representations of the DGP. The modeling process is here seen more as a specification problem, where poor diagnostic results may indicate a possible misspecification requiring re-specification of the model. The objective is standardly to identify models that are structurally stable and valid across a large time-space horizon. The DGP is not seen as something we already know, but rather as something we discover in the process of modeling it. Considerable effort is put into testing to what extent the models are structurally stable and generalizable over space and time.
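The idea behind such diagnostic testing can be sketched with a deliberately misspecified model (a minimal illustration of the general principle only, not the LSE methodology’s actual test battery): fit a linear model to data generated by a quadratic DGP and check whether the residuals are systematically related to the regressors.

```python
import random

random.seed(0)

# True DGP is quadratic; the fitted model is (wrongly) linear.
xs = [random.uniform(-2.0, 2.0) for _ in range(500)]
ys = [x + 0.5 * x * x + random.gauss(0.0, 0.2) for x in xs]

# Ordinary least-squares line y = a + b*x
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
a = mean_y - b * mean_x
residuals = [y - (a + b * x) for x, y in zip(xs, ys)]

def corr(u, v):
    """Sample correlation coefficient."""
    m = len(u)
    mu, mv = sum(u) / m, sum(v) / m
    cov = sum((p - mu) * (q - mv) for p, q in zip(u, v))
    su = sum((p - mu) ** 2 for p in u) ** 0.5
    sv = sum((q - mv) ** 2 for q in v) ** 0.5
    return cov / (su * sv)

# Diagnostic: residuals should be unrelated to x**2 if the linear model
# were a congruent representation of the DGP. Here they are strongly
# correlated with it, flagging the misspecification.
print(corr(residuals, [x * x for x in xs]))
```

A large correlation here is the cue for re-specifying the model – the general-to-specific loop in miniature.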

Although I have sympathy for this approach in general, there are still some unsolved “problematics” in its epistemological and ontological presuppositions. There is, e.g., an implicit assumption that the DGP fundamentally has an invariant property, and that models that are structurally unstable just have not been able to get hold of that invariance. But, as Keynes already maintained, one cannot just presuppose or take for granted that kind of invariance. It has to be argued for and justified. Grounds have to be given for viewing reality as satisfying conditions of model-closure. It is as if the lack of closure that shows up in the form of structurally unstable models could somehow be solved by searching for more autonomous and invariable “atomic uniformity”. But whether reality is “congruent” with this analytical prerequisite has to be argued for, not simply taken for granted.

Even granted that closures come in degrees, we should not compromise on ontology. Some methods simply introduce improper closures, closures that make the disjuncture between models and real world target systems inappropriately large. “Garbage in, garbage out.”

Underlying the search for these immutable “fundamentals” lies the implicit view of the world as consisting of material entities with their own separate and invariable effects. These entities are thought of as being treatable as separate and additive causes, thereby making it possible to infer complex interaction from knowledge of individual constituents with limited independent variety. But, again, whether this is a justified analytical procedure cannot be answered without confronting it with the nature of the objects the models are supposed to describe, explain or predict. Keynes himself thought it generally inappropriate to apply the “atomic hypothesis” to such an open and “organic entity” as the real world. As far as I can see these are still appropriate strictures that all econometric approaches have to face. Grounds for believing otherwise have to be provided by the econometricians.

Trygve Haavelmo, the “father” of modern probabilistic econometrics, wrote that he and other econometricians could not “build a complete bridge between our models and reality” by logical operations alone, but finally had to make “a non-logical jump” [1943:15]. Part of that jump consisted in the fact that econometricians “like to believe … that the various a priori possible sequences would somehow cluster around some typical time shapes, which if we knew them, could be used for prediction” [1943:16]. But since we do not know the true distribution, one has to look for the mechanisms (processes) that “might rule the data” and that hopefully persist so that predictions may be made. Of the possible hypotheses about different time sequences (“samples” in Haavelmo’s somewhat idiosyncratic vocabulary), most had to be ruled out a priori “by economic theory”, although “one shall always remain in doubt as to the possibility of some … outside hypothesis being the true one” [1943:18].

To Haavelmo and his modern followers, econometrics is not really in the truth business. The explanations we can give of economic relations and structures based on econometric models are “not hidden truths to be discovered” but rather our own “artificial inventions”. Models are consequently perceived not as true representations of DGP, but rather instrumentally conceived “as if”-constructs. Their “intrinsic closure” is realized by searching for parameters showing “a great degree of invariance” or relative autonomy and the “extrinsic closure” by hoping that the “practically decisive” explanatory variables are relatively few, so that one may proceed “as if … natural limitations of the number of relevant factors exist” [Haavelmo 1944:29].

Haavelmo seems to believe that persistence and autonomy can only be found at the level of the individual, since individual agents are seen as the ultimate determinants of the variables in the economic system.

But why the “logically conceivable” really should turn out to be the case is difficult to see. At least if we are not satisfied with sheer hope. As we have already noted, Keynes reacted against unargued-for and unjustified assumptions that complex structures in an open system are reducible to those of individuals. In real economies it is unlikely that we find many “autonomous” relations and events. And one could of course, with Keynes and from a critical realist point of view, also raise the objection that invoking a probabilistic approach to econometrics presupposes, e.g., that we are able to describe the world in terms of risk rather than genuine uncertainty.

And that is exactly what Haavelmo [1944:48] does: “To make this a rational problem of statistical inference we have to start out by an axiom, postulating that every set of observable variables has associated with it one particular ‘true’, but unknown, probability law.”

But using this “trick of our own” and just assigning “a certain probability law to a system of observable variables” can no more build a firm bridge between model and reality than hoping can. Treating phenomena as if they essentially were stochastic processes is not the same as showing that they essentially are stochastic processes. As Hicks [1979:120-21] so neatly puts it:

Things become more difficult when we turn to time-series … The econometrist, who works in that field, may claim that he is not treading on very shaky ground. But if one asks him what he is really doing, he will not find it easy, even here, to give a convincing answer … [H]e must be treating the observations known to him as a sample of a larger “population”; but what population? … I am bold enough to conclude, from these considerations that the usefulness of “statistical” or “stochastic” methods in economics is a good deal less than is now conventionally supposed. We have no business to turn to them automatically; we should always ask ourselves, before we apply them, whether they are appropriate to the problem in hand.

And as if this wasn’t enough, one could also seriously wonder what kind of “populations” these statistical and econometric models are ultimately based on. Why should we as social scientists – and not as pure mathematicians working with formal-axiomatic systems without the urge to confront our models with real target systems – unquestioningly accept Haavelmo’s “infinite population”, Fisher’s “hypothetical infinite population”, von Mises’s “collective” or Gibbs’s “ensemble”?

Of course one could treat our observational or experimental data as random samples from real populations. I have no problem with that. But modern (probabilistic) econometrics does not content itself with that kind of population. Instead it creates imaginary populations of “parallel universes” and assumes that our data are random samples from them.

But this is actually nothing but handwaving! And it is inadequate for real science. As David Freedman writes in Statistical Models and Causal Inference:

With this approach, the investigator does not explicitly define a population that could in principle be studied, with unlimited resources of time and money. The investigator merely assumes that such a population exists in some ill-defined sense. And there is a further assumption, that the data set being analyzed can be treated as if it were based on a random sample from the assumed population. These are convenient fictions … Nevertheless, reliance on imaginary populations is widespread. Indeed regression models are commonly used to analyze convenience samples … The rhetoric of imaginary populations is seductive because it seems to free the investigator from the necessity of understanding how data were generated.

Econometricians should know better than to treat random variables, probabilities and expected values as anything other than things that, strictly speaking, pertain only to statistical models. If they want us to take the leap of faith from mathematics into the empirical world when applying the theory, they have to really argue for and justify this leap by showing that those neat mathematical assumptions (which, to make things worse, are often left implicit, e.g. independence and additivity) do not collide with the ugly reality. The set of mathematical assumptions is no validation in itself of the adequacy of the application.

Rigour and elegance in the analysis do not make up for the gap between reality and model. It is the distribution of the phenomena itself, and not its estimation, that ought to be at the centre of the stage. A crucial ingredient of any economic theory that wants to use probabilistic models should be a convincing argument for the view that “there can be no harm in considering economic variables as stochastic variables” [Haavelmo 1943:13]. In most cases no such arguments are given.

Of course you are entitled – like Haavelmo and his modern probabilistic followers – to express a hope “at a metaphysical level” that there are invariant features of reality to uncover, and that these also show up at the empirical level of observations as some kind of regularities.

But is it a justifiable hope? I have serious doubts. The kind of regularities you may hope to find in society are not to be found in the domain of surface phenomena, but rather at the level of causal mechanisms, powers and capacities. Persistence and generality have to be sought at an underlying, deeper level. Most econometricians do not want to visit that playground. They are content with setting up theoretical models that give us correlations and at best “mimic” existing causal properties.

We have to accept that reality has no “correct” representation in an economic or econometric model. There is no such thing as a “true” model that can capture an open, complex and contextual system in a set of equations with parameters stable over space and time, and exhibiting invariant regularities. To just “believe”, “hope” or “assume” that such a model possibly could exist is not enough. It has to be justified in relation to the ontological conditions of social reality.

In contrast to those who want to give up on (fallible, transient and transformable) “truth” as a relation between theory and reality, and content themselves with “truth” as a relation between a model and a probability distribution, I think it is better to really scrutinize whether this latter attitude is feasible. To abandon the quest for truth and replace it with sheer positivism would indeed be a sad fate for econometrics. It is more rewarding to stick to truth as a regulatory ideal and keep on searching for theories and models that in relevant and adequate ways express those parts of reality we want to describe and explain.

Econometrics may be an informative tool for research. But if its practitioners do not investigate and provide a justification for the credibility of the assumptions on which they erect their building, it will not fulfill its tasks. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics will continue to consider its ultimate argument a mixture of rather unhelpful metaphors and metaphysics. Maintaining that economics is a science in the “true knowledge” business, I remain a skeptic of the pretences and aspirations of econometrics. So far, I cannot really see that it has yielded very much in terms of relevant, interesting economic knowledge.

The marginal return on its ever-higher technical sophistication in no way makes up for the lack of serious under-labouring of its deeper philosophical and methodological foundations that Keynes already complained about. The rather one-sided emphasis on usefulness and its concomitant instrumentalist justification cannot hide the fact that neither Haavelmo nor the legions of probabilistic econometricians following in his footsteps give supportive evidence for considering it “fruitful to believe” in the possibility of treating unique economic data as the observable results of random drawings from an imaginary sampling of an imaginary population. After having analyzed some of its ontological and epistemological foundations, I cannot but conclude that econometrics on the whole has not delivered “truth”. And I doubt if that has ever been the intention of its main protagonists.

Our admiration for technical virtuosity should not blind us to the fact that we have to have a more cautious attitude towards probabilistic inference of causality in economic contexts. Science should help us penetrate to “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts” [Keynes 1971-89 vol XVII:427]. We should look out for causal relations, but econometrics can never be more than a starting point in that endeavour, since econometric (statistical) explanations are not explanations in terms of mechanisms, powers, capacities or causes. Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that there are other variables – of vital importance and, although perhaps unobservable and non-additive, not necessarily epistemologically inaccessible – that were not considered in the model. The variables that were included can hence never be guaranteed to be more than potential causes, not real causes.

A rigorous application of econometric methods in economics really presupposes that the phenomena of our real-world economies are ruled by stable causal relations between variables. A perusal of the leading econom(etr)ic journals shows that most econometricians still concentrate on fixed-parameter models and that parameter values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption, however, one has to convincingly establish that the targeted acting causes are stable and invariant, so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.
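The point about fixed parameters can be made concrete with a small simulation (entirely made-up numbers; a sketch of the problem, not of any particular study): if the DGP’s slope shifts halfway through the sample, the single “fixed parameter” estimated on the whole sample describes neither regime.

```python
import random

random.seed(1)

# A DGP whose "parameter" is not invariant: the slope shifts from 1.0 to
# 3.0 halfway through the sample (a stylised structural break).
def draw(t):
    slope = 1.0 if t < 100 else 3.0
    x = random.gauss(0.0, 1.0)
    return x, slope * x + random.gauss(0.0, 0.5)

data = [draw(t) for t in range(200)]

def ols_slope(pairs):
    """Slope of a least-squares fit through the origin."""
    sxy = sum(x * y for x, y in pairs)
    sxx = sum(x * x for x, y in pairs)
    return sxy / sxx

first = ols_slope(data[:100])    # regime 1: close to 1.0
second = ols_slope(data[100:])   # regime 2: close to 3.0
full = ols_slope(data)           # one "fixed parameter" for everything

# The full-sample estimate lands somewhere between the two regimes and is
# badly wrong for both -- exporting it to either context would mislead.
print(round(first, 2), round(second, 2), round(full, 2))
```

Exporting the pooled estimate to either sub-period, or to a new spatio-temporal context, is exactly the kind of unwarranted bridging the paragraph above describes.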

This is a more fundamental and radical problem than the celebrated “Lucas critique” has suggested. It is not a question of whether deep parameters, absent at the macro level, exist in “tastes” and “technology” at the micro level. It goes deeper. Real-world social systems are not governed by stable causal mechanisms or capacities. This is the criticism that Keynes [1951(1926): 232-33] launched against econometrics and inferential statistics as early as the 1920s:

The atomic hypothesis which has worked so splendidly in Physics breaks down in Psychics. We are faced at every turn with the problems of Organic Unity, of Discreteness, of Discontinuity – the whole is not equal to the sum of the parts, comparisons of quantity fails us, small changes produce large effects, the assumptions of a uniform and homogeneous continuum are not satisfied. Thus the results of Mathematical Psychics turn out to be derivative, not fundamental, indexes, not measurements, first approximations at the best; and fallible indexes, dubious approximations at that, with much doubt added as to what, if anything, they are indexes or approximations of.

The kinds of laws and relations that econom(etr)ics has established are laws and relations about entities in models that presuppose causal mechanisms to be atomistic and additive. When causal mechanisms operate in real-world social target systems, they only do so in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so (as a rule) only because we engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately that also makes most of the achievements of econometrics – like most contemporary endeavours in economic theoretical modeling – rather useless.

 
References

Freedman, David (2010), Statistical Models and Causal Inference. Cambridge: Cambridge University Press.

Haavelmo, Trygve (1943), Statistical testing of business-cycle theories. The Review of Economics and Statistics 25:13-18.

– (1944), The probability approach in econometrics. Supplement to Econometrica 12:1-115.

Hicks, John (1979), Causality in Economics. London: Basil Blackwell.

Keynes, John Maynard (1951 (1926)), Essays in Biography. London: Rupert Hart-Davis.

– (1971-89) The Collected Writings of John Maynard Keynes, vol. I-XXX. D E Moggridge & E A G Robinson (eds), London: Macmillan.

Libertarian paradise

7 May, 2013 at 14:01 | Posted in Politics & Society | 6 Comments

 
