## Kalecki on wage-led growth

13 November, 2018 at 11:33 | Posted in Economics

One of the main features of the capitalist system is the fact that what is to the advantage of a single entrepreneur does not necessarily benefit all entrepreneurs as a class. If one entrepreneur reduces wages he is able ceteris paribus to expand production; but once all entrepreneurs do the same thing — the result will be entirely different.

Let us assume that wages have been in fact generally reduced … and in consequence unemployment vanishes. Has depression thus been overcome? By no means, as the goods produced have still to be sold … A precondition for an equilibrium at this new higher level is that the part of production which is not consumed by workers or by civil servants should be acquired by capitalists for their increased profits; in other words, the capitalists must spend immediately all their additional profits on consumption or investment. It is however most unlikely that this should happen … It is true that increased profitability stimulates investment but this stimulus will not work right away since the entrepreneurs will temporise until they are convinced that higher profitability is going to last … A reduction of wages does not constitute a way out of depression, because the gains are not used immediately by the capitalists for purchase of investment goods.

## Instead of therapy (personal)

12 November, 2018 at 16:00 | Posted in Varia

In every modern person's life there has to be time for breathing space and reflection. And sometimes, when all the possible and impossible musts and demands from those around us simply become too many and too loud, it can be nice to withdraw a little and slow down for a while.

We all have our own ways of doing it. Personally, I like to go to Öppet Arkiv and watch *Gubben i stugan*, Nina Hedenius' wonderfully fine documentary about the life of the retired forestry worker Ragnar in the Finn forests of Dalarna.

Simple. Beautiful. Balm for the soul.

## Causal models and heterogeneity (wonkish)

12 November, 2018 at 13:50 | Posted in Statistics & Econometrics

In *The Book of Why*, Judea Pearl puts forward several weighty reasons why the now so popular causal graph-theoretic approach is preferable to more traditional regression-based explanatory models. One of them is that causal graphs are non-parametric and therefore do not have to assume, for example, additivity and/or the absence of interaction effects: arrows and nodes replace the specifications of the functional relations between the variables of the equations that regression analysis necessarily requires.

But even if Pearl and other adherents of the graph-theoretic approach mostly emphasise the advantages of the flexibility the new tool gives us, there are also clear risks and drawbacks in the use of causal graphs. The lack of clarity about whether additivity, interaction, or other characteristics of the variables and relations are present, and about how they are in that case to be specified, can sometimes create more problems than it solves.

Many of the problems, just as with regression analysis, are bound up with the presence and degree of heterogeneity. Let me take an example from the field of school research to illustrate the problem.

A question recurring in recent years, posed by both politicians and researchers (see e.g. here and here), is whether independent schools ('friskolor') raise the level of knowledge and the test scores of the country's pupils. To answer this (in reality very difficult) causal question, we need knowledge of a host of known, observable variables and background factors (parents' income and education, ethnicity, housing, etc.). On top of that come factors that we know matter but that are unobservable and/or more or less unmeasurable.

The problems begin already when we ask ourselves what hides behind the general term 'independent school.' Not all independent schools are alike (homogeneity); we know that there are often large differences between them (heterogeneity). To lump them all together and try to answer the causal question without taking these differences into account is often pointless and sometimes completely misleading.
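How misleading the lumping-together can be is seen in Simpson's paradox. A small sketch of my own (the two school types and all pass counts are invented, borrowed from the classic kidney-stone illustration): within each type the independent school does better, yet the pooled comparison says the opposite.

```python
# Invented pass counts for two types of independent school ('friskola')
# versus municipal schools. Within EACH type the friskola pass rate is
# higher, but pooling the types reverses the conclusion, because the
# two forms face very different mixes of pupils.
data = {
    "type_1": {"friskola": (81, 87), "municipal": (234, 270)},
    "type_2": {"friskola": (192, 263), "municipal": (55, 80)},
}

def rate(passes, pupils):
    return passes / pupils

pooled = {"friskola": [0, 0], "municipal": [0, 0]}
for kind, schools in data.items():
    for form, (p, n) in schools.items():
        pooled[form][0] += p
        pooled[form][1] += n
    print(kind,
          f"friskola {rate(*schools['friskola']):.0%},",
          f"municipal {rate(*schools['municipal']):.0%}")

print("pooled ",
      f"friskola {rate(*pooled['friskola']):.0%},",
      f"municipal {rate(*pooled['municipal']):.0%}")
```

The reversal comes entirely from the unequal group sizes, which is exactly why answering the causal question on the pooled data can be completely misleading.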

Another problem is that a different kind of heterogeneity, one that concerns the specification of the functional relations, may turn up. Assume that the independent-school effect is connected with, say, ethnicity, and that pupils with a 'Swedish background' perform better than pupils with an 'immigrant background.' This does not necessarily mean that pupils with different ethnic backgrounds are as such affected differently by attending an independent school. The effect may instead derive, for example, from the fact that the alternative municipal schools available to the immigrant pupils were worse than those available to the 'Swedish' pupils. If we do not take these differences in the basis of comparison into account, the estimated independent-school effects become misleading.

Further heterogeneity problems arise if the mechanisms at work in creating the independent-school effect look substantially different for different groups of pupils. Independent schools with a 'focus' on immigrant groups may, for instance, be more aware of the need to support these pupils and take compensatory measures to counteract prejudice and the like. On top of the effects of the (presumably) better teaching at independent schools in general, the effect for this category of pupils is then also an effect of the heterogeneity just described, and will consequently not coincide with the effect for the other group of pupils.

Unfortunately, the problems do not end here. We are also confronted with a hard-to-solve and often overlooked selectivity problem. When we try to answer the causal question about the effects of independent schools, a common procedure in regression analyses is to 'hold constant' or 'control for' factors of influence other than those we are primarily interested in. In the case of independent schools, a common control variable is the parents' income or educational background. The logic is that we should thereby be able to simulate an (ideal) situation resembling a randomised experiment as closely as possible, where we only 'compare' (match) pupils whose parents have comparable education or income, and in this way hope to obtain a better measure of the 'pure' independent-school effect. The crux is that within each income and education category there may lurk yet another, sometimes hidden and perhaps unmeasurable, heterogeneity having to do with, for example, attitude and motivation, which makes some pupils tend to choose (select) to attend independent schools because they believe they will perform better there than at municipal schools (in the Swedish school-choice debate, a recurrent argument about segregation effects is that pupils of parents with high 'socio-economic status' have better access to information about the effects of school choice than other pupils). The income or education variable may thus *de facto* 'mask' other factors that can sometimes play a more decisive role. The estimates of the independent-school effect can therefore, again, be misleading, and sometimes even more misleading than if we had not 'held constant' any control variable at all (cf. the 'second-best' theorem in welfare economics)!
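The danger that a control variable can make things worse rather than better can be made concrete with a toy simulation of my own (all coefficients and names are arbitrary): an unobserved 'motivation' factor drives both school choice and results, while 'socio-economic status' strongly predicts choice but has no direct effect on results. Controlling for status then amplifies the bias instead of removing it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
tau = 1.0                                   # true causal effect of X on Y

M = rng.normal(size=n)                      # unobserved motivation
W = rng.normal(size=n)                      # observed 'socio-economic status'
X = 2.0 * W + 1.0 * M + rng.normal(size=n)  # school choice (continuous toy version)
Y = tau * X + 1.0 * M + rng.normal(size=n)  # test result

def ols(y, *regressors):
    """Least-squares coefficients with an intercept prepended."""
    Z = np.column_stack([np.ones(len(y)), *regressors])
    return np.linalg.lstsq(Z, y, rcond=None)[0]

b_raw = ols(Y, X)[1]     # no controls: biased upward by M
b_adj = ols(Y, X, W)[1]  # 'holding W constant': the bias is AMPLIFIED

print(f"true effect {tau}, unadjusted {b_raw:.2f}, adjusted {b_adj:.2f}")
```

Because W predicts X but not Y directly, conditioning on it strips away the 'harmless' variation in X and leaves the motivation-driven variation to dominate, which is exactly the selection mechanism described above.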

'Controlling' for possible confounders is thus not always the obviously right way to go. If the very relation between independent school (X) and test results (Y) is affected by the introduction of the control variable 'socio-economic status' (W), this is probably the result of some kind of association between X and W. It also means that we do not have an ideal 'experiment simulation,' since there obviously are factors that affect Y and that are not randomly distributed (randomised). Before we can proceed, we must therefore ask *why* the association in question exists. To be able to causally explain the relation between X and Y, we need to know more about how W affects the choice of X. Among other things, we may then find that the choice of X differs between different parts of the group with high 'socio-economic status' W. Without knowledge of this selection mechanism, we cannot reliably measure the effect of X on Y; the randomised explanatory model is simply not applicable. Without knowledge of why there is an association between X and W, and of what it looks like, 'controlling' does not help us, since it does not allow for the selection mechanism at work.

Beyond the problems touched upon here, there are other, long-familiar problems. The so-called context or group effect (for a pupil attending an independent school, the results may partly be an effect of her schoolmates having a similar background, so that she in some sense benefits from her surroundings in a way she would not at a municipal school) again means that eliminating confounders via control variables does not obviously work when there is an association between the control variable and unmeasurable or hard-to-measure unobservable attributes that themselves affect the dependent variable. In our school example, we may assume that parents of a given socio-economic status who send their children to independent schools differ from parents in the same group who choose to let their children attend municipal school. The control variables, once again, do not work as full substitutes for the randomised assignment of a real experiment.

Am I right in thinking that the method of multiple correlation analysis essentially depends on the economist having furnished, not merely a list of the significant causes, which is correct so far as it goes, but a complete list? For example, suppose three factors are taken into account, it is not enough that these should be in fact vera causa; there must be no other significant factor. If there is a further factor, not taken account of, then the method is not able to discover the relative quantitative importance of the first three. If so, this means that the method is only applicable where the economist is able to provide beforehand a correct and indubitably complete analysis of the significant factors. The method is one neither of discovery nor of criticism. It is a means of giving quantitative precision to what, in qualitative terms, we know already as the result of a complete theoretical analysis …

So what conclusion do we draw from all this? Causality *is* hard, and despite the critique we should of course not throw the baby out with the bathwater. But a healthy scepticism and caution when assessing and evaluating the ability of statistical methods, whether causal graph theory or more traditional regression analysis, to really establish causal relations is definitely to be recommended.

## Truth and probability

11 November, 2018 at 13:05 | Posted in Theory of Science & Methodology

Truth exists, and so does uncertainty. Uncertainty acknowledges the existence of an underlying truth: you cannot be uncertain of nothing: nothing is the complete absence of anything. You are uncertain of something, and if there is some thing, there must be truth. At the very least, it is that this thing exists. Probability, which is the science of uncertainty, therefore aims at truth. Probability presupposes truth; it is a measure or characterization of truth. Probability is not necessarily the quantification of the uncertainty of truth, because not all uncertainty is quantifiable. Probability explains the limitations of our knowledge of truth, it never denies it. Probability is purely epistemological, a matter solely of individual understanding. Probability does not exist in things; it is not a substance. Without truth, there could be no probability.

William Briggs’ approach is — as he acknowledges in the preface of his interesting and thought-provoking book — “closely aligned to Keynes’s.”

Almost a hundred years after John Maynard Keynes wrote his seminal *A Treatise on Probability* (1921), it is still very difficult to find statistics textbooks that seriously try to incorporate his far-reaching and incisive analysis of induction and evidential weight.

The standard view in statistics — and the axiomatic probability theory underlying it — is to a large extent based on the rather simplistic idea that ‘more is better.’ But as Keynes argues – ‘more of the same’ is not what is important when making inductive inferences. It’s rather a question of ‘more but different.’

Variation, not replication, is at the core of induction. Finding that p(x|y) = p(x|y & w) doesn't make w 'irrelevant.' Knowing that the probability is unchanged when w is present gives p(x|y & w) another evidential weight ('weight of argument'). Running 10 replicative experiments does not make you as 'sure' of your inductions as running 10,000 varied experiments, even if the probability values happen to be the same.
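One crude way to make the point concrete (my own illustration, not Keynes's formalism) is a Beta-Bernoulli sketch: the posterior 'probability' after 10 observations and after 10,000 can be exactly the same, while the spread of the posterior, one possible stand-in for evidential weight, differs by more than an order of magnitude.

```python
from math import sqrt

def beta_posterior(successes, trials, a=1.0, b=1.0):
    """Mean and standard deviation of a Beta(a, b) prior updated
    with `successes` out of `trials` Bernoulli observations."""
    a_post = a + successes
    b_post = b + trials - successes
    total = a_post + b_post
    mean = a_post / total
    var = a_post * b_post / (total ** 2 * (total + 1))
    return mean, sqrt(var)

m_small, s_small = beta_posterior(5, 10)
m_big, s_big = beta_posterior(5_000, 10_000)

print(m_small, m_big)  # identical point value: 0.5 in both cases
print(s_small, s_big)  # very different spread, i.e. very different 'weight'
```

The single number 0.5 carries no trace of how much, or how varied, the evidence behind it is, which is precisely what Keynes's 'weight of argument' is meant to capture.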

According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but 'rational expectations.' Keynes rather thinks that we base our expectations on the confidence or 'weight' we put on different events and alternatives. To Keynes, expectations are a question of weighing probabilities by 'degrees of belief,' beliefs that often have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents as modelled by “modern” social sciences. And often we 'simply do not know.' As Keynes writes in the *Treatise*:

The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be [that] the system of the material universe must consist of bodies … such that each of them exercises its own separate, independent, and invariable effect, a change of the total state being compounded of a number of separate changes each of which is solely due to a separate portion of the preceding state … In my judgment, the practical usefulness of those modes of inference … on which the boasted knowledge of modern science depends, can only exist … if the universe of phenomena does in fact present those peculiar characteristics of atomism and limited variety which appears more and more clearly as the ultimate result to which material science is tending.

Science according to Keynes should help us penetrate to “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts.” Models can never be more than a starting point in that endeavour. He further argued that it was inadmissible to project history onto the future. Consequently, we cannot presuppose that what has worked before, will continue to do so in the future. That statistical models can get hold of correlations between different ‘variables’ is not enough. If they cannot get at the causal structure that generated the data, they are not really ‘identified.’

How strange that writers of statistics textbooks, as a rule, do not even touch upon these aspects of scientific methodology that seem to be so fundamental and important for anyone trying to understand how we learn and orient ourselves in an uncertain world. An educated guess on why this is so would be that Keynes's concepts are not possible to squeeze into a single calculable numerical 'probability.' In the quest for quantities one turns a blind eye to qualities and looks the other way – but Keynes's ideas keep creeping out from under the statistics carpet.

It’s high time that statistics textbooks give Keynes his due.

## Text och musik med Eric Schüldt

11 November, 2018 at 11:35 | Posted in Varia

In these times, when the airwaves are drowned in commercial radio's opinionated verbal diarrhoea and utterly vacuous, adolescent drivel, one has almost given up.

But there is light in the darkness.

In the programme **Text och musik med Eric Schüldt**, broadcast on Sunday mornings on P2 between 11 and 12, you can listen to serious music and to a host who has something to say and does not just let his jaw flap. Hearing someone with intelligence and feeling talk about the things we all carry deep inside our souls, but almost never dare to speak about, is balm for the soul.

In today's programme one could, among other things, hear this endlessly beautiful piece of music by Eleni Karaindrou:

Thank you, Eric, for a fantastic programme that on its own is worth every krona spent on the radio and TV licence!

## Robert Gordon said it all 40 years ago!

10 November, 2018 at 18:01 | Posted in Economics

What is science? One brief definition runs: “A systematic knowledge of the physical or material world.” Most definitions emphasize the two elements in this definition: (1) “systematic knowledge” about (2) the real world. Without pushing this definitional question to its metaphysical limits, I merely want to suggest that if economics is to be a science, it must not only develop analytical tools but must also apply them to a world that is now observable or that can be made observable through improved methods of observation and measurement. Or in the words of the Hungarian mathematical economist Janos Kornai, “In the real sciences, the criterion is not whether the proposition is logically true and tautologically deducible from earlier assumptions. The criterion of ‘truth’ is, whether or not the proposition corresponds to reality” …

One of our most distinguished historians of economic thought, George Stigler, has stated that: “The dominant influence upon the working range of economic theorists is the set of internal values and pressures of the discipline. The subjects of study are posed by the unfolding course of scientific developments.” He goes on to add: “This is not to say that the environment is without influence …” But, he continues, “whether a fact or development is significant depends primarily on its relevance to current economic theory.” What a curious relating of rigor to relevance! Whether the real world matters depends presumably on “its relevance to current economic theory.” Many if not most of today’s economic theorists seem to agree with this ordering of priorities …

Today, rigor competes with relevance in macroeconomic and monetary theory, and in some lines of development macro and monetary theorists, like many of their colleagues in micro theory, seem to consider relevance to be more or less irrelevant … The theoretical analysis in much of this literature rests on assumptions that also fly in the face of the facts … Another related recent development in which theory proceeds with impeccable logic from unrealistic assumptions to conclusions that contradict the historical record, is the recent work on rational expectations …

I have scolded economists for what I think are the sins that too many of them commit, and I have tried to point the way to at least partial redemption. This road to salvation will not be an easy one for those who have been seduced by the siren of mathematical elegance or those who all too often seek to test unrealistic models without much regard for the quality or relevance of the data they feed into their equations. But let us all continue to worship at the altar of science. I ask only that our credo be: “relevance with as much rigor as possible,” and not “rigor regardless of relevance.” And let us not be afraid to ask — and to try to answer — the really big questions.

## Good thinking — the thing statistics cannot replace

10 November, 2018 at 16:06 | Posted in Statistics & Econometrics

As social researchers, we should never equate science with mathematics and statistical calculation. All science entails human judgement, and using mathematical and statistical models doesn't relieve us of that necessity. They are no substitutes for thinking and doing real science.

Statistical — and econometric — patterns should never be seen as anything other than possible clues to follow. Behind observable data there are real structures and mechanisms operating, things that are — if we really want to understand, explain and (possibly) predict things in the real world — more important to get hold of than simply correlating and regressing observable variables.

Statistics cannot establish the truth value of a fact. Never has. Never will.

## Scanian for beginners …

10 November, 2018 at 14:53 | Posted in Varia

My own favourites are definitely 'fubbick' and 'ålahue': earthy expressions that can profitably be deployed when confronted with opinionated imbeciles and pretentious aesthetes …

[PS If you know what 'tröskemada' means, do let me know. DS]

## Take a hard look at the skeletons in the mainstream closet!

9 November, 2018 at 15:32 | Posted in Economics

Although prepared to admit that our empirical research procedures may be based on some very shaky assumptions, [some thoughtful scholars see] no point in saying much about this unless superior alternatives are presented. I understand this concern … Nevertheless, a hard look at the skeletons in the closet is beneficial, especially when there is a propensity to keep the door locked. Nothing is gained by avoiding that which the discipline must face up to sooner or later. If a current procedure appears to be patently wrong, I have not hesitated to indicate this, even if the alternatives remain to be developed.

Like Stanley Lieberson, those of us in the economics community who are impolite enough to dare to question the preferred methods and models applied in mainstream economics and econometrics are as a rule met with disapproval. But although people seem to get very agitated and upset by the critique, defenders of “received theory” always say that the critique is “nothing new”, that they have always been “well aware” of the problem, “what Syll points out, we all know; there is nothing new in it; the real issue is to find out the alternative,” and so on, and so on.

So, for the benefit of all mindless practitioners of mainstream economics and econometrics — who don't want to be disturbed in their doings — eminent mathematical statistician David Freedman put together a very practical list of vacuous responses to criticism that can be freely used to save your peace of mind:

We know all that. Nothing is perfect … The assumptions are reasonable. The assumptions don’t matter. The assumptions are conservative. You can’t prove the assumptions are wrong. The biases will cancel. We can model the biases. We’re only doing what everybody else does. Now we use more sophisticated techniques. If we don’t do it, someone else will. What would you do? The decision-maker has to be better off with us than without us … The models aren’t totally useless. You have to do the best you can with the data. You have to make assumptions in order to make progress. You have to give the models the benefit of the doubt. Where’s the harm?

## Econometrics: The Keynes-Tinbergen controversy

8 November, 2018 at 13:18 | Posted in Statistics & Econometrics

Mainstream economists often hold the view that Keynes' criticism of econometrics was the result of a sadly misinformed and misguided person who disliked and did not understand much of it.

This is, however, nothing but a gross misapprehension.

To be careful and cautious is not the same as to dislike. Keynes did not misunderstand the crucial issues at stake in the development of econometrics. Quite the contrary. He knew them all too well — and was not satisfied with the validity and philosophical underpinning of the assumptions made for applying its methods.

Keynes’ critique is still valid and unanswered in the sense that the problems he pointed at are still with us today and ‘unsolved.’ Ignoring them — the most common practice among applied econometricians — is not to solve them.

To apply statistical and mathematical methods to the real-world economy, the econometrician has to make some quite strong assumptions. In a review of Tinbergen’s econometric work — published in *The Economic Journal* in 1939 — Keynes gave a comprehensive critique of Tinbergen’s work, focusing on the limiting and unreal character of the assumptions that econometric analyses build on:

**Completeness**: Where Tinbergen attempts to specify and quantify which different factors influence the business cycle, Keynes maintains there has to be a complete list of *all* the relevant factors to avoid misspecification and spurious causal claims. Usually, this problem is ‘solved’ by econometricians assuming that they somehow have a ‘correct’ model specification. Keynes is, to put it mildly, unconvinced:

It will be remembered that the seventy translators of the Septuagint were shut up in seventy separate rooms with the Hebrew text and brought out with them, when they emerged, seventy identical translations. Would the same miracle be vouchsafed if seventy multiple correlators were shut up with the same statistical material? And anyhow, I suppose, if each had a different economist perched on his *a priori*, that would make a difference to the outcome.

**Homogeneity**: To make inductive inferences possible — and to be able to apply econometrics — the system we try to analyse has to have a large degree of 'homogeneity.' According to Keynes most social and economic systems — especially from the perspective of real historical time — lack that 'homogeneity.' As he had argued already in *Treatise on Probability* (ch. 22), it wasn't always possible to take repeated samples from a fixed population when analysing real-world economies. In many cases, there simply are no reasons at all to assume the samples to be homogeneous. Lack of 'homogeneity' makes the principle of 'limited independent variety' non-applicable, and hence makes inductive inference, strictly seen, impossible, since one of its fundamental logical premises is not satisfied. Without "much repetition and uniformity in our experience" there is no justification for placing "great confidence" in our inductions (TP ch. 8).

And then, of course, there is also the 'reverse' variability problem of non-excitation: factors that do not change significantly during the period analysed can still very well be extremely important causal factors.

**Stability:** Tinbergen assumes there is a stable spatio-temporal relationship between the variables his econometric models analyse. But as Keynes had argued already in his *Treatise on Probability*, it was not really possible to make inductive generalisations based on correlations in one sample. As later studies of 'regime shifts' and 'structural breaks' have shown us, it is exceedingly difficult to find and establish the existence of stable econometric parameters for anything but rather short time series.
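A minimal simulation of my own (the slopes, sample sizes and break point are all arbitrary) shows what a 'regime shift' does to parameter estimates: a regression fitted across the break recovers neither regime's true slope.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=2 * n)

# The 'true' slope changes from 0.5 to 2.0 halfway through the sample.
slope = np.concatenate([np.full(n, 0.5), np.full(n, 2.0)])
y = slope * x + rng.normal(scale=0.2, size=2 * n)

def fit_slope(xs, ys):
    return np.polyfit(xs, ys, 1)[0]

b_first = fit_slope(x[:n], y[:n])   # close to 0.5
b_second = fit_slope(x[n:], y[n:])  # close to 2.0
b_pooled = fit_slope(x, y)          # in between: a parameter of neither regime
print(f"{b_first:.2f} {b_second:.2f} {b_pooled:.2f}")
```

The pooled estimate is a stable-looking number describing a 'relationship' that never held in either half of the sample, which is the practical content of Keynes's stability objection.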

**Measurability:** Tinbergen’s model assumes that all relevant factors are measurable. Keynes questions if it is possible to adequately quantify and measure things like expectations and political and psychological factors. And more than anything, he questioned — both on epistemological and ontological grounds — that it was always and everywhere possible to measure real-world uncertainty with the help of probabilistic risk measures. Thinking otherwise can, as Keynes wrote, “only lead to error and delusion.”

**Independence**: Tinbergen assumes that the variables he treats are independent (still a standard assumption in econometrics). Keynes argues that in such a complex, organic and evolutionary system as an economy, independence is a deeply unrealistic assumption to make. Building econometric models on that kind of simplistic and unrealistic assumption risks producing nothing but spurious correlations and causalities. Real-world economies are organic systems for which the statistical methods used in econometrics are ill-suited, or even, strictly seen, inapplicable. Mechanical probabilistic models have little leverage when applied to non-atomic evolving organic systems — such as economies.

It is a great fault of symbolic pseudo-mathematical methods of formalising a system of economic analysis … that they expressly assume strict independence between the factors involved and lose all their cogency and authority if this hypothesis is disallowed; whereas, in ordinary discourse, where we are not blindly manipulating but know all the time what we are doing and what the words mean, we can keep “at the back of our heads” the necessary reserves and qualifications and the adjustments which we shall have to make later on, in a way in which we cannot keep complicated partial differentials “at the back” of several pages of algebra which assume that they all vanish.
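The spurious-correlation risk is easy to reproduce numerically. A sketch of mine, in the spirit of Granger and Newbold's classic result: pairs of completely independent random walks routinely show large correlations, while independent white-noise pairs essentially never do.

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials = 500, 200

def count_high_corr(make_series, threshold=0.5):
    """How many of `trials` independent pairs show |r| above threshold."""
    high = 0
    for _ in range(trials):
        a, b = make_series(), make_series()
        if abs(np.corrcoef(a, b)[0, 1]) > threshold:
            high += 1
    return high

walks = count_high_corr(lambda: np.cumsum(rng.normal(size=n)))  # random walks
noise = count_high_corr(lambda: rng.normal(size=n))             # white noise

print(f"|r| > 0.5 in {walks}/{trials} walk pairs vs {noise}/{trials} noise pairs")
```

Trending, path-dependent series of the kind economies actually produce generate large 'correlations' between variables that have nothing whatsoever to do with each other.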

Building econometric models can’t be a goal in itself. Good econometric models are means that make it possible for us to infer things about the real-world systems they ‘represent.’ If we can’t show that the mechanisms or causes that we isolate and handle in our econometric models are ‘exportable’ to the real world, they are of limited value to our understanding, explanations or predictions of real-world economic systems.

The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be much less simple than the bare principle of uniformity. They appear to assume something much more like what mathematicians call the principle of the superposition of small effects, or, as I prefer to call it, in this connection, the atomic character of natural law. The system of the material universe must consist, if this kind of assumption is warranted, of bodies which we may term (without any implication as to their size being conveyed thereby) legal atoms, such that each of them exercises its own separate, independent, and invariable effect, a change of the total state being compounded of a number of separate changes each of which is solely due to a separate portion of the preceding state … The scientist wishes, in fact, to assume that the occurrence of a phenomenon which has appeared as part of a more complex phenomenon, may be some reason for expecting it to be associated on another occasion with part of the same complex. Yet if different wholes were subject to laws qua wholes and not simply on account of and in proportion to the differences of their parts, knowledge of a part could not lead, it would seem, even to presumptive or probable knowledge as to its association with other parts.

**Linearity:** To make his models tractable, Tinbergen assumes the relationships between the variables he studies to be linear. This is still standard procedure today, but as Keynes writes:

It is a very drastic and usually improbable postulate to suppose that all economic forces are of this character, producing independent changes in the phenomenon under investigation which are directly proportional to the changes in themselves; indeed, it is ridiculous.

To Keynes, it was a ‘fallacy of reification’ to assume that all quantities are additive (an assumption closely linked to independence and linearity).

The unpopularity of the principle of organic unities shows very clearly how great is the danger of the assumption of unproved additive formulas. The fallacy, of which ignorance of organic unity is a particular instance, may perhaps be mathematically represented thus: suppose f(x) is the goodness of x and f(y) is the goodness of y. It is then assumed that the goodness of x and y together is f(x) + f(y) when it is clearly f(x + y) and only in special cases will it be true that f(x + y) = f(x) + f(y). It is plain that it is never legitimate to assume this property in the case of any given function without proof.

J. M. Keynes “Ethics in Relation to Conduct” (1903)
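Keynes's additivity point can be checked in one line. In this sketch the 'goodness' function is an arbitrary concave choice of mine, f(v) = √v; any nonlinear f makes the same point.

```python
from math import sqrt, isclose

f = sqrt            # an arbitrary concave 'goodness' function
x, y = 4.0, 9.0

print(f(x) + f(y))  # 2.0 + 3.0 = 5.0
print(f(x + y))     # sqrt(13), roughly 3.61
```

Only in the special cases Keynes mentions does f(x + y) = f(x) + f(y) hold, so additivity is a property to be demonstrated, never assumed.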

And as even one of the founding fathers of modern econometrics — Trygve Haavelmo — wrote:

What is the use of testing, say, the significance of regression coefficients, when maybe, the whole assumption of the linear regression equation is wrong?
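Haavelmo's worry is straightforward to reproduce (a toy sketch of my own, with an arbitrary nonlinear data-generating process): fit a straight line to curved data and the slope comes out overwhelmingly 'significant', even though the linear specification is simply wrong.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000
x = rng.uniform(0, 4, n)
# Nonlinear data-generating process (arbitrary choice):
y = np.sin(2 * x) + 0.5 * x + rng.normal(scale=0.3, size=n)

# Fit the MISSPECIFIED linear model y = a + b*x by least squares.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
se = np.sqrt(np.sum(resid ** 2) / (n - 2) / np.sum((x - x.mean()) ** 2))
t_stat = beta[1] / se

print(f"slope {beta[1]:.2f}, t-statistic {t_stat:.1f}")
# The slope is wildly 'significant', yet the residuals are systematically
# curved: the straight line is the wrong functional form for these data.
```

A huge t-statistic tells us nothing about whether the linear regression equation itself is the right description, which is exactly Haavelmo's question.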

Real-world social systems are usually not governed by stable causal mechanisms or capacities. The kinds of 'laws' and relations that econometrics has established are laws and relations about entities in models that presuppose causal mechanisms and variables — and the relationship between them — being linear, additive, homogenous, stable, invariant and atomistic. But when causal mechanisms operate in the real world they only do it in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. Since statisticians and econometricians — as far as I can see — haven't been able to convincingly warrant their assumptions of homogeneity, stability, invariance, independence and additivity as being ontologically isomorphic to real-world economic systems, Keynes' critique is still valid. As long as — as Keynes writes in a letter to Frisch in 1935 — "nothing emerges at the end which has not been introduced expressly or tacitly at the beginning," I remain doubtful of the scientific aspirations of econometrics.

In his critique of Tinbergen, Keynes points us to the fundamental logical, epistemological and ontological problems of applying statistical methods to a basically unpredictable, uncertain, complex, unstable, interdependent, and ever-changing social reality. Methods designed to analyse repeated sampling in controlled experiments under fixed conditions are not easily extended to an organic and non-atomistic world where time and history play decisive roles.

Econometric modelling should never be a substitute for thinking. From that perspective, it is really depressing to see how much of Keynes’ critique of the pioneering econometrics in the 1930s-1940s is still relevant today.

The general line you take is interesting and useful. It is, of course, not exactly comparable with mine. I was raising the logical difficulties. You say in effect that, if one was to take these seriously, one would give up the ghost in the first lap, but that the method, used judiciously as an aid to more theoretical enquiries and as a means of suggesting possibilities and probabilities rather than anything else, taken with enough grains of salt and applied with superlative common sense, won’t do much harm. I should quite agree with that. That is how the method ought to be used.

Keynes, letter to E.J. Broster, December 19, 1939

## The biggest clown in American politics ever caught in action

7 November, 2018 at 21:27 | Posted in Varia

## Richard Feynman on mathematics

7 November, 2018 at 20:18 | Posted in Economics, Theory of Science & Methodology

In a comment on one of yours truly’s posts last week, Jorge Buzaglo wrote this truly interesting comment:

Nobel prize winner Richard Feynman on the use of mathematics:

“Mathematicians, or people who have very mathematical minds, are often led astray when “studying” economics because they lose sight of the economics. They say: ‘Look, these equations … are all there is to economics; it is admitted by the economists that there is nothing which is not contained in the equations. The equations are complicated, but after all they are only mathematical equations and if I understand them mathematically inside out, I will understand the economics inside out.’ Only it doesn’t work that way. Mathematicians who study economics with that point of view — and there have been many of them — usually make little contribution to economics and, in fact, little to mathematics. They fail because the actual economic situations in the real world are so complicated that it is necessary to have a much broader understanding of the equations.

“I have replaced the word “physics” (and similar) by the word “economics” (and similar) in this quote from page 2-1 of R. Feynman, R. Leighton and M. Sands, *The Feynman Lectures on Physics*, Volume II, Addison-Wesley, Reading, 1964.”

## Films that really mean something (personal)

6 November, 2018 at 17:05 | Posted in Varia

Films can touch us in many different ways. Most are little more than pure pastime and escapism. But there are also a few films that really mean something. The truly *great* films. The ones that genuinely get under our skin and shake us to our core.

Kjell-Åke Andersson’s film adaptation of Göran Tunström’s masterpiece *Juloratoriet*, with divinely inspired music by Stefan Nilsson, is such a film. With strong resonances with my own life, it is one of the saddest films I know. But perhaps also the most beautiful of all. The one about the infinite strength and power of love.

## Calibration — an economics fraud kit

6 November, 2018 at 09:03 | Posted in Economics

In his well-written and interesting article *The Trouble with Macroeconomics*, Paul Romer launches a frontal attack on the theories that have put macroeconomics on a path of ‘intellectual regress’ for three decades now:

Macroeconomists got comfortable with the idea that fluctuations in macroeconomic aggregates are caused by imaginary shocks, instead of actions that people take, after Kydland and Prescott (1982) launched the real business cycle (RBC) model …

In response to the observation that the shocks are imaginary, a standard defence invokes Milton Friedman’s (1953) methodological assertion from unnamed authority that “the more significant the theory, the more unrealistic the assumptions.” More recently, “all models are false” seems to have become the universal hand-wave for dismissing any fact that does not conform to the model that is the current favourite.

The noncommittal relationship with the truth revealed by these methodological evasions and the “less than totally convinced …” dismissal of fact goes so far beyond post-modern irony that it deserves its own label. I suggest “post-real.”

There are many kinds of useless ‘post-real’ economics held in high regard within the mainstream economics establishment today. Few, if any, are less deserving of that regard than the macroeconomic theory/method — mostly connected with Nobel laureates Finn Kydland, Robert Lucas, Edward Prescott and Thomas Sargent — called calibration.

In physics, it may possibly not be straining credulity too much to model processes as ergodic – where time and history do not really matter – but in social and historical sciences it is obviously ridiculous. If societies and economies were ergodic worlds, why do econometricians fervently discuss things such as structural breaks and regime shifts? That they do is an indication of the unrealism of treating open systems as analysable with ergodic concepts.
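What non-ergodicity means can be sketched with a hypothetical random-walk example (mine, not from the post): in an ergodic process the time average of one long path settles at the ensemble mean, but for a random walk each path's time average is itself random, and the spread across realizations stays wide no matter how long the path is.

```python
# Toy illustration of non-ergodicity: the time average of a single
# random-walk path does not converge to the ensemble mean.
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_steps = 500, 10_000

steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))
paths = steps.cumsum(axis=1)       # 500 independent random walks
time_avgs = paths.mean(axis=1)     # time average along each path

# Ensemble mean of the endpoints hovers near zero, but the time
# averages of individual paths remain widely dispersed: history matters.
print(f"ensemble mean of endpoints: {paths[:, -1].mean():.2f}")
print(f"std of per-path time averages: {time_avgs.std():.1f}")
```

For a process like this, averaging over time and averaging over the ensemble answer different questions, which is precisely why 'ergodic' modelling assumptions matter.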

The future is not reducible to a known set of prospects. It is not like sitting at the roulette table and calculating what the future outcomes of spinning the wheel will be. Reading Lucas, Sargent, Prescott, Kydland and other calibrationists one comes to think of Robert Clower’s apt remark that

much economics is so far removed from anything that remotely resembles the real world that it’s often difficult for economists to take their own subject seriously.

Instead of assuming calibration and rational expectations to be right, one ought to confront the hypothesis with the available evidence. It is not enough to construct models. Anyone can construct models. To be seriously interesting, models have to come with an aim. They have to have an intended use. If the intention of calibration and rational expectations is to help us explain real economies, it has to be evaluated from that perspective. A model or hypothesis without a specific applicability does not really deserve our interest.

Without strong evidence, all kinds of absurd claims and nonsense may pretend to be science. We have to demand more of a justification than this rather watered-down version of ‘anything goes’ when it comes to rationality postulates. If one proposes rational expectations one also has to support its underlying assumptions. No such support is given, which makes it rather puzzling how rational expectations has become the standard modelling assumption made in much of modern macroeconomics. Perhaps the reason is that economists often mistake mathematical beauty for truth.

In the hands of Lucas, Prescott and Sargent, rational expectations have been transformed from an – in principle – testable hypothesis to an irrefutable proposition. Believing in a set of irrefutable propositions may be comfortable – like religious convictions or ideological dogmas – but it is not science.

So where does this all lead us? What is the trouble ahead for economics? Putting a sticky-price DSGE lipstick on the RBC pig sure won’t do. Neither will just looking the other way and pretending it’s raining.

## Happy Birthday

5 November, 2018 at 16:58 | Posted in Varia

Got this absolutely gorgeous coffee mug from my dear daughters today, on my birthday. They claim the picture on it shows their father thirty years ago …
