The New Classical counterrevolution

17 Nov, 2018 at 09:52 | Posted in Economics | 3 Comments

In a post on his blog, Oxford macroeconomist Simon Wren-Lewis discusses whether modern academic macroeconomics is eclectic or not. When it comes to methodology, his conclusion seems to be that it is not:

The New Classical Counter Revolution of the 1970s and 1980s … was primarily a revolution about methodology, about arguing that all models should be microfounded, and in terms of mainstream macro it was completely successful … Mainstream academic macro is very eclectic in the range of policy questions it can address, and conclusions it can arrive at, but in terms of methodology it is quite the opposite.

In an earlier post he elaborated on why the New Classical Counterrevolution was so successful in replacing older theories, despite the fact that the New Classical models weren’t able to explain what happened to output and inflation in the 1970s and 1980s:

The new theoretical ideas New Classical economists brought to the table were impressive, particularly to those just schooled in graduate micro. Rational expectations is the clearest example …

If mainstream academic macroeconomists were seduced by anything, it was a methodology — a way of doing the subject which appeared closer to what at least some of their microeconomic colleagues were doing at the time, and which was very different to the methodology of macroeconomics before the New Classical Counterrevolution. The old methodology was eclectic and messy, juggling the competing claims of data and theory. The new methodology was rigorous!

Wren-Lewis seems to be impressed by the ‘rigour’ brought to macroeconomics by the New Classical counterrevolution and its rational expectations, microfoundations and ‘Lucas Critique’.

I fail to see why.

Wren-Lewis’ portrayal of rational expectations is not as innocent as it may look. Rational expectations in the mainstream economists’ world imply that relevant distributions have to be time independent. This amounts to assuming that an economy is a closed system with known stochastic probability distributions for all different events. In reality, it strains belief to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. The similarity between these modelling assumptions and the expectations of real persons is vanishingly small. In the world of the rational expectations hypothesis, we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that is also the reason why allowing for cases where agents ‘make predictable errors’ in the New Keynesian models doesn’t take us any closer to a relevant and realist depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations than childish roulette and urn analogies.
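To see what is at stake in the ensemble/single-realization distinction, here is a minimal simulation sketch (the process and every number in it are invented for illustration, not a model of any actual economy) of a non-ergodic multiplicative process in which ensemble reasoning and the history of a single realization part ways:

```python
import numpy as np

# A non-ergodic multiplicative process: each period 'wealth' is multiplied
# by 1.5 or 0.6 with equal probability. The ensemble expectation of the
# factor is 0.5*1.5 + 0.5*0.6 = 1.05 > 1, but the time-average growth
# factor of a single history is sqrt(1.5*0.6) ~ 0.949 < 1.
rng = np.random.default_rng(42)
T, N = 50, 200_000                 # periods, number of imagined 'economy-worlds'
final = np.prod(rng.choice([1.5, 0.6], size=(N, T)), axis=1)

print(f"theoretical ensemble mean after {T} periods: {1.05**T:.1f}")
print(f"median single realization after {T} periods: {np.median(final):.4f}")
print(f"share of worlds ending below initial wealth: {(final < 1).mean():.0%}")
# The ensemble average grows without bound while the typical single history
# shrinks towards zero. An actual economy is one realization, not an average
# over an ensemble of worlds, so expectations formed from ensemble
# distributions can be systematically wrong about the world we live in.
```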

‘Rigorous’ and ‘precise’ New Classical or ‘New Keynesian’ models cannot be considered anything other than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence of that kind has ever been presented.

The failure of the attempt to anchor the analysis in the allegedly stable deep parameters ‘tastes’ and ‘technology’ shows that if you neglect ontological considerations pertaining to real-world economies, reality ultimately gets its revenge when questions of bridging and exporting model results are finally laid on the table.


Mainstream economists are proud of having an ever-growing smorgasbord of models to cherry-pick from (as long as, of course, the models do not question the standard modelling strategy) when performing their analyses. The ‘rigorous’ and ‘precise’ deductions made in these closed models, however, are not in any way matched by a similar stringency or precision when it comes to what ought to be the most important stage of any research — making statements about and explaining things in real economies. Although almost every mainstream economist holds the view that thought-experimental modelling has to be followed by confronting the models with reality — which is what they indirectly want to predict/explain/understand using their models — they all of a sudden become exceedingly vague and imprecise at that stage. It is as if all the intellectual force has been invested in the modelling and nothing is left for what really matters — what exactly these models teach us about real economies.

No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, it does not push economic science forward a single millimetre if it does not stand the acid test of relevance to the target.

Proving things ‘rigorously’ in mathematical models is at most a starting point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.

My favourite girls (personal)

15 Nov, 2018 at 16:33 | Posted in Varia | Comments Off on My favourite girls (personal)

 

Hedda (5), Linnea (19), and Tora (25)

Being a class mongrel

15 Nov, 2018 at 14:05 | Posted in Politics & Society | 3 Comments

We were working class, and you don’t lose that. Later on, I bolted on middle classness but I think the working-class thing hasn’t gone away and it never will go away. Quite a few of my interactions and responses are still the responses I had when I was 18 or 19. And the other things are bolted on and it is a mix. It is what it is, and a lot of people are like that. I’m a class mongrel.

Melvyn Bragg

Most people think of social mobility as something unproblematically positive. Sharing much the same experience as the one Bragg describes, I find it difficult to share that sentiment. Becoming — basically through educational prowess — part of the powers and classes that for centuries have oppressed and belittled the working classes can be a rather mixed experience. As a rags-to-riches traveller, you always find yourself somewhere in between the world you are leaving and the world you are entering. Moving up the social ladder does not erase your past. And should you forget that, rest assured there are others more than happy to remind you. The social mobility many of us who grew up in the ’50s and ’60s experienced only underscores that the real freedom of the working classes has to transcend the individual. It has to be a collective endeavour, whereby we rise with our class and not out of it.

Kalecki and Keynes on the loanable funds fallacy

14 Nov, 2018 at 23:17 | Posted in Economics | 8 Comments

It should be emphasized that the equality between savings and investment … will be valid under all circumstances. In particular, it will be independent of the level of the rate of interest which was customarily considered in economic theory to be the factor equilibrating the demand for and supply of new capital. In the present conception investment, once carried out, automatically provides the savings necessary to finance it. Indeed, in our simplified model, profits in a given period are the direct outcome of capitalists’ consumption and investment in that period. If investment increases by a certain amount, savings out of profits are pro tanto higher …

One important consequence of the above is that the rate of interest cannot be determined by the demand for and supply of new capital because investment ‘finances itself.’
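For readers who want the bookkeeping spelled out: in the simplified model the passage refers to (a closed economy, no government, workers spending all their wages), Kalecki's claim follows directly from the national accounting identities. A standard reconstruction:

```latex
\begin{aligned}
Y &= C_w + C_k + I &&\text{(expenditure: workers' consumption, capitalists' consumption, investment)}\\
Y &= W + P         &&\text{(distribution: wages and profits)}\\
C_w &= W           &&\text{(workers spend all their wages)}\\
\Rightarrow\; P &= C_k + I &&\text{(profits are the outcome of capitalists' spending)}
\end{aligned}
```

Capitalists can choose how much to consume and invest, but they cannot choose their profits; causality therefore runs from spending to profits and saving, which is why investment 'finances itself' and the rate of interest is left without an equilibrating role.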

The loanable funds theory is in many regards nothing but an approach in which the ruling rate of interest in society is — pure and simple — conceived as the price of loans or credits, set by banks and determined by supply and demand — as Bertil Ohlin put it — “in the same way as the price of eggs and strawberries on a village market.”

It is a beautiful fairy tale, but the problem is that banks are not barter institutions that transfer pre-existing loanable funds from depositors to borrowers. Why? Because, in the real world, there simply are no pre-existing loanable funds. Banks create new funds — credit — only if someone has previously got into debt! Banks are monetary institutions, not barter vehicles.

In the traditional loanable funds theory — as presented in mainstream macroeconomics textbooks — the amount of loans and credit available for financing investment is constrained by how much saving is available. Saving is the supply of loanable funds, and investment is the demand for loanable funds, assumed to be negatively related to the interest rate. In this story, lower household consumption means higher saving, which via a lower interest rate translates into higher investment.
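To make the mechanism under criticism concrete, here is a toy numerical version of the textbook story (the functional forms and parameter values are invented purely for illustration):

```python
from scipy.optimize import brentq

# Toy textbook loanable funds model (illustrative forms and numbers only):
# saving rises and investment falls with the interest rate r, and r is
# assumed to adjust until S(r) = I(r).
def saving(r, autonomous=2.0, slope=40.0):
    return autonomous + slope * r          # S(r): supply of loanable funds

def investment(r, autonomous=6.0, slope=60.0):
    return autonomous - slope * r          # I(r): demand for loanable funds

r_star = brentq(lambda r: saving(r) - investment(r), 0.0, 1.0)
print(f"equilibrium interest rate: {r_star:.3f}")        # 0.040
print(f"saving = investment = {saving(r_star):.2f}")     # 3.60

# Lower consumption appears here as higher autonomous saving, which
# mechanically lowers r and raises investment -- precisely the
# 'eggs and strawberries' price story the post argues banks do not follow.
r_new = brentq(lambda r: saving(r, autonomous=3.0) - investment(r), 0.0, 1.0)
print(f"after a rise in desired saving: r = {r_new:.3f}")  # 0.030
```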

That view has been shown to have very little to do with reality. It’s nothing but an otherworldly neoclassical fantasy. But there are many other problems as well with the standard presentation and formalization of the loanable funds theory:

As James Meade already noted decades ago, the causal story told to explicate the accounting identities used gives the picture of “a dog called saving wagged its tail labelled investment.” In Keynes’s view — later confirmed over and over again by empirical research — it is not so much the interest rate at which firms can borrow that causally determines the amount of investment undertaken, but rather their internal funds, profit expectations and capacity utilization.

As is typical of most mainstream macroeconomic formalizations and models, there is precious little mention of real-world phenomena — such as real money, credit rationing and the existence of multiple interest rates — in the loanable funds theory. Loanable funds theory essentially reduces modern monetary economies to something akin to barter systems — something they definitely are not. As emphasized especially by Minsky, to understand and explain how much investment/loaning/crediting is going on in an economy, it is much more important to focus on the workings of financial markets than to stare at accounting identities like S = Y – C – G. The problems we meet in modern markets today have more to do with inadequate financial institutions than with the size of loanable-funds-savings.

The loanable funds theory in the ‘New Keynesian’ approach means that the interest rate is endogenized by assuming that Central Banks can (try to) adjust it in response to an eventual output gap. This, of course, is essentially nothing but an assumption of Walras’ law being valid and applicable, and that a fortiori the attainment of equilibrium is secured by the Central Banks’ interest rate adjustments. From a realist Keynes-Minsky point of view, this cannot be considered anything other than a belief resting on nothing but sheer hope. [Not to mention that more and more Central Banks actually choose not to follow Taylor-like policy rules.] The age-old belief that Central Banks control the money supply has more and more come to be questioned and replaced by an ‘endogenous’ money view, and I think the same will happen to the view that Central Banks determine “the” rate of interest.

A further problem in the traditional loanable funds theory is that it assumes that saving and investment can be treated as independent entities. This is seriously wrong:

The classical theory of the rate of interest [the loanable funds theory] seems to suppose that, if the demand curve for capital shifts or if the curve relating the rate of interest to the amounts saved out of a given income shifts or if both these curves shift, the new rate of interest will be given by the point of intersection of the new positions of the two curves. But this is a nonsense theory. For the assumption that income is constant is inconsistent with the assumption that these two curves can shift independently of one another. If either of them shifts, then, in general, income will change; with the result that the whole schematism based on the assumption of a given income breaks down … In truth, the classical theory has not been alive to the relevance of changes in the level of income or to the possibility of the level of income being actually a function of the rate of investment.

There are always (at least) two parties to an economic transaction. Savers and investors have different liquidity preferences and face different choices — and their interactions usually take place only as intermediated by financial institutions. This, importantly, also means that there is no ‘direct and immediate’ automatic interest mechanism at work in modern monetary economies. What this ultimately boils down to is — again — that what happens at the microeconomic level, both in and out of equilibrium, is not always compatible with the macroeconomic outcome. The fallacy of composition (the ‘atomistic fallacy’ of Keynes) has many faces — loanable funds is one of them.

Contrary to the loanable funds theory, finance in the world of Keynes and Minsky precedes investment and saving. Highlighting the loanable funds fallacy, Keynes wrote in “The Process of Capital Formation” (1939):

Increased investment will always be accompanied by increased saving, but it can never be preceded by it. Dishoarding and credit expansion provides not an alternative to increased saving, but a necessary preparation for it. It is the parent, not the twin, of increased saving.

What is ‘forgotten’ in the loanable funds theory is the insight that finance — in all its different shapes — has its own dimension, and if taken seriously, its effect on an analysis must modify the whole theoretical system and not just be added as an unsystematic appendage. Finance is fundamental to our understanding of modern economies, and acting like the baker’s apprentice who, having forgotten to add yeast to the dough, throws it into the oven afterwards, simply isn’t enough.

All real economic activities nowadays depend on a functioning financial machinery. But institutional arrangements, states of confidence, fundamental uncertainties, asymmetric expectations, the banking system, financial intermediation, loan granting processes, default risks, liquidity constraints, aggregate debt, cash flow fluctuations, etc., etc. — things that play decisive roles in channelling money/savings/credit — are more or less left in the dark in modern formalizations of the loanable funds theory.

So, yes, the ‘secular stagnation’ will be over as soon as we free ourselves from the loanable funds theory — and the scholastic gibbering about the ZLB — and start using good old Keynesian fiscal policies.

In search of causality

14 Nov, 2018 at 14:41 | Posted in Statistics & Econometrics | 3 Comments


One of the few statisticians that yours truly has on the blogroll is Andrew Gelman. Although not sharing his Bayesian leanings, I find his open-minded, thought-provoking and non-dogmatic statistical thinking highly recommendable. The plaidoyer below for ‘reverse causal questioning’ is typically Gelmanian:

When statistical and econometric methodologists write about causal inference, they generally focus on forward causal questions. We are taught to answer questions of the type “What if?”, rather than “Why?” Following the work by Rubin (1977) causal questions are typically framed in terms of manipulations: if x were changed by one unit, how much would y be expected to change? But reverse causal questions are important too … In many ways, it is the reverse causal questions that motivate the research, including experiments and observational studies, that we use to answer the forward questions …

Reverse causal reasoning is different; it involves asking questions and searching for new variables that might not yet even be in our model. We can frame reverse causal questions as model checking. It goes like this: what we see is some pattern in the world that needs an explanation. What does it mean to “need an explanation”? It means that existing explanations — the existing model of the phenomenon — does not do the job …

By formalizing reverse causal reasoning within the process of data analysis, we hope to make a step toward connecting our statistical reasoning to the ways that we naturally think and talk about causality. This is consistent with views such as Cartwright (2007) that causal inference in reality is more complex than is captured in any theory of inference … What we are really suggesting is a way of talking about reverse causal questions in a way that is complementary to, rather than outside of, the mainstream formalisms of statistics and econometrics.
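As a stylized illustration of what ‘framing reverse causal questions as model checking’ can amount to in practice (the model and the data below are invented for the example), one can ask whether a fitted model is able to reproduce an observed pattern at all:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented 'observed' data: overdispersed counts (variance >> mean).
observed = rng.negative_binomial(n=2, p=0.2, size=200)

# Existing model: Poisson with its rate fitted to the data.
lam = observed.mean()
replications = rng.poisson(lam, size=(1000, 200))

# Test statistic: the variance/mean ratio, observed vs. replicated.
def dispersion(x):
    return x.var() / x.mean()

obs_stat = dispersion(observed)
rep_stats = np.array([dispersion(r) for r in replications])
p_value = (rep_stats >= obs_stat).mean()

print(f"observed dispersion: {obs_stat:.2f}")                       # ~5
print(f"replicated dispersion under Poisson: {rep_stats.mean():.2f}")  # ~1
print(f"posterior-predictive-style p-value: {p_value:.3f}")         # ~0
# A pattern the existing model cannot reproduce (p ~ 0) is exactly what
# 'needs an explanation': a cue to search for new variables or mechanisms.
```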

In a time when scientific relativism is expanding, it is important to keep up the claim for not reducing science to a pure discursive level. We have to maintain the Enlightenment tradition of thinking of reality as principally independent of our views of it and of the main task of science as studying the structure of this reality. Perhaps the most important contribution a researcher can make is revealing what this reality that is the object of science actually looks like.

Science is made possible by the fact that there are structures that are durable and are independent of our knowledge or beliefs about them. There exists a reality beyond our theories and concepts of it. It is this independent reality that our theories in some way deal with. Contrary to positivism, I would as a critical realist argue that the main task of science is not to detect event-regularities between observed facts. Rather, that task must be conceived as identifying the underlying structures and forces that produce the observed events.


In Gelman’s essay there is no explicit argument for abduction — inference to the best explanation — but I would still argue that it is de facto nothing but a very strong argument for why scientific realism and inference to the best explanation are the best alternatives for explaining what is going on in the world we live in. The focus on causality, model checking, anomalies and context-dependence — although here expressed in statistical terms — is as close to abductive reasoning as we get in statistics and econometrics today.

Kalecki on wage-led growth

13 Nov, 2018 at 11:33 | Posted in Economics | Comments Off on Kalecki on wage-led growth

One of the main features of the capitalist system is the fact that what is to the advantage of a single entrepreneur does not necessarily benefit all entrepreneurs as a class. If one entrepreneur reduces wages he is able ceteris paribus to expand production; but once all entrepreneurs do the same thing — the result will be entirely different.

Let us assume that wages have been in fact generally reduced … and in consequence unemployment vanishes. Has depression thus been overcome? By no means, as the goods produced have still to be sold … A precondition for an equilibrium at this new higher level is that the part of production which is not consumed by workers or by civil servants should be acquired by capitalists for their increased profits; in other words, the capitalists must spend immediately all their additional profits on consumption or investment. It is however most unlikely that this should happen … It is true that increased profitability stimulates investment but this stimulus will not work right away since the entrepreneurs will temporise until they are convinced that higher profitability is going to last … A reduction of wages does not constitute a way out of depression, because the gains are not used immediately by the capitalists for purchase of investment goods.

The power of self-belief …

13 Nov, 2018 at 11:24 | Posted in Varia | 3 Comments

 
ego

Causal models and heterogeneity (wonkish)

12 Nov, 2018 at 13:50 | Posted in Statistics & Econometrics | Comments Off on Causal models and heterogeneity (wonkish)

In The Book of Why, Judea Pearl advances several weighty reasons why the now so popular causal graph-theoretic approach is preferable to more traditional regression-based explanatory models. One of the reasons is that causal graphs are non-parametric and therefore do not need to assume, for example, additivity and/or the absence of interaction effects — arrows and nodes replace the specifications of functional relations between the variables in the equations that regression analysis requires.

But even though Pearl and other adherents of graph theory mostly emphasize the advantages of the flexibility the new tool gives us, there are also clear risks and drawbacks in the use of causal graphs. The lack of clarity about whether additivity, interaction or other variable and relation characteristics are present — and, if so, how they are specified — can sometimes create more problems than it solves.

Many of the problems — just as with regression analyses — are bound up with the presence and degree of heterogeneity. Let me take an example from the field of school research to illustrate the issue.

A question that both politicians and researchers have returned to in recent years (see e.g. here and here) is whether independent schools (‘friskolor’) raise the level of knowledge and the test scores of the country’s pupils. To be able to answer this (in reality very difficult) causal question, we need knowledge of a multitude of known, observable variables and background factors (parents’ income and education, ethnicity, housing, etc.). On top of these, there are factors that we know matter but that are unobservable and/or more or less unmeasurable.

The problems begin as soon as we ask what actually lies behind the general term ‘independent school’. Not all independent schools are alike (homogeneity). We know that there are often large differences between them (heterogeneity). Lumping them all together and trying to answer the causal question without taking these differences into account is often pointless and sometimes completely misleading.

Another problem is that a second type of heterogeneity — one that concerns the specification of the functional relations — can turn up. Assume that the independent-school effect is linked to, say, ethnicity, and that pupils with a ‘Swedish background’ perform better than pupils with an ‘immigrant background.’ This need not mean that pupils with different ethnic backgrounds are themselves affected differently by attending an independent school. The effect may instead derive, for example, from the fact that the alternative municipal schools open to the immigrant pupils were worse than those open to the ‘Swedish’ pupils. If these differences in the basis of comparison are not taken into account, the estimated independent-school effects become misleading.

Further heterogeneity problems arise if the mechanisms generating the independent-school effect look substantially different for different groups of pupils. Independent schools with a ‘focus’ on immigrant groups may, for example, be more aware of the need to support these pupils and may take compensatory measures to counteract prejudice and the like. On top of the effects of the (presumably) better teaching at independent schools in general, the effect for this category of pupils is then also an effect of the heterogeneity just described, and will consequently not coincide with the effect for the other group of pupils.

Unfortunately, the problems do not end there. We are also confronted with a hard-to-solve and often overlooked selectivity problem. When we try to answer the causal question about the effects of independent schools, a common procedure in regression analysis is to ‘hold constant’ or ‘control for’ influencing factors other than the ones we are primarily interested in. Where independent schools are concerned, a common control variable is the parents’ income or educational background. The logic is that we can thereby simulate an (ideal) situation resembling a randomized experiment as closely as possible, where we only ‘compare’ (match) pupils whose parents have comparable education or income, and in that way hope to obtain a better measure of the ‘pure’ independent-school effect. The crux is that within each income and education category there may lurk a further — sometimes hidden and perhaps unmeasurable — heterogeneity, to do with attitude and motivation for example, which makes some pupils tend to choose (select) independent schools because they believe they will perform better there than in municipal schools (in the Swedish school-choice debate, a recurring argument about segregation effects is that pupils whose parents have high ‘socio-economic status’ have better access to information about the effects of school choice than other pupils do). The income or education variable can thus de facto ‘mask’ other factors that sometimes play a more decisive role. The estimates of the independent-school effect can therefore — again — be misleading, and sometimes even more misleading than if we had not ‘held constant’ any control variable at all (cf. the ‘second-best’ theorem in welfare economics)!

‘Controlling’ for possible ‘confounders’ is thus not always self-evidently the right way to go. If the very relation between independent school (X) and study results (Y) is affected by the introduction of the control variable ‘socio-economic status’ (W), this is probably the result of some kind of association between X and W. This also means that we do not have an ideal ‘simulated experiment’, since there obviously are factors affecting Y that are not randomly distributed (randomized). Before we can proceed, we must ask why the association in question obtains. To be able to explain the association between X and Y causally, we need to know more about how W affects the choice of X. Among other things, we may then find that the choice of X differs between different parts of the group with high ‘socio-economic status’ W. Without knowledge of this selection mechanism, we cannot reliably measure the effect of X on Y — the randomized explanatory model is simply not applicable. And without knowledge of why there is an association between X and W — and what it looks like — the ‘controlling’ does not help us, since it does not take the operative selection mechanism into account.
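A minimal simulation sketch of the selection problem just described (the data-generating process and all coefficients are invented): the true school effect is zero, an unobserved motivation drives both school choice and results, and ‘controlling’ for socio-economic status does not rescue the estimate:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Invented data-generating process. The TRUE effect of independent
# school (X) on results (Y) is zero.
ses = rng.normal(size=n)                      # socio-economic status (W)
motivation = rng.normal(size=n)               # unobserved heterogeneity
choice_propensity = 0.8 * ses + 1.2 * motivation + rng.normal(size=n)
X = (choice_propensity > 0).astype(float)     # selection into independent school
Y = 1.0 * ses + 1.5 * motivation + rng.normal(size=n)  # X itself does nothing

# OLS of Y on X, 'controlling' for SES (via least squares):
Z = np.column_stack([np.ones(n), X, ses])
beta = np.linalg.lstsq(Z, Y, rcond=None)[0]
print(f"estimated 'school effect' controlling for SES: {beta[1]:.3f}")
# Prints a clearly positive number although the true effect is zero:
# within every SES category the motivated still select into X, and the
# control variable masks rather than removes the selection mechanism.
```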

Beyond the problems touched on here, there are other, long-familiar ones. The so-called context or peer-group effect — for a pupil at an independent school, the results may partly be an effect of her schoolmates having a similar background, so that she in some sense benefits from her surroundings in a way she would not at a municipal school — again means that eliminating ‘confounders’ via control variables does not obviously work when there is an association between the control variable and hard-to-measure or unobservable attributes that themselves affect the dependent variable. In our school example, one may well assume that parents of a given socio-economic status who send their children to independent schools differ from parents in the same group who let their children attend municipal school. The control variables — once again — do not function as full substitutes for the randomized assignment of a real experiment.

Am I right in thinking that the method of multiple correlation analysis essentially depends on the economist having furnished, not merely a list of the significant causes, which is correct so far as it goes, but a complete list? For example, suppose three factors are taken into account, it is not enough that these should be in fact vera causa; there must be no other significant factor. If there is a further factor, not taken account of, then the method is not able to discover the relative quantitative importance of the first three. If so, this means that the method is only applicable where the economist is able to provide beforehand a correct and indubitably complete analysis of the significant factors. The method is one neither of discovery nor of criticism. It is a means of giving quantitative precision to what, in qualitative terms, we know already as the result of a complete theoretical analysis …

John Maynard Keynes

As regards the use of control variables, one must also not overlook an important aspect that is seldom touched on by those who use these statistical methods. The variables included in the studies are treated ‘as if’ the relations between them in the population were random. But variables may in fact have the values they have precisely because they give rise to the consequences they do. The outcome thus, to some extent, determines why the ‘independent’ variables have the values they have. The ‘randomized’ independent variables turn out to be something other than what they are assumed to be, which also means that the observational studies and quasi-experiments do not come anywhere near being real experiments. Things often look the way they do for a reason. Sometimes the reasons are precisely the consequences that rules, institutions and other factors are anticipated to give rise to! What is taken to be ‘exogenous’ is in fact not ‘exogenous’ at all.

Those variables that have been left outside of the causal system may not actually operate as assumed; they may produce effects that are nonrandom and that may become confounded with those of the variables directly under consideration.

Hubert Blalock

So what conclusion do we draw from all this? Causality is hard, and — the criticism notwithstanding — we should of course not throw the baby out with the bathwater. But a healthy scepticism and caution in assessing the ability of statistical methods — whether causal graph theory or more traditional regression analysis — to really establish causal relations is definitely to be recommended.

Truth and probability

11 Nov, 2018 at 13:05 | Posted in Theory of Science & Methodology | 1 Comment

Truth exists, and so does uncertainty. Uncertainty acknowledges the existence of an underlying truth: you cannot be uncertain of nothing: nothing is the complete absence of anything. You are uncertain of something, and if there is some thing, there must be truth. At the very least, it is that this thing exists. Probability, which is the science of uncertainty, therefore aims at truth. Probability presupposes truth; it is a measure or characterization of truth. Probability is not necessarily the quantification of the uncertainty of truth, because not all uncertainty is quantifiable. Probability explains the limitations of our knowledge of truth, it never denies it. Probability is purely epistemological, a matter solely of individual understanding. Probability does not exist in things; it is not a substance. Without truth, there could be no probability.

William Briggs’ approach is — as he acknowledges in the preface of his interesting and thought-provoking book — “closely aligned to Keynes’s.”

Almost a hundred years after John Maynard Keynes wrote his seminal A Treatise on Probability (1921), it is still very difficult to find statistics textbooks that seriously try to incorporate his far-reaching and incisive analysis of induction and evidential weight.

The standard view in statistics — and the axiomatic probability theory underlying it — is to a large extent based on the rather simplistic idea that ‘more is better.’ But as Keynes argues, ‘more of the same’ is not what is important when making inductive inferences. It’s rather a question of ‘more but different.’

Variation, not replication, is at the core of induction. Finding that p(x|y) = p(x|y & w) doesn’t make w ‘irrelevant.’ Knowing that the probability is unchanged when w is present gives p(x|y & w) another evidential weight (‘weight of argument’). Running 10 replicative experiments does not make you as ‘sure’ of your inductions as running 10,000 varied experiments — even if the probability values happen to be the same.
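A small numerical illustration of the point (the numbers are purely illustrative): two bodies of evidence can support the same headline probability while carrying very different evidential weight:

```python
from scipy.stats import beta

# Purely illustrative: the same relative frequency (0.7) backed by very
# different amounts of evidence, summarized with a uniform Beta(1, 1) prior.
for trials, successes in [(10, 7), (10_000, 7_000)]:
    post = beta(1 + successes, 1 + trials - successes)
    lo, hi = post.interval(0.95)
    print(f"{trials:>6} trials: relative frequency = {successes / trials:.2f}, "
          f"95% posterior interval = [{lo:.3f}, {hi:.3f}]")

# The frequency (and hence the headline 'probability') is the same in both
# cases, but the intervals differ enormously: the weight of argument behind
# a probability is not captured by the probability value itself.
```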

According to Keynes, we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but ‘rational expectations.’ Keynes rather thinks that we base our expectations on the confidence or ‘weight’ we put on different events and alternatives. To Keynes, expectations are a question of weighing probabilities by ‘degrees of belief,’ beliefs that often have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents modelled by “modern” social sciences. And often we ‘simply do not know.’ As Keynes writes in the Treatise:

The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be [that] the system of the material universe must consist of bodies … such that each of them exercises its own separate, independent, and invariable effect, a change of the total state being compounded of a number of separate changes each of which is solely due to a separate portion of the preceding state …  In my judgment, the practical usefulness of those modes of inference … on which the boasted knowledge of modern science depends, can only exist … if the universe of phenomena does in fact present those peculiar characteristics of atomism and limited variety which appears more and more clearly as the ultimate result to which material science is tending.

Science according to Keynes should help us penetrate to “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts.” Models can never be more than a starting point in that endeavour. He further argued that it was inadmissible to project history onto the future. Consequently, we cannot presuppose that what has worked before, will continue to do so in the future. That statistical models can get hold of correlations between different ‘variables’ is not enough. If they cannot get at the causal structure that generated the data, they are not really ‘identified.’

How strange that writers of statistics textbooks, as a rule, do not even touch upon these aspects of scientific methodology, which seem so fundamental and important for anyone trying to understand how we learn and orient ourselves in an uncertain world. An educated guess as to why would be that Keynes’s concepts cannot be squeezed into a single calculable numerical ‘probability.’ In the quest for quantities, one turns a blind eye to qualities and looks the other way – but Keynes’s ideas keep creeping out from under the statistics carpet.

It’s high time that statistics textbooks give Keynes his due.

Seraphim Bit-Kharibi

11 Nov, 2018 at 12:02 | Posted in Varia | Comments Off on Seraphim Bit-Kharibi

 

Text och musik med Eric Schüldt

11 Nov, 2018 at 11:35 | Posted in Varia | Comments Off on Text och musik med Eric Schüldt

In these times, when the airwaves are drowning in commercial radio’s self-opinionated verbal sewage and utterly vacuous, puerile drivel, one has almost given up.

But there is light in the darkness.

In the programme Text och musik med Eric Schüldt — broadcast on Sunday mornings on P2, between 11 and 12 — one can listen to serious music and to a presenter who has something to say and doesn’t just let his jaw flap. Hearing someone with intelligence and feeling talk about the things we all carry deep in our souls — but almost never dare speak of — is balm for the soul.

In today’s programme one could, among other things, hear this endlessly beautiful piece of music by Eleni Karaindrou:


Thank you, Eric, for a fantastic programme that on its own is worth every krona spent on the radio and TV licence!

This is Sweden

10 Nov, 2018 at 19:23 | Posted in Varia | Comments Off on This is Sweden

 

Robert Gordon said it all 40 years ago!

10 Nov, 2018 at 18:01 | Posted in Economics | 1 Comment

What is science? One brief definition runs: “A systematic knowledge of the physical or material world.” Most definitions emphasize the two elements in this definition: (1) “systematic knowledge” about (2) the real world. Without pushing this definitional question to its metaphysical limits, I merely want to suggest that if economics is to be a science, it must not only develop analytical tools but must also apply them to a world that is now observable or that can be made observable through improved methods of observation and measurement. Or in the words of the Hungarian mathematical economist Janos Kornai, “In the real sciences, the criterion is not whether the proposition is logically true and tautologically deducible from earlier assumptions. The criterion of ‘truth’ is, whether or not the proposition corresponds to reality” …

One of our most distinguished historians of economic thought, George Stigler, has stated that: “The dominant influence upon the working range of economic theorists is the set of internal values and pressures of the discipline. The subjects of study are posed by the unfolding course of scientific developments.” He goes on to add: “This is not to say that the environment is without influence …” But, he continues, “whether a fact or development is significant depends primarily on its relevance to current economic theory.” What a curious relating of rigor to relevance! Whether the real world matters depends presumably on “its relevance to current economic theory.” Many if not most of today’s economic theorists seem to agree with this ordering of priorities …

Today, rigor competes with relevance in macroeconomic and monetary theory, and in some lines of development macro and monetary theorists, like many of their colleagues in micro theory, seem to consider relevance to be more or less irrelevant … The theoretical analysis in much of this literature rests on assumptions that also fly in the face of the facts … Another related recent development in which theory proceeds with impeccable logic from unrealistic assumptions to conclusions that contradict the historical record, is the recent work on rational expectations …

I have scolded economists for what I think are the sins that too many of them commit, and I have tried to point the way to at least partial redemption. This road to salvation will not be an easy one for those who have been seduced by the siren of mathematical elegance or those who all too often seek to test unrealistic models without much regard for the quality or relevance of the data they feed into their equations. But let us all continue to worship at the altar of science. I ask only that our credo be: “relevance with as much rigor as possible,” and not “rigor regardless of relevance.” And let us not be afraid to ask — and to try to answer the really big questions.

Robert A. Gordon

Good thinking — the thing statistics cannot replace

10 Nov, 2018 at 16:06 | Posted in Statistics & Econometrics | 4 Comments

 

As social researchers, we should never equate science with mathematics and statistical calculation. All science entails human judgement, and using mathematical and statistical models doesn’t relieve us of that necessity. They are no substitute for thinking and doing real science.

Statistical — and econometric — patterns should never be seen as anything other than possible clues to follow. Behind observable data there are real structures and mechanisms operating, things that are — if we really want to understand, explain and (possibly) predict things in the real world — more important to get hold of than simply correlating and regressing observable variables.

Statistics cannot establish the truth value of a fact. Never has. Never will.

Lars P. Syll

Skånska for beginners …

10 Nov, 2018 at 14:53 | Posted in Varia | 5 Comments

 
My own favourites are definitely ‘fubbick’ and ‘ålahue’ — juicy expressions that can profitably be deployed when one is confronted with self-opinionated imbeciles and aesthete twits …

[PS If you happen to know what ‘tröskemada’ means, please do let me know. DS]
