Expected utility theory — an ex-parrot

20 Sep, 2020 at 16:16 | Posted in Economics | Leave a comment

If a friend of yours offered you a gamble on the toss of a coin where you could lose €100 or win €200, would you accept it? Many of us probably wouldn’t. But if you were offered one hundred such bets, you would probably be willing to accept the package, since most of us see that the aggregated gamble of one hundred 50–50 lose €100/gain €200 bets has an expected return of €5000 (and, doing the probabilistic calculation, we find that there is only a 0.04% ‘risk’ of losing any money).
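To make the arithmetic explicit, here is a minimal Python sketch (my illustration, not from the original post) that computes the expected value of the aggregated gamble and the exact binomial probability of ending up with a net loss; with one hundred 50–50 bets you need at least 34 wins to stay in the black, which is why the chance of losing money is only about 0.04%.

```python
from math import comb

n, p_win, gain, loss = 100, 0.5, 200, 100

# Expected value of the aggregated gamble: 100 * (0.5*200 - 0.5*100) = 5000
expected_value = n * (p_win * gain - (1 - p_win) * loss)

# Net result with k wins: k*200 - (100 - k)*100 = 300k - 10000,
# which is negative exactly when k <= 33.
p_net_loss = sum(comb(n, k) * p_win**k * (1 - p_win)**(n - k) for k in range(34))

print(expected_value)        # 5000.0
print(f"{p_net_loss:.4%}")   # roughly 0.04 %
```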

Unfortunately – at least if you want to adhere to the standard mainstream expected utility theory – you are then considered irrational! A mainstream utility maximizer that rejects the single gamble should also reject the aggregate offer.

Expected utility theory does not explain actual behaviour and choices. But still — although the theory is obviously descriptively inadequate — economists and microeconomics textbook writers gladly continue to use it, as though its deficiencies were unknown or unheard of.

That cannot be the right attitude when facing scientific anomalies!

When models are plainly wrong, you’d better replace them. Or as Matthew Rabin and Richard Thaler have it:

It is time for economists to recognize that expected utility is an ex-hypothesis, so that we can concentrate our energies on the important task of developing better descriptive models of choice under uncertainty.

Yes indeed — expected utility theory is an ‘ex-hypothesis.’

This parrot is no more! He has ceased to be! ‘E’s expired and gone to meet ‘is maker! ‘E’s a stiff! Bereft of life, ‘e rests in peace! If you hadn’t nailed ‘im to the perch ‘e’d be pushing up the daisies! ‘Is metabolic processes are now ‘istory! ‘E’s off the twig! ‘E’s kicked the bucket, ‘e’s shuffled off ‘is mortal coil, run down the curtain and joined the bleedin’ choir invisible!! THIS IS AN EX-PARROT!!

An ex-parrot that transmogrifies truth shouldn’t just be marginally mended. It should be replaced!

Erik Syll In Memoriam (personal)

18 Sep, 2020 at 08:24 | Posted in Varia | Leave a comment


In loving memory of my father-in-law, Erik Syll, whose funeral took place yesterday at Skeda church, Sweden.

Why economic models do not explain

16 Sep, 2020 at 12:33 | Posted in Theory of Science & Methodology | 5 Comments

Analogue-economy models may picture Galilean thought experiments or they may describe credible worlds. In either case we have a problem in taking lessons from the model to the world. The problem is the venerable one of unrealistic assumptions, exacerbated in economics by the fact that the paucity of economic principles with serious empirical content makes it difficult to do without detailed structural assumptions. But the worry is not just that the assumptions are unrealistic; rather, they are unrealistic in just the wrong way.

Nancy Cartwright

One of the limitations with economics is the restricted possibility to perform experiments, forcing it to mainly rely on observational studies for knowledge of real-world economies.

But still — the idea of performing laboratory experiments holds a firm grip on our wish to discover (causal) relationships between economic ‘variables.’ If only we could isolate and manipulate variables in controlled environments, we would probably find ourselves in a situation where we with greater ‘rigour’ and ‘precision’ could describe, predict, or explain economic happenings in terms of ‘structural’ causes, ‘parameter’ values of relevant variables, and economic ‘laws.’

Galileo Galilei’s experiments are often held up as exemplary of how to perform experiments to learn something about the real world. Galileo’s heavy balls dropping from the tower of Pisa confirmed that the distance an object falls is proportional to the square of time, and that this law (empirical regularity) of falling bodies is applicable outside a vacuum tube when, e.g., air resistance is negligible.

The big problem is to decide or find out exactly for which objects air resistance (and other potentially ‘confounding’ factors) is ‘negligible.’ In the case of heavy balls, air resistance is obviously negligible, but how about feathers or plastic bags?
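To make the ‘negligibility’ question concrete, here is a small Python sketch (my own illustration, using a simple linear-drag model and made-up masses and drag coefficients) comparing the distance fallen after two seconds in a vacuum, by a heavy ball, and by a feather-like object.

```python
G = 9.81  # gravitational acceleration, m/s^2

def distance_fallen(mass_kg, drag_coeff, t_end=2.0, dt=1e-4):
    """Euler-integrate dv/dt = g - (c/m)*v and return the distance fallen after t_end seconds."""
    v = d = t = 0.0
    while t < t_end:
        v += (G - (drag_coeff / mass_kg) * v) * dt
        d += v * dt
        t += dt
    return d

vacuum_law = 0.5 * G * 2.0 ** 2                               # the 'law': d = g*t^2/2, about 19.6 m
heavy_ball = distance_fallen(mass_kg=5.0, drag_coeff=0.05)    # drag barely matters: about 19.5 m
feather    = distance_fallen(mass_kg=0.002, drag_coeff=0.05)  # drag dominates: well under 1 m

print(vacuum_law, heavy_ball, feather)
```

For the heavy ball the square-of-time law is a good approximation; for the feather it is wildly off, which is exactly the question of how far the ‘reach’ of the law extends.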

One possibility is to take the all-encompassing-theory road and find out all about possible disturbing/confounding factors — not only air resistance — influencing the fall, and build that into one great model delivering accurate predictions of what happens when the object that falls is not only a heavy ball but also a feather or a plastic bag. This usually amounts to ultimately stating some kind of ceteris paribus interpretation of the ‘law.’

Another road to take would be to concentrate on the negligibility assumption and to specify the domain of applicability to be only heavy compact bodies. The price you have to pay for this is that (1) ‘negligibility’ may be hard to establish in open real-world systems, (2) the generalisation you can make from ‘sample’ to ‘population’ is heavily restricted, and (3) you actually have to use some ‘shoe leather’ and empirically try to find out how large the ‘reach’ of the ‘law’ is.

In mainstream economics, one has usually settled for the ‘theoretical’ road (and in case you think the present ‘natural experiments’ hype has changed anything, remember that to mimic real experiments, exceedingly stringent special conditions have to obtain).

In the end, it all boils down to one question — are there any Galilean ‘heavy balls’ to be found in economics, so that we can indisputably establish the existence of economic laws operating in real-world economies?

As far as I can see there are some heavy balls out there, but not even one single real economic law.

Economic factors/variables are more like feathers than heavy balls — non-negligible factors (like air resistance and chaotic turbulence) are hard to rule out as having no influence on the object studied.

Galilean experiments are hard to carry out in economics, and the theoretical ‘analogue’ models economists construct and in which they perform their ‘thought-experiments’ build on assumptions that are far away from the kind of idealized conditions under which Galileo performed his experiments. The ‘nomological machines’ that Galileo and other scientists have been able to construct have no real analogues in economics. The stability, autonomy, modularity, and interventional invariance that we may find between entities in nature simply are not there in real-world economies. Those are real-world facts, and contrary to the beliefs of most mainstream economists, they won’t go away simply by applying deductive-axiomatic economic theory with tons of more or less unsubstantiated assumptions.

By this, I do not mean to say that we have to discard all (causal) theories/laws building on modularity, stability, invariance, etc. But we have to acknowledge the fact that outside the systems that possibly fulfil these requirements/assumptions, they are of little substantial value. Running paper and pen experiments on artificial ‘analogue’ model economies is a sure way of ‘establishing’ (causal) economic laws or solving intricate econometric problems of autonomy, identification, invariance and structural stability — in the model world. But they are poor substitutes for the real thing, and they don’t have much bearing on what goes on in real-world open social systems. Setting up convenient circumstances for conducting Galilean experiments may tell us a lot about what happens under those kinds of circumstances. But few, if any, real-world social systems are ‘convenient.’ So most of those systems, theories and models are irrelevant for letting us know what we really want to know.

To solve, understand, or explain real-world problems you actually have to know something about them — logic, pure mathematics, data simulations or deductive axiomatics don’t take you very far. Most econometrics and economic theories/models are splendid logic machines. But applying them to the real world is a totally hopeless undertaking! The assumptions one has to make in order to successfully apply these deductive-axiomatic theories/models/machines are devastatingly restrictive and mostly empirically untestable — and hence make their real-world scope ridiculously narrow. To fruitfully analyse real-world phenomena with models and theories you cannot build on assumptions that are patently absurd and known to be so. No matter how much you would like the world to entirely consist of heavy balls, the world is not like that. The world also has its fair share of feathers and plastic bags.

The problem articulated by Cartwright is that most of the ‘idealizations’ we find in mainstream economic models are not ‘core’ assumptions, but rather structural ‘auxiliary’ assumptions. Without those supplementary assumptions, the core assumptions deliver next to nothing of interest. So to come up with interesting conclusions you have to rely heavily on those other — ‘structural’ — assumptions.

Whenever model-based causal claims are made, experimentalists quickly find that these claims do not hold under disturbances that were not written into the model. Our own stock example is from auction design – models say that open auctions are supposed to foster better information exchange leading to more efficient allocation. Do they do that in general? Or at least under any real world conditions that we actually know about? Maybe. But we know that introducing the smallest unmodelled detail into the setup, for instance complementarities between different items for sale, unleashes a cascade of interactive effects. Careful mechanism designers do not trust models in the way they would trust genuine Galilean thought experiments. Nor should they …

Economic models frequently invoke entities that do not exist, such as perfectly rational agents, perfectly inelastic demand functions, and so on. As economists often defensively point out, other sciences too invoke non-existent entities, such as the frictionless planes of high-school physics. But there is a crucial difference: the false-ontology models of physics and other sciences are empirically constrained. If a physics model leads to successful predictions and interventions, its false ontology can be forgiven, at least for instrumental purposes – but such successful prediction and intervention is necessary for that forgiveness. The idealizations of economic models, by contrast, have not earned their keep in this way. So the problem is not the idealizations in themselves so much as the lack of empirical success they buy us in exchange. As long as this problem remains, claims of explanatory credit will be unwarranted.

A. Alexandrova & R. Northcott

In physics, we have theories and centuries of experience and experiments that show how gravity makes bodies move. In economics, we know there is nothing equivalent. So instead mainstream economists necessarily have to load their theories and models with sets of auxiliary structural assumptions to get any results at all in their models.

So why then do mainstream economists keep on pursuing this modelling project?


The value of economics — a cost-benefit analysis

16 Sep, 2020 at 09:10 | Posted in Economics | Leave a comment

Economists cannot simply dismiss as “absurd” or “impossible” the possibility that our profession has imposed total costs that exceed total benefits. And no, building a model which shows that it is logically possible for economists to make a positive net contribution is not going to make questions about our actual effect go away. Why don’t we just stipulate that economists are now so clever at building models that they can use a model to show that almost anything is logically possible. Then we could move on to making estimates and doing the math.

In the 19th century, when it became clear that the net effect of having a doctor assist a woman in child-birth was to increase the probability that she would die, western society faced a choice:

– Get rid of doctors; or
– Insist that they wash their hands.

I do not want western society to get rid of economists. But to remain viable, our profession needs to be open to the possibility that in a few cases, a few of its members are doing enormous harm; then it must take on a collective responsibility for making sure that everyone keeps their hands clean.

Paul Romer

Mainstream economic theory today is still in the story-telling business whereby economic theorists create mathematical make-believe analogue models of our real-world economic system.

The problem is that without strong evidence, all kinds of absurd claims and nonsense may pretend to be science. Mathematics and logic cannot establish the truth value of facts. 

We have to demand more of a justification than rather watered-down versions of ‘anything goes’ when it comes to the main postulates on which mainstream economics is founded. If one proposes ‘efficient markets’ or ‘rational expectations’ one also has to support their underlying assumptions. As a rule, no such support is given, which makes it rather puzzling how things like ‘efficient markets’ and ‘rational expectations’ have become standard modelling assumptions in much of modern macroeconomics. The reason for this sad state of ‘modern’ economics is that economists often mistake mathematical beauty for truth. It would be far better if they instead made sure they “keep their hands clean”!

Crime and research

15 Sep, 2020 at 23:07 | Posted in Politics & Society | 2 Comments

But mappings and analyses of this kind of clan-based network, which through threats of violence and harassment wields great power in immigrant-dense suburban areas and thereby seriously impedes integration, are conspicuous by their absence …

In Sweden you may hold whatever values you wish. Embracing and advocating, for example, communist, Islamist, Christian, reactionary, feminist, patriarchal and even totalitarian values is part of the rights and freedoms laid down in the constitution.

But in Sweden, whatever values one holds and wishes to advocate, one must comply with Swedish legislation. And it is on this point that the kind of criminal clan-based networks that Bäckström Lerneby has so admirably brought to light are deeply problematic. What the book shows is that these networks establish, in their local environment, a legal order of their own that in large parts conflicts with Swedish law.

One has to ask why it is journalistic efforts rather than research that have mapped and highlighted the consequences of this problem. One possible reason for the lack of research may be the ideological and political charge of the field. This may have meant that researchers who wanted to question the established picture of the integration problem as a structural problem driven by discrimination on the part of the majority population have not been able to work in the field.

Peter Esaiasson & Bo Rothstein

That there is an ideological and political charge is, I believe, a correct hypothesis when it comes to explaining why so little research is done in this field.

A telling example is the reaction to last year’s report from the Swedish National Council for Crime Prevention (Brå), which discussed the link between the large migration wave of 2015 and the subsequent increase in reported sexual offences. The conclusion of its broad, tentative analyses and uncertain estimates was that the connection is ‘weak’.

This did not, however, stop Jerzy Sarnecki from claiming in DN that the study ‘shows that the immigration wave has not affected the number of sexual offences’. Which is remarkable, to say the least, since Brå’s report is not based on data about country of origin at all!

Men born abroad are heavily over-represented among those convicted of rape in Sweden. These are facts — and that is precisely why one may wonder why leading Swedish politicians and criminologists have not considered it important or even particularly interesting to statistically establish the ethnicity of the perpetrators. The reason given — not least by Sarnecki — is that they BELIEVE they know that the main causal factors are socio-economic, and that a focus on ethnicity would only play into the hands of racism and xenophobia.

This attempt at explaining things away is nothing strange or unusual — at least when it comes to politics and the media, where reasoning built on limping logic and half-truths is a daily occurrence. It is more remarkable, and more open to criticism, when researchers indulge in the same thing.

For most social phenomena there are mechanisms and causal chains that to a large extent can ultimately be traced back to socio-economic factors. That is most likely also true of violent crime, and more specifically of rape. But this by no means implies that, in a statistical regression analysis that ‘holds constant’ socio-economic variables, one could in any causal sense completely conjure away other important factors such as ethnicity, culture, etc.

And this is the heart of the matter! Socio-economic factors ARE important. But so are other factors. That these may in some sense be perceived as ‘sensitive’ to map is no defence for turning a blind eye to them in scientific contexts — something that ought to be self-evident to every researcher and public official.

Sarnecki, not least, has for a long time and on repeated occasions categorically claimed that rape can only be understood and explained as the result of socio-economic factors. No unambiguous evidence-based research results exist, however, that could justify this certainty.

Claiming that there may be other ‘explanatory factors’ — such as ethnicity and culture — is branded ‘dangerous.’ This is far from the first time in history that new knowledge, data and scientific theories have been questioned out of fear that they might have negative societal consequences (Galileo’s and Darwin’s new facts and knowledge about astronomy and evolution were first met with objections and demands for secrecy from the establishment of their day).

‘Facts kick,’ as Gunnar Myrdal used to say. Choosing, out of fear that facts may be misused, to black out information about large and important societal problems such as crime and violence is completely unacceptable. It is a betrayal both of society at large and of the people who are subjected to the crime and violence.

More — not less — facts and knowledge is a precondition for effectively reducing the incidence of violence and crime in our society. A society must trust its citizens’ ability to handle information. The absence of that trust is something we associate with authoritarian societies. In a democracy, information is not covered up!

Friedman-Savage and Keynesian uncertainty

14 Sep, 2020 at 15:50 | Posted in Economics | 7 Comments


An objection to the hypothesis just presented that is likely to be raised by many … is that it conflicts with the way human beings actually behave and choose. … Is it not patently unrealistic to suppose that individuals … base their decision on the size of the expected utility?

While entirely natural and understandable, this objection is not strictly relevant … The hypothesis asserts rather that, in making a particular class of decisions, individuals behave as if they calculated and compared expected utility and as if they knew the odds. The validity of this assertion … depends solely on whether it yields sufficiently accurate predictions about the class of decisions with which the hypothesis deals.

M Friedman & L J Savage

‘Modern’ macroeconomics — Dynamic Stochastic General Equilibrium, New Synthesis, New Classical and New ‘Keynesian’ — still follows the Friedman-Savage ‘as if’ logic of denying the existence of genuine uncertainty and treats variables as if drawn from a known ‘data-generating process’ with a known probability distribution that unfolds over time and on which we therefore have access to heaps of historical time-series. If we do not assume that we know the ‘data-generating process’ – if we do not have the ‘true’ model – the whole edifice collapses. And of course, it has to. Who really honestly believes that we have access to this mythical Holy Grail, the data-generating process?

‘Modern’ macroeconomics obviously did not anticipate the enormity of the problems that unregulated ‘efficient’ financial markets created. Why? Because it builds on the myth of us knowing the ‘data-generating process’ and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

This is like saying that you are going on a holiday trip and that you know that the chance of the weather being sunny is at least 30% and that this is enough for you to decide whether or not to bring along your sunglasses. You are supposed to be able to calculate the expected utility based on the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.

But as Keynes convincingly argued in his monumental Treatise on Probability (1921), this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10% and according to another – equally good – model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to.

In the end, this is what it all boils down to. We all know that many activities, relations, processes and events are of the Keynesian uncertainty-type. The data do not unequivocally single out one decision as the only ‘rational’ one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.
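A toy illustration (my own, with hypothetical probabilities and payoff numbers) of why this matters for decisions: when two equally good models disagree about the chance of sun, the expected-utility calculus recommends opposite actions, so it does not single out one ‘rational’ choice.

```python
# Two equally credible models disagree about the probability of sunny weather.
p_sunny_by_model = {"model A": 0.10, "model B": 0.40}   # hypothetical numbers

# Hypothetical utilities: carrying sunglasses costs a little if it rains and pays
# off if it is sunny; leaving them at home is costly only if the sun comes out.
utility = {("bring", "sun"): 3, ("bring", "rain"): -1,
           ("leave", "sun"): -2, ("leave", "rain"): 0}

def expected_utility(action, p_sun):
    return p_sun * utility[(action, "sun")] + (1 - p_sun) * utility[(action, "rain")]

for name, p in p_sunny_by_model.items():
    scores = {a: round(expected_utility(a, p), 2) for a in ("bring", "leave")}
    print(name, scores, "-> best:", max(scores, key=scores.get))

# model A says leave the sunglasses at home, model B says bring them along:
# with no way to put an exact number on the probability, 'maximise expected
# utility' gives no unique answer, and uncertainty cannot be reduced to risk.
```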

Some macroeconomists, however, still want to be able to use their hammer. So they — like Friedman and Savage — decide to pretend that the world looks like a nail, and pretend that uncertainty can be reduced to risk. So they construct their mathematical models on that assumption. The result: financial crises and economic havoc.

How much better it would be – and how much smaller the risk of lulling ourselves into the comforting thought that we know everything, that everything is measurable and that we have everything under control – if we could instead just admit that we often simply do not know, and that we have to live with that uncertainty as best we can.

Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheels, is a sure recipe for only one thing – economic catastrophe!

Identitarian madness and ethnic statistics

12 Sep, 2020 at 19:16 | Posted in Politics & Society | Leave a comment

School choice and segregation

12 Sep, 2020 at 16:05 | Posted in Education & School | 1 Comment

Social and ethnic segregation is a very concrete problem, visible in our schools and in the places where children move about and meet – or do not meet. That is why, in this report, we want to show what school segregation looks like at the local level. By studying all 30 municipalities with 50,000–100,000 inhabitants, we want to illustrate the everyday reality of children, but also of teachers, principals and municipal school administrations, across Sweden: powerful segregation … Of a total of 356 lower secondary schools in the 30 municipalities studied, 147 (that is, four out of ten) are segregated … Of the year-nine pupils attending segregated schools with a high share of Swedish-born and/or highly educated parents, 49 per cent attend independent schools …

According to both international and Swedish research, the opportunity to choose a school is used mainly by the more well-established in society, which in Sweden most often means Swedish-born people with a higher level of education. This segregation is usually caused by more well-established families opting out of certain schools and groups of people. Although we have focused on independent schools in this report, it is important to remember that the same mechanism, opting out of undesirable schools and pupil groups, makes municipal schools segregated in the same way … Our report shows that today’s Swedish school system works in the opposite direction and reinforces socio-economic and educational differences between pupils – it divides and breaks down more than it unites and equips.

Per Kornhall & German Bender

The segregation that the ‘free’ school choice and the municipalisation of schools have given rise to is bad enough in itself, but the consequences we can see in society when pupils with the same ethnic and socio-economic background are concentrated in certain schools are even more devastating. School results differ more and more between schools and pupils, and perhaps worst of all, the differences keep growing. The mechanisms behind this are many (absent peer-group effects, unequal resource allocation, etc.), but after having spent a few weeks out in schools, yours truly has been able to see on the ground that things really are in a bad way at several of them.

Schools are supposed to be inclusive and compensatory. But that is not what reality looks like. We live more and more in parallel societies, with parallel school worlds to match. The glue that schools once provided for creating belonging and community is no longer there. Schools where the majority of pupils have a different ethnic and socio-economic background than one’s own are shunned. People seek out schools where the pupils have the same background as themselves. Like seeks like.

Obviously, it can only be seen as a colossal failure when a school system with compensatory aspirations displays a pattern in which ethnicity and parents’ educational background have an ever greater impact on school results.

One is quickly struck by the fact that the schools with the largest and most difficult problems strikingly often do not have the most suitable teachers. Good teachers seek out the better-functioning schools and thereby further amplify the effects of segregation. Responsible politicians should of course see to it that this is remedied. It takes no rocket science to work out that working at the ‘difficult’ schools has to be made more attractive for ‘good’ teachers (higher pay, less teaching time, more time for planning, etc.).

The Education Act ought to include, as a self-evident opening provision, an obligation on school authorities to ensure that schools have a broad ethnic and socio-economic composition. That is not the case today. And that is a scandal.

School ought to be the fixed point in young people’s lives to which they can come and – like their teachers – temporarily withdraw from the storms of family and social life. School ought to be allowed to be an island in a world full of intense change. We all have several different identities, aspirations, backgrounds and dreams. But in school we ought to meet as equals. Different, but equal.

If school is to be able to catalyse and change, it must be something other than, and not identical with, its surroundings. In today’s society, school must be allowed to function as something different, an alternative to the eroding market forces that today threaten the fabric of society by reducing citizens to consumers. When neither family nor society offers resistance, school must be able to stand up for and safeguard the genuine emancipatory interests of the rising generation.

School is to foster knowledge-seeking citizens. A school built on religious, ethnic or profit-making motives is not a good school. School should meet pupils on the basis of what they can become, not of what they are.

Some really smart environmental tips …

11 Sep, 2020 at 17:18 | Posted in Varia | Leave a comment

Renaud

10 Sep, 2020 at 18:01 | Posted in Varia | Leave a comment

Michael Woodford on models

10 Sep, 2020 at 16:16 | Posted in Economics | 18 Comments

But I do not believe that the route to sounder economic reasoning will involve an abandonment of economists’ penchant for reasoning with the use of models. Models allow the internal consistency of a proposed argument to be checked with greater precision; they allow more finely-grained differentiation among alternative hypotheses, and they allow longer and more subtle chains of reasoning to be deployed without both author and reader becoming hopelessly tangled in them. Nor do I believe it is true that economists who are more given to the use of formal mathematical analysis are generally more dogmatic in their conclusions than those who customarily rely upon more informal styles of argument. Often, reasoning from formal models makes it easier to see how strong are the assumptions required for an argument to be valid, and how different one’s conclusions may be depending on modest changes in specific assumptions. And whether or not any given practitioner of economic modeling is inclined to honestly assess the fragility of his conclusions, the use of a model to justify those conclusions makes it easy for others to see what assumptions have been relied upon, and hence to challenge them. As a result, the resort to argumentation based on models facilitates the general project of critical inquiry that represents, in my view, our best hope for some eventual approach toward truth.

Michael Woodford

This is — sad to say — a rather typical view among mainstream economists today. Defending the use of unrealistic and unsubstantiated models with the argument that models make it “easy for others to see what assumptions have been relied upon, and hence to challenge them” is rather far-fetched. It’s like arguing: “We can’t understand what is going on in our complex and uncertain world, so let us set up ‘small-world’ models in which we assume away the complexities and reduce genuine uncertainty to calculable risk, and then let us, with precision and rigour, look at those assumptions and challenge them.” Yours truly fails to see the point. 

Mainstream economic theory today is in the story-telling business whereby economic theorists create make-believe analogue models of the target system – usually conceived as the real economic system. This modeling activity is considered useful and essential. And it’s used both in micro- and macroeconomics. Since everything the economist wants to know is put into the model, it’s a piece of cake to prove whatever one likes in a ‘rigorous’ and valid way. Deductive certainty is achieved — in the model. Unfortunately, the price one has to pay for getting ‘rigorous’ and precise results in this way is making outright ridiculous assumptions that actually impair the possibility of having anything of interest to say about the real world.

Since fully-fledged experiments on a societal scale as a rule are prohibitively expensive, ethically indefensible or unmanageable, economic theorists have to go for something else. To understand and explain relations between different entities in the real economy the predominant strategy is to build models — the preferred stand-in for real experiments — and make things happen in these ‘analogue-economy models’ rather than engineering things happening in real economies.

Mainstream economics has long since given up on the real world and contents itself with proving things about thought-up worlds. Empirical evidence only plays a minor role in economic theory, where models largely function as a substitute for empirical evidence. The one-sided, almost religious, insistence on axiomatic-deductivist modeling as the only scientific activity worthy of pursuing in economics is a scientific cul-de-sac.

Avoiding logical inconsistencies is crucial in all science. But it is not enough. Just as important is avoiding factual inconsistencies. And without showing — or at least warrantedly arguing — that the assumptions and premises of their models are in fact true, mainstream economists aren’t really reasoning, but only playing games. Formalistic deductive ‘Glasperlenspiel’ can be very impressive and seductive. But in the realm of science it ought to be considered of little or no value to simply make claims about the model and lose sight of reality.

Mainstream theoretical economics is still under the spell of the Bourbaki tradition in mathematics. Theoretical rigour is everything. Studying real-world economies and empirical corroboration/falsification of theories and models is nothing. Separating questions of logic and empirical validity may — of course — help economists to focus on producing rigorous and elegant mathematical theorems that Woodford et consortes consider “progress in economic thinking.” To most other people, not being concerned with empirical evidence and model validation is a sign of social science becoming totally useless and irrelevant. Economic theories built on assumptions known to be ridiculously artificial, without an explicit relationship with the real world, are a dead end. That’s probably also the reason why Neo-Walrasian general equilibrium analysis today (at least outside Chicago) is considered a total waste of time. In the trade-off between relevance and rigour, priority should always be given to the former when it comes to social science. The only thing followers of the Bourbaki tradition within economics — like von Neumann, Debreu, Lucas, and Sargent — have given us is irrelevant model abstractions with no bridges to real-world economies. It’s difficult to find a more poignant example of a total waste of time in science.

If the real world is fuzzy, vague and indeterminate, then why should our models build upon a desire to describe it as precise and predictable? The logic of idealization is a marvellous tool in mathematics and axiomatic-deductivist systems, but a poor guide for action in real-world systems, where concepts and entities often are without clear boundaries and continually interact and overlap.

Being told that the model is rigorous and amenable to ‘successive approximations’ to reality is of little avail, especially when the law-like (nomological) core assumptions are highly questionable. Being able to construct ‘thought-experiments’ depicting logical possibilities does not take us very far. An obvious problem with mainstream economic models is that they are formulated in such a way that they are, realiter, extremely difficult to empirically test and decisively ‘corroborate’ or ‘falsify.’

Contrary to Woodford, I would argue such models have — from an explanatory point of view — no value at all. The ‘thinness’ is bought at too high a price unless you decide to leave the intended area of application unspecified or immunize your model by interpreting it as nothing more than a set of assumptions making up a content-less theoretical system with no connection whatsoever to reality.

On war and economics

9 Sep, 2020 at 12:31 | Posted in Economics | 3 Comments

They soon found out how difficult the subject was, and felt justified in evading the problem by again directing their principles and systems only to physical matters and unilateral activity. As in the science concerning the preparations for war, they wanted to reach a set of sure and positive conclusions and for that reason considered only factors that could be mathematically calculated …

It is only analytically that these attempts at theory can be called advances in the realm of truth; synthetically, in the rules and regulations they offer, they are absolutely useless. They aim at fixed values; but in war everything is uncertain, and calculations have to be made with variable quantities. They direct the inquiry exclusively toward physical quantities, whereas all military action is intertwined with psychological forces and effects. They consider only unilateral action, whereas war consists of a continuous interaction of opposites.

Models may help us think through problems. But we should never forget that the formalism we use in our models is not self-evidently transportable to a largely unknown and uncertain reality. The tragedy with mainstream economic theory — and the ‘theorists’ that Clausewitz criticised — is that it thinks that the logic and mathematics used are sufficient for dealing with our real-world problems. They are not. Model deductions based on questionable assumptions can never be anything but pure exercises in hypothetical reasoning. And that kind of reasoning cannot establish the truth value of facts. Never has. Never will.

Does it — really — take a model to beat a model? No!

8 Sep, 2020 at 21:37 | Posted in Economics | 9 Comments

Many economists respond to criticism by saying that ‘all models are wrong’ … But the observation that ‘all models are wrong’ requires qualification by the second part of George Box’s famous aphorism — ‘but some are useful’ … The relevant criticism of models in macroeconomics and finance is not that they are ‘wrong’ but that they have not proved useful in macroeconomics and have proved misleading in finance.

When we provide such a critique, we often hear another mantra to which many economists subscribe: ‘It takes a model to beat a model.’ On the contrary, we believe that it takes facts and observations to beat a model … If a model fails to answer the problem to which it is addressed, it should be put back in the toolbox … It is not necessary to have an alternative tool available to know that the plumber who arrives armed only with a screwdriver is not the tradesman we need.

A similar critique yours truly sometimes encounters is that as long as I cannot come up with an alternative model of my own to replace the failing mainstream models, I shouldn’t expect people to pay attention.

This is, however, not only wrong for the reasons given by Kay and King, but is also to utterly misunderstand the role of philosophy and methodology of economics!

As John Locke wrote in An Essay Concerning Human Understanding:

The Commonwealth of Learning is not at this time without Master-Builders, whose mighty Designs, in advancing the Sciences, will leave lasting Monuments to the Admiration of Posterity; But every one must not hope to be a Boyle, or a Sydenham; and in an Age that produces such Masters, as the Great-Huygenius, and the incomparable Mr. Newton, with some other of that Strain; ’tis Ambition enough to be employed as an Under-Labourer in clearing Ground a little, and removing some of the Rubbish, that lies in the way to Knowledge.

That’s what philosophy and methodology can contribute to economics — clearing obstacles to science by clarifying limits and consequences of choosing specific modelling strategies, assumptions, and ontologies.

‘It takes a model to beat a model’ has to be one of the stupider things, in a pretty crowded field, to come out of economics. … I don’t get it. If a model is demonstrably wrong, that should surely be sufficient for rejection. I’m thinking of bridge engineers: ‘look I know they keep falling down but I’m gonna keep building them like this until you come up with a better way, OK?’

Jo Michell

On ergodicity and epistemological vs. ontological uncertainty

7 Sep, 2020 at 17:39 | Posted in Economics | 3 Comments

A couple of years ago yours truly had a discussion on the real-world economics review blog with Paul Davidson on ergodicity and the differences between Knight and Keynes regarding uncertainty. It all started with me commenting on Davidson’s article Is economics a science? Should economics be rigorous?:

LPS:

Davidson’s article is a nice piece – but ergodicity is a difficult concept that many students of economics have problems understanding. To understand real-world ”non-routine” decisions and unforeseeable changes in behaviour, ergodic probability distributions are of no avail. In a world full of genuine uncertainty – where real historical time rules the roost – the probabilities that ruled the past are not those that will rule the future.

Time is what prevents everything from happening at once. To simply assume that economic processes are ergodic and concentrate on ensemble averages – and a fortiori in any relevant sense timeless – is not a sensible way for dealing with the kind of genuine uncertainty that permeates open systems such as economies.

When you assume economic processes to be ergodic, ensemble and time averages are identical. Let me give an example: Assume we have a market with an asset priced at 100 €. Then imagine the price first goes up by 50% and then later falls by 50%. The ensemble average for this asset would be 100 € – because we here envision two parallel universes (markets), where in one universe (market) the asset price falls by 50% to 50 €, and in the other universe (market) it goes up by 50% to 150 €, giving an average of 100 € ((150+50)/2). The time average for this asset would be 75 € – because we here envision one universe (market) where the asset price first rises by 50% to 150 € and then falls by 50% to 75 € (0.5*150).

From the ensemble perspective nothing really, on average, happens. From the time perspective lots of things really, on average, happen.

Assuming ergodicity there would have been no difference at all.
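Here is a minimal Python sketch (my own rendering of the arithmetic in the example above, taking the ensemble over full two-step paths rather than a single step, which also gives 100 €) that contrasts the ensemble view with the time view.

```python
from itertools import product

START = 100.0
FACTORS = (1.5, 0.5)   # +50% or -50%, each with probability 1/2

# Ensemble view: average the end price over every equally likely two-step path.
paths = list(product(FACTORS, repeat=2))
ensemble_avg = sum(START * a * b for a, b in paths) / len(paths)   # = 100.0

# Time view: follow ONE market that first rises by 50% and then falls by 50%.
time_path_end = START * 1.5 * 0.5                                  # = 75.0

# Per-step growth factor along such a path (the 'time average' of growth):
time_avg_growth = (1.5 * 0.5) ** 0.5                               # about 0.866, i.e. < 1

print(ensemble_avg, time_path_end, time_avg_growth)
```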

Just in case you think this is just an academic quibble without repercussions for our real lives, let me quote from an article by the physicist and mathematician Ole Peters in the Santa Fe Institute Bulletin from 2009 – “On Time and Risk” – which makes it perfectly clear that the flaw in thinking about uncertainty in terms of “rational expectations” and ensemble averages has had real repercussions on the functioning of the financial system:

“In an investment context, the difference between ensemble averages and time averages is often small. It becomes important, however, when risks increase, when correlation hinders diversification, when leverage pumps up fluctuations, when money is made cheap, when capital requirements are relaxed. If reward structures—such as bonuses that reward gains but don’t punish losses, and also certain commission schemes—provide incentives for excessive risk, problems arise. This is especially true if the only limits to risk-taking derive from utility functions that express risk preference, instead of the objective argument of time irreversibility. In other words, using the ensemble average without sufficiently restrictive utility functions will lead to excessive risk-taking and eventual collapse. Sound familiar?”

PD:

Lars, if the stochastic process is ergodic, then for an infinite realization the time and space (ensemble) averages will coincide. An ensemble is a set of samples drawn at a fixed point of time from a universe of realizations. For finite realizations, the time and space statistical averages tend to converge (with a probability of one) the more data one has.

Even in physics, there are some processes that physicists recognize are governed by nonergodic stochastic processes. [See A. M. Yaglom, An Introduction to Stationary Random Functions (Prentice Hall, 1962).]

I do object to the quoted passage from Ole Peters where he talks about “when risks increase”. Nonergodic systems are not about increasing or decreasing risk in the sense of the probability distribution variances differing. The point is that any probability distribution based on past data cannot be reliably used to indicate the probability distribution governing any future outcome. In other words, even if we could know that the future probability distribution will have a smaller variance (“lower risk”) than the past calculated probability distribution, the past distribution would still not be a reliable guide to future statistical means and other moments around the means.

LPS:

Paul, regarding nonergodic processes in physics I would even say that most processes definitely are nonergodic. Regarding Ole Peters, I totally agree that what is important about the fact that real social and economic processes are nonergodic is that uncertainty – not risk – rules the roost. That was something both Keynes and Knight basically said in their 1921 books. But I still think that Peters’ discussion is a good example of how thinking about uncertainty in terms of “rational expectations” and “ensemble averages” has had seriously bad repercussions on the financial system.

PD:

Lars, there is a difference between the uncertainty concept developed by Keynes and the one developed by Knight.

As I have pointed out, Keynes’s concept of uncertainty involves a nonergodic stochastic process. On the other hand, Knight’s uncertainty — like Taleb’s black swan — assumes an ergodic process. The difference is that for Knight (and Taleb) the uncertain outcome lies so far out in the tail of the unchanging (over time) probability distribution that it appears empirically to be [in Knight’s terminology] “unique”. In other words, like Taleb’s black swan, the uncertain outcome already exists in the probability distribution but is so rarely observed that it may take several lifetimes for one observation — making that observation “unique”.

In the latest edition of Taleb’s book, he was forced to concede that philosophically there is a difference between a nonergodic system and a black swan ergodic system – but he then waves away the problem with the claim that the difference is irrelevant.


Covid-19 — as Keynes said, we simply do not know!

7 Sep, 2020 at 14:03 | Posted in Economics | 2 Comments

The pandemic is shaking up economists. Will they be able to help us grasp the uncertainty of our collective future? The question has its roots in old debates about the very possibility of thinking about an uncertain future …

Exactly a century ago, in 1921, the American economist Frank Knight (1885-1972) laid the analytical foundations of contemporary economic theories of uncertainty by contrasting risk, which can be measured in terms of probabilities, with radical uncertainty, which is characterised by the impossibility of predicting or measuring it.

But 1921 also saw the appearance of a radical critique of this approach. John Maynard Keynes (1883-1946) published his doctoral dissertation in mathematics, on which he had been working for fifteen years. In it he demonstrates the methodological and philosophical impossibility, for economists, of constructing mathematical models that would allow them to theorise about the long run …

Over the past century, mathematical techniques have been refined, but the debate remains wide open. The great majority of economists consider that, outside situations of radical uncertainty, their forecasting models are reliable. But if we take to heart the lesson of Keynes, restated in his major work, The General Theory of Employment, Interest and Money (1936), there is ‘no basis on which it is possible to form a calculable probability,’ since ‘we simply do not know’ what will happen in the future.
‘We simply do not know’ … And what if current events proved him right?

Annie Cot/Le Monde

Like the present pandemic, the financial crisis of 2007-2008 also took most laymen and economists by surprise. What was it that went wrong with our macroeconomic models, since they obviously neither foresaw the collapse nor even made it conceivable?

There are many who have ventured to answer that question. And they have come up with a variety of answers, ranging from the exaggerated mathematization of economics, to irrational and corrupt politicians.

But the root of our problem goes much deeper. It ultimately goes back to how we look upon the data we are handling. In ‘modern’ macroeconomics — Dynamic Stochastic General Equilibrium, New Synthesis, New Classical and New ‘Keynesian’ — variables are treated as if drawn from a known “data-generating process” that unfolds over time and on which we therefore have access to heaps of historical time-series. If we do not assume that we know the ‘data-generating process’ – if we do not have the ‘true’ model – the whole edifice collapses. And of course it has to. I mean, who honestly believes that we should have access to this mythical Holy Grail, the data-generating process?

‘Modern’ macroeconomics obviously did not anticipate the enormity of the problems that unregulated ‘efficient’ financial markets created. Why? Because it builds on the myth of us knowing the ‘data-generating process’ and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

This is like saying that you are going on a holiday trip and that you know that the chance of the weather being sunny is at least 30%, and that this is enough for you to decide whether or not to bring along your sunglasses. You are supposed to be able to calculate the expected utility based on the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.

But as Keynes convincingly argued in his monumental Treatise on Probability, this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10% and according to another – equally good – model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to.

In the end this is what it all boils down to. We all know that many activities, relations, processes and events are of the Keynesian uncertainty-type. The data do not unequivocally single out one decision as the only ‘rational’ one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

Some macroeconomists, however, still want to be able to use their hammer. So they decide to pretend that the world looks like a nail, and pretend that uncertainty can be reduced to risk. So they construct their mathematical models on that assumption. The result: financial crises and economic havoc.

How much better it would be — and how much smaller the risk of lulling ourselves into the comforting thought that we know everything, that everything is measurable and that we have everything under control — if instead we could just admit that we often simply do not know, and that we have to live with that uncertainty as best we can.

Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheels, is a sure recipe for only one thing — economic disaster.
