26 Aug, 2018 at 20:22 | Posted in Varia | Comments Off on Prayer


Lill forever

26 Aug, 2018 at 20:16 | Posted in Varia | Comments Off on Lill forever


Technobabble economics

26 Aug, 2018 at 13:05 | Posted in Economics | Comments Off on Technobabble economics


Deductivist modeling endeavours and an overly simplistic use of statistical and econometric tools are sure signs of the explanatory hubris that still haunts mainstream economics.

In an interview Robert Lucas says

the evidence on postwar recessions … overwhelmingly supports the dominant importance of real shocks.

So, according to Lucas, changes in tastes and technologies should be able to explain the main fluctuations in, for example, the unemployment that we have seen during the last six or seven decades. But really — not even a Nobel laureate could in his wildest imagination come up with a warranted and justified explanation based solely on changes in tastes and technologies.

The Chicago übereconomist is simply wrong. But how do we protect ourselves from this kind of scientific nonsense? In The Scientific Illusion in Empirical Macroeconomics Larry Summers has a suggestion well worth considering — not least since it makes it easier to understand how mainstream economics actively contributes to causing economic crises rather than solving them:

A great deal of the theoretical macroeconomics done by those professing to strive for rigor and generality, neither starts from empirical observation nor concludes with empirically verifiable prediction …

The typical approach is to write down a set of assumptions that seem in some sense reasonable, but are not subject to empirical test … and then derive their implications and report them as a conclusion …

What then do these exercises teach us about the world? … If empirical testing is ruled out, and persuasion is not attempted, in the end I am not sure these theoretical exercises teach us anything at all about the world we live in …

Serious economists who respond to questions about how today’s policies will affect tomorrow’s economy by taking refuge in technobabble about how the question is meaningless in a dynamic games context abdicate the field to those who are less timid. No small part of our current economic difficulties can be traced to ignorant zealots who gained influence by providing answers to questions that others labeled as meaningless or difficult. Sound theory based on evidence is surely our best protection against such quackery.

Tractability, truth, and ignorability

25 Aug, 2018 at 15:37 | Posted in Statistics & Econometrics | 1 Comment

Most attempts at causal inference in observational studies are based on assumptions that treatment assignment is ignorable. Such assumptions are usually made casually, largely because they justify the use of available statistical methods and not because they are truly believed.

Marshall Joffe et al.

An interesting (but from a technical point of view rather demanding) article on a highly questionable assumption used in ‘potential outcome’ causal models. It made yours truly think of how tractability has come to override reality and truth also in modern mainstream economics.
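What the ignorability assumption rules out is easy to see in a small simulation. In the sketch below (all numbers invented purely for illustration), an unobserved confounder drives both treatment assignment and outcome, so assignment is not ignorable and the naive comparison of treated and untreated goes badly wrong:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Unobserved confounder U drives both treatment assignment and outcome,
# so treatment assignment is NOT ignorable given the observed data alone.
u = rng.normal(size=n)
treated = (u + rng.normal(size=n) > 0).astype(float)
y = 1.0 * treated + 2.0 * u + rng.normal(size=n)  # true treatment effect = 1.0

# Naive difference in means is badly biased, because the treated group
# has systematically higher U.
naive = y[treated == 1].mean() - y[treated == 0].mean()

# If U could be observed and adjusted for, regression recovers the truth.
X = np.column_stack([np.ones(n), treated, u])
beta = np.linalg.lstsq(X, y, rcond=None)[0]

print(f"naive estimate:    {naive:.2f}")    # far above the true effect of 1.0
print(f"adjusted estimate: {beta[1]:.2f}")  # close to 1.0
```

Only if the confounder could be observed and adjusted for does the regression recover the true effect; assuming ignorability amounts to assuming that no such unobserved U exists — which, as Joffe et al. note, is usually asserted rather than believed.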

Having a ‘tractable’ model is of course great, since it usually means that you can solve it. But using ‘simplifying’ tractability assumptions (rational expectations, common knowledge, representative agents, linearity, additivity, ergodicity, etc.) because the models otherwise could not be ‘manipulated’ or deliver ‘rigorous’ and ‘precise’ predictions and explanations does not exempt economists from having to justify their modelling choices. Being able to ‘manipulate’ things in models cannot per se be enough to warrant a methodological choice. If economists do not think their tractability assumptions make for good and realistic models, it is certainly fair to ask for clarification of the ultimate goal of the whole modelling endeavour.

Take for example the ongoing discussion on rational expectations as a modelling assumption. Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies are those based on rational expectations and representative-actor models. As yours truly has tried to show in On the use and misuse of theories and models in mainstream economics, there is really no support for this conviction at all. If microfounded macroeconomics has nothing to say about the real world and the economic problems out there, why should we care about it? The final court of appeal for macroeconomic models is not whether we — once we have made our tractability assumptions — can ‘manipulate’ them, but the real world. And as long as no convincing justification is put forward for how the inferential bridging is de facto made, macroeconomic model building is little more than hand-waving that gives us little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

Berkson’s paradox or why attractive people you date tend to be jerks

24 Aug, 2018 at 15:16 | Posted in Statistics & Econometrics | 4 Comments

Have you ever noticed that, among the people you date, the attractive ones tend to be jerks? Instead of constructing elaborate psychosocial theories, consider a simpler explanation. Your choice of people to date depends on two factors, attractiveness and personality. You’ll take a chance on dating a mean attractive person or a nice unattractive person, and certainly a nice attractive person, but not a mean unattractive person … This creates a spurious negative correlation between attractiveness and personality. The sad truth is that unattractive people are just as mean as attractive people — but you’ll never realize it, because you’ll never date somebody who is both mean and unattractive.
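The dating example is easy to reproduce in a few lines of code. In the sketch below (a minimal simulation; the 0.5 cut-offs are chosen arbitrarily), attractiveness and niceness are statistically independent in the population, but conditioning on the collider ‘you dated this person’ manufactures a negative correlation out of thin air:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Attractiveness and niceness are drawn independently: in the whole
# population there is no correlation between them.
attractive = rng.uniform(size=n)
nice = rng.uniform(size=n)

# Dating rule from the quote: you'll date anyone except people who are
# both mean AND unattractive. Conditioning on this is the collider bias.
dated = ~((attractive < 0.5) & (nice < 0.5))

corr_all = np.corrcoef(attractive, nice)[0, 1]
corr_dated = np.corrcoef(attractive[dated], nice[dated])[0, 1]

print(f"correlation, everyone:     {corr_all:+.2f}")    # close to zero
print(f"correlation, people dated: {corr_dated:+.2f}")  # clearly negative
```

No psychosocial theory needed: the negative correlation is created entirely by the selection rule, not by anything about attractive people.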

Wage discrimination and the dangers of ‘controlling for’ confounders

24 Aug, 2018 at 09:08 | Posted in Economics, Statistics & Econometrics | Comments Off on Wage discrimination and the dangers of ‘controlling for’ confounders

You see it all the time in studies. “We controlled for…” And then the list starts. The longer the better. Income. Age. Race. Religion. Height. Hair color. Sexual preference. Crossfit attendance. Love of parents. Coke or Pepsi. The more things you can control for, the stronger your study is — or, at least, the stronger your study seems. Controls give the feeling of specificity, of precision. But sometimes, you can control for too much. Sometimes you end up controlling for the thing you’re trying to measure …

The problem with controls is that it’s often hard to tell the difference between a variable that’s obscuring the thing you’re studying and a variable that is the thing you’re studying.

An example is research around the gender wage gap, which tries to control for so many things that it ends up controlling for the thing it’s trying to measure. As my colleague Matt Yglesias wrote:

“The commonly cited statistic that American women suffer from a 23 percent wage gap through which they make just 77 cents for every dollar a man earns is much too simplistic. On the other hand, the frequently heard conservative counterargument that we should subject this raw wage gap to a massive list of statistical controls until it nearly vanishes is an enormous oversimplification in the opposite direction. After all, for many purposes gender is itself a standard demographic control to add to studies — and when you control for gender the wage gap disappears entirely! The question to ask about the various statistical controls that can be applied to shrink the gender gap is what are they actually telling us. The answer, I think, is that it’s telling how the wage gap works.”

Take hours worked, which is a standard control in some of the more sophisticated wage gap studies. Women tend to work fewer hours than men. If you control for hours worked, then some of the gender wage gap vanishes. As Yglesias wrote, it’s “silly to act like this is just some crazy coincidence. Women work shorter hours because as a society we hold women to a higher standard of housekeeping, and because they tend to be assigned the bulk of childcare responsibilities.”

Controlling for hours worked, in other words, is at least partly controlling for how gender works in our society. It’s controlling for the thing that you’re trying to isolate.

Ezra Klein
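The hours-worked point can be illustrated with a toy simulation (all effect sizes below are made up purely for illustration): gender affects wages both directly and indirectly via hours worked, and ‘controlling for’ hours strips out the indirect path — part of the very gap one set out to measure:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Toy model: gender affects wages both directly ("discrimination", -3)
# and indirectly via hours worked (fewer hours -> lower pay).
female = rng.integers(0, 2, size=n).astype(float)
hours = 40.0 - 5.0 * female + rng.normal(0, 2, size=n)        # gender -> hours
wage = 2.0 * hours - 3.0 * female + rng.normal(0, 3, size=n)  # both paths

# Raw gap captures the total effect: direct (3) plus via hours (2 * 5 = 10).
raw_gap = wage[female == 0].mean() - wage[female == 1].mean()

# 'Controlling for' hours removes the indirect path -- i.e. part of the
# very thing we wanted to measure -- leaving only the direct effect.
X = np.column_stack([np.ones(n), hours, female])
beta = np.linalg.lstsq(X, wage, rcond=None)[0]
controlled_gap = -beta[2]

print(f"raw wage gap:                {raw_gap:.1f}")         # about 13
print(f"gap after controlling hours: {controlled_gap:.1f}")  # about 3
```

Whether the raw or the controlled number is ‘the’ gap depends entirely on the question asked: if shorter hours are themselves a product of how gender works in society, controlling for them answers a much narrower question than the one most people care about.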

The gender pay gap is a fact that, sad to say, to a non-negligible extent is the result of discrimination. And even though many women are not deliberately discriminated against, but rather self-select into lower-wage jobs, this in no way magically explains away the discrimination gap. As decades of socialization research have shown, women may be ‘structural’ victims of impersonal social mechanisms that in different ways aggrieve them. Wage discrimination is unacceptable. Wage discrimination is a shame.

The cost of focusing on general equilibrium theory

24 Aug, 2018 at 07:34 | Posted in Economics | 2 Comments

The largest problem with the economics profession’s focus on general equilibrium theory is the opportunity costs of that exploration. Important policy problems are not addressed. Consider Pareto optimality and the welfare theorems, which Fisher sees as the underpinnings of Western capitalism. In a world, such as ours, where property rights cannot be allocated effortlessly and costlessly, economists’ welfare theorems have little policy relevance. Does a policy maker care whether any Pareto efficient allocation can be decentralized as a competitive general equilibrium? The chance of discovering a real-world Pareto optimal policy that can be shown to harm no one in some infinitesimal way is essentially nil.

By focusing their theoretical policy analysis on Pareto optimal policies, economists avoid coming to grips theoretically with the messy value judgments that must be made in the policy space, which means that their theoretical models provide little guidance on how to deal with the messy problems of actual policy that are designed to achieve both efficiency and fairness. In their almost exclusive focus on efficiency, modern economists have moved away from the Classical economists’ utilitarian moral philosophy that underlay their support of markets. Classical economists supported markets because they worked reasonably well in the real world, not because of any deductive proof of the benefits of markets.

Dave Colander

Reformulating the economics curriculum

23 Aug, 2018 at 07:49 | Posted in Economics | 2 Comments

Having gone through a handful of the most frequently used textbooks of economics at the undergraduate level today, I can only conclude that the models that are presented in these modern mainstream textbooks try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent.

That is, with something that has absolutely nothing to do with reality. And — worse still — something that is not even amenable to the kind of general equilibrium analysis it is thought to give a foundation for, since Hugo Sonnenschein (1972), Rolf Mantel (1976) and Gérard Debreu (1974) unequivocally showed that there exists no condition by which assumptions on individuals would guarantee either stability or uniqueness of the equilibrium solution.

So what modern economics textbooks present to students are really models built on the assumption that an entire economy can be modelled as a representative actor and that this is a valid procedure. But it is not, as the Sonnenschein-Mantel-Debreu theorem irrevocably has shown.

Of course, one could say that showing why the procedure is justified is too difficult at the undergraduate level, and that the proof should be deferred to master’s and doctoral courses. That way of reasoning would be defensible — if what you teach your students were true, if the Law of Demand were generalizable to the market level, and if the representative actor were a valid modelling abstraction. But in this case it is demonstrably known to be false, and therefore this is nothing but a case of scandalous intellectual dishonesty. It’s like telling your students that 2 + 2 = 5 and hoping that they will never run into Peano’s axioms of arithmetic.

For almost forty years mainstream economics itself has lived with a theorem that shows the impossibility of extending the microanalysis of consumer behaviour to the macro level (unless one makes patently and admittedly ridiculous assumptions). That the textbooks still, after all these years, pretend that this theorem does not exist — none of the textbooks I investigated even mentions the Sonnenschein-Mantel-Debreu theorem — is really outrageous.

Uppdrag granskning and the betrayal of rape victims

22 Aug, 2018 at 19:12 | Posted in Politics & Society | 3 Comments

Yours truly watched an interesting episode of SVT’s Uppdrag granskning today.

The programme dealt with the fact that men born abroad are heavily over-represented among those convicted of rape in Sweden — and, in particular, with why leading Swedish politicians and criminologists have not considered it important or especially interesting to establish the perpetrators’ ethnicity statistically. The reason invoked by politicians (such as Morgan Johansson) and researchers (such as Jerzy Sarnecki) is that they BELIEVE the main causal factors are socio-economic and that a focus on ethnicity would only play into the hands of racism and xenophobia.

This kind of attempted explaining-away is nothing strange or unusual — at least not in politics and the media, where that sort of reasoning, built on limping logic and half-truths, is daily fare. It is more remarkable, and more open to criticism, when researchers indulge in it as well.

For most social phenomena there are mechanisms and causal chains that can largely be traced back, ultimately, to socio-economic factors. In all likelihood this also holds for violent crime, and for rape specifically. But it by no means follows that, in for example a statistical regression analysis ‘holding constant’ socio-economic variables, one could in any causal sense conjure away, without remainder, other important factors such as ethnicity, culture, etc.

And this is the crux of the matter! Socio-economic factors ARE important. But so are other factors. That these may in some sense be considered ‘sensitive’ to map is no excuse for turning a blind eye to them in scientific contexts — something that ought to be obvious even to the Swedish National Council for Crime Prevention (Brå) and Jerzy Sarnecki.

Sarnecki in particular has long and repeatedly asserted with complete confidence that rape can only be understood and explained as the result of socio-economic factors. Yet no unambiguous evidence-based research results exist that could ground such certainty.

Claiming that other factors — such as ethnicity and culture — may be at work is branded as ‘dangerous’. This is far from the first time in history that new knowledge, data and scientific theories have been questioned out of fear of their possible negative social consequences (Galileo’s and Darwin’s new facts and insights about astronomy and evolution were first met with objections and demands for secrecy from the establishments of their day).

‘Facts kick’, as Gunnar Myrdal used to say. Choosing, out of fear that facts may be misused, to black out information about large and important social problems such as rape is completely unacceptable. It is a betrayal both of society at large and of the victims of these crimes.

More — not less — facts and knowledge are a precondition for effectively reducing the incidence of rape and other crimes in our society. A society must have confidence in its citizens’ ability to handle information. The absence of that confidence is something we associate with authoritarian societies. In a democracy, information is not blacked out!

August 21, 1968 — a date which will live in infamy

21 Aug, 2018 at 14:33 | Posted in Politics & Society | Comments Off on August 21, 1968 — a date which will live in infamy



Donald Rubin on randomization and observational studies

21 Aug, 2018 at 12:48 | Posted in Theory of Science & Methodology | Comments Off on Donald Rubin on randomization and observational studies



20 Aug, 2018 at 15:15 | Posted in Economics | 1 Comment

Unfortunately, nothing is more dangerous than dogmas donned with scientific feathers. The current crisis might offer an excellent occasion for a paradigm change, previously called for by prominent economists like John Maynard Keynes, Alan Kirman and Steve Keen. They have forcefully highlighted the shortcomings and contradictions of the classical economic theory, but progress has been slow. The task looks so formidable that some economists argue that it is better to stick with the implausible but well-corseted theory of perfectly rational agents than to venture into modelling the infinite number of ways agents can be irrational.

Physicists, however, feel uncomfortable with theories not borne out by (or even blatantly incompatible with) empirical data. But could the methodology of physics really contribute to the much-awaited paradigm shift in economics? …

Econophysics is in fact a misnomer, since most of its scope concerns financial markets. To some economists, finance is a relatively minor subfield and any contribution, even the most significant, can only have a limited impact on economics science at large. I personally strongly disagree with this viewpoint: recent events confirm that hiccups in the financial markets can cripple the entire economy.

From a more conceptual point of view, financial markets are an ideal laboratory for testing several fundamental concepts of economics. Are prices really such that supply matches demand? Are price moves primarily due to news? (The answer to both these questions seems to be a clear “no”) … As I will try to illustrate, the very choice of the relevant questions, which ultimately leads to a deeper understanding of the data, is often sheer serendipity: more of an art than a science. That intuition, it seems to me, is well nurtured by an education in the natural sciences, where the emphasis is on mechanisms and analogies, rather than on axioms and theorem proving.

Jean-Philippe Bouchaud

Steven Pinker — a cherry-picking Panglossian

20 Aug, 2018 at 12:50 | Posted in Politics & Society | 5 Comments


Why most published research findings are false

19 Aug, 2018 at 10:45 | Posted in Statistics & Econometrics | Comments Off on Why most published research findings are false

Instead of chasing statistical significance, we should improve our understanding of the range of R values — the pre-study odds — where research efforts operate. Before running an experiment, investigators should consider what they believe the chances are that they are testing a true rather than a non-true relationship. Speculated high R values may sometimes then be ascertained … Large studies with minimal bias should be performed on research findings that are considered relatively established, to see how often they are indeed confirmed. I suspect several established “classics” will fail the test.

Nevertheless, most new discoveries will continue to stem from hypothesis-generating research with low or very low pre-study odds. We should then acknowledge that statistical significance testing in the report of a single study gives only a partial picture, without knowing how much testing has been done outside the report and in the relevant field at large. Despite a large statistical literature for multiple testing corrections, usually it is impossible to decipher how much data dredging by the reporting authors or other research teams has preceded a reported research finding. Even if determining this were feasible, this would not inform us about the pre-study odds. Thus, it is unavoidable that one should make approximate assumptions on how many relationships are expected to be true among those probed across the relevant research fields and research designs.

John P. A. Ioannidis
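The pre-study odds R that Ioannidis refers to translate directly into the probability that a ‘significant’ finding is actually true. A small sketch of his formula, PPV = (1 − β)R / ((1 − β)R + α), evaluated at the conventional α = 0.05 and power 0.8:

```python
def ppv(R, alpha=0.05, power=0.8):
    """Post-study probability that a 'significant' finding is true,
    given pre-study odds R, significance level alpha and power (1 - beta).
    This is the core formula of Ioannidis (2005):
        PPV = (1 - beta) * R / ((1 - beta) * R + alpha)
    """
    return power * R / (power * R + alpha)

# The lower the pre-study odds of the relationships being probed,
# the more likely a published 'discovery' is false.
for R in (1.0, 0.1, 0.01):
    print(f"pre-study odds R = {R:>4}:  PPV = {ppv(R):.2f}")
```

With pre-study odds of 1 in 100, fewer than one in seven ‘discoveries’ is actually true even before bias and data dredging are taken into account — which is the arithmetic behind the title of Ioannidis’s paper.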

The rational expectations hoax

18 Aug, 2018 at 16:29 | Posted in Economics | Comments Off on The rational expectations hoax

A lot of mainstream economists still stick with ‘rational expectations’ since they think it has not yet been disconfirmed. They are, of course, entitled to have whatever views they like — after all, it is, to say the least, difficult to empirically disconfirm the non-existence of Gods …

But for the rest of us, let’s see how rational expectations really fare as an empirical assumption. Empirical efforts at testing the correctness of the hypothesis have resulted in a series of studies that have more or less concluded that it is not consistent with the facts. In one of the more well-known and highly respected evaluation reviews, Michael Lovell (1986) concluded:

it seems to me that the weight of empirical evidence is sufficiently strong to compel us to suspend belief in the hypothesis of rational expectations, pending the accumulation of additional empirical evidence.

And this is how Nikolay Gertchev summarizes studies on the empirical correctness of the hypothesis:

More recently, it even has been argued that the very conclusions of dynamic models assuming rational expectations are contrary to reality: “the dynamic implications of many of the specifications that assume rational expectations and optimizing behavior are often seriously at odds with the data” (Estrella and Fuhrer 2002, p. 1013). It is hence clear that if taken as an empirical behavioral assumption, the RE hypothesis is plainly false; if considered only as a theoretical tool, it is unfounded and self-contradictory.

For more on the issue, permit me to self-indulgently recommend reading my article Rational expectations — a fallacious foundation for macroeconomics in a non-ergodic world in Real-World Economics Review no. 62.
