Liturgy of St. John Chrysostom

21 October, 2018 at 17:15 | Posted in Varia | Leave a comment

 

Tchaikovsky’s masterpiece. Purity and grandeur. Poetry and spiritual beauty.


The Bayesian Trap

21 October, 2018 at 09:58 | Posted in Statistics & Econometrics | Leave a comment

 

That's just how we Malmö folk are!

21 October, 2018 at 09:52 | Posted in Varia | Leave a comment

Sometimes it feels as if I experience things that could only happen in Malmö. For example, when a guy in broad daylight, at a falafel joint, bellows "Jau ska hau en kebab! Baura kött! Inga grönsaker! Kött! Påmmes! Fetaost!" ["I'm having a kebab! Only meat! No vegetables! Meat! Fries! Feta cheese!", in broad Malmö dialect] in a voice that could belong to a dangerous political agitator — and nobody takes much notice. If anything, everyone in the queue nods knowingly: "there's a guy who knows what he wants."

Maria Maunsbach/SDS

Barfuß Am Klavier

20 October, 2018 at 11:51 | Posted in Varia | Leave a comment

 

Yours truly. Café Einstein Berlin 1988. Photo by barnilsson

Bengt Nilsson In Memoriam (personal)

19 October, 2018 at 20:40 | Posted in Varia | Leave a comment

Today has been one of the hardest days of my life.

My friend Bengt Nilsson was buried today at Norra Kyrkogården in Lund.

No words can really describe what you feel when your best friend of more than thirty years is suddenly no longer there. You remember the laughter, the joy, the thoughtfulness, the formidable intelligence, the drastic humour, and the strong social conscience.

Life goes on — but it will always be a little greyer, a little drearier, without you, my friend.

ABOUT ME

My name is Bengt Nilsson and I was born in 1956. The blog's title is simply the initials of my first names followed by nilsson; the reason is that my fairly common 1950s name was already taken. I am aware that the title can be changed, but I quickly noticed the double meaning, which fits rather well with my life in my youth.

I come from Kävlinge, a small town in Skåne just north of Lund. Since my early twenties I have lived in Lund, so geographically I have not moved very far. The mental move is all the more extensive, since it is hard to imagine greater opposites in that respect.

My school years in Kävlinge began in Harrie municipality, at the school in St. Harrie. When I started seventh grade and the new comprehensive lower secondary school of the Social Democratic welfare project made its entrance in Kävlinge too, it became the newly built Korsbackaskolan. (Harrie was incorporated into Kävlinge in the municipal reform of the sixties.)

Upper secondary school was spent in Lund, on the natural science programme at Katedralskolan.

After that I used the university as a source of knowledge, taking large numbers of individual courses without any ambition to make a career in the, to me repugnant, capitalist 'reality'. As I approached forty I obtained an upper secondary teaching degree, but after graduating I never applied for a job as a teacher. Quite simply, already during the so-called practical-pedagogical year I had had enough of the degenerated upper secondary school, and by luck I got an ordinary job.

What follows below are tests.

Testing an ordered list
Testing row two of the ordered list
Testing whether one can write that the Alliance government is rubbish
Noting that it worked perfectly well

Never trust a bourgeois
Never trust religions
Never trust that numbered lists are followed by the reader

Ekonomistas — everything from establishment economists' pre-chewed junk food to articles worth reading

18 October, 2018 at 21:34 | Posted in Education & School | 1 Comment

To determine what effects school choice has, it is of course essential to know what kind of schools families prefer. There are in principle two ways of gaining knowledge about this: you can ask families what kind of schools they appreciate, and you can observe which schools families actually choose. Which method you choose can give very different answers, which is manifested in an opinion piece by Almega and Friskolornas Riksförbund claiming that compulsory school choice is a way to reduce social school segregation.

The piece is based on a survey which, among other things, finds that parents with weak socioeconomic resources value schools with good academic results especially highly. Socioeconomically strong families, by contrast, value well-being more. In the report summary this result is claimed to be in line with what the international research shows, but that socioeconomically strong families would value schools' academic level less than socioeconomically weak ones is, to say the least, counterintuitive. It may therefore be useful to clarify what the international research on school choice has actually found.

A recent study from New York, currently being revised for the American Economic Review, finds that families seek out schools (in this case high schools) with high-performing students. This holds for all student groups examined, but the results show that students who are themselves high-performing are particularly inclined to choose schools with high-performing students. By contrast, there is nothing to suggest that families are particularly inclined to choose schools that manage their student intake well. That a school has a high 'value added' does not, in other words, attract more families to choose it. Nor are families more inclined to choose schools that, according to the study's statistical analysis, would suit their own children particularly well …

Similar results recur in other studies. A well-known one from North Carolina finds that families with strong socioeconomic backgrounds are particularly inclined to choose schools with good results, which thus largely captures the school's student composition. The study also finds that families from different minority groups have a clear preference for schools where their own minority group is well represented. Like, to a large extent, seeks like.

There is also a Swedish report that examines families' actual school choices (primary school). How a school performs on the national tests is not particularly important, whereas distance to the home matters. Furthermore, a high share of students with a foreign background makes a school less attractive, particularly for low-educated families with a Swedish background and highly educated families with a foreign background. Schools with many newly arrived students, however, are shunned above all by highly educated Swedish families. Families with a Swedish background, especially the highly educated, also appear keen to choose schools with socioeconomically strong students, something families with a foreign background appear to avoid. In general, families seem to seek out schools where the students belong to the same socioeconomic group as themselves. Like, once again, seeks like …

Rather than relying on what families report in surveys, it is desirable that studies of school choice be based on how families actually choose schools. After all, it is what families do, not what they say, that in the end matters for the consequences of school choice. Moreover, the design of the school choice system matters, and the arrangement of admitting students to popular independent schools based on time in the queue is in all likelihood something that reinforces the segregating effects of school choice. Regardless of how school choice is designed, however, the conclusion from the research on families' preferences must be that school choice can hardly be a particularly effective tool for breaking social school segregation.

Jonas Vlachos

As always well-informed, relevant and interesting from one of our few capable economists doing research on schools.

Ekonomistas mixes everything from establishment economists' pre-chewed junk food to eminently readable articles by researchers like Jonas Vlachos.

Sift and read!

The connection between cause and probability

18 October, 2018 at 15:07 | Posted in Statistics & Econometrics | 2 Comments

Causes can increase the probability of their effects; but they need not. And for the other way around: an increase in probability can be due to a causal connection; but lots of other things can be responsible as well …

The connection between causes and probabilities is like the connection between a disease and one of its symptoms: The disease can cause the symptom, but it need not; and the same symptom can result from a great many different diseases …

If you see a probabilistic dependence and are inclined to infer a causal connection from it, think hard about all the other possible reasons that that dependence might occur and eliminate them one by one. And when you are all done, remember — your conclusion is no more certain than your confidence that you really have eliminated all the possible alternatives.

Causality in social sciences — and economics — can never solely be a question of statistical inference. Causality entails more than predictability, and really explaining social phenomena in depth requires theory. Analysis of variation — the foundation of all econometrics — can never in itself reveal how these variations are brought about. Only when we are able to tie actions, processes or structures to the statistical relations detected can we say that we are getting at relevant explanations of causation.
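To see how easily a probabilistic dependence can arise without any causal connection between the correlated variables themselves, here is a minimal simulation (Python with numpy; all numbers purely illustrative): a common cause produces a solid correlation between two variables, neither of which has any effect on the other.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# A common cause Z drives both X and Y; X has no effect on Y.
z = rng.normal(size=n)
x = z + rng.normal(size=n)
y = z + rng.normal(size=n)

# X and Y are clearly dependent ...
print(f"corr(X, Y) = {np.corrcoef(x, y)[0, 1]:.2f}")   # about 0.5

# ... yet forcing X to any value would leave Y untouched, since Y depends only on Z.
```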

“Mediation analysis” is this thing where you have a treatment and an outcome and you’re trying to model how the treatment works: how much does it directly affect the outcome, and how much is the effect “mediated” through intermediate variables …

In the real world, it’s my impression that almost all the mediation analyses that people actually fit in the social and medical sciences are misguided: lots of examples where the assumptions aren’t clear and where, in any case, coefficient estimates are hopelessly noisy and where confused people will over-interpret statistical significance …

More and more I’ve been coming to the conclusion that the standard causal inference paradigm is broken … So how to do it? I don’t think traditional path analysis or other multivariate methods of the throw-all-the-data-in-the-blender-and-let-God-sort-em-out variety will do the job. Instead we need some structure and some prior information.

Andrew Gelman
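To make concrete what is being criticized here, this is roughly what a textbook linear mediation analysis estimates, sketched on simulated data (Python with numpy; hypothetical coefficients). The product-of-coefficients decomposition is mechanical; whether its assumptions hold, and whether the estimates are anything but noise in a real application, is exactly what Gelman questions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2_000

# Treatment T affects mediator M and outcome Y; M also affects Y.
t = rng.binomial(1, 0.5, size=n).astype(float)
m = 0.8 * t + rng.normal(size=n)               # T -> M
y = 0.5 * t + 1.0 * m + rng.normal(size=n)     # direct T -> Y plus M -> Y

# OLS of M on T gives the T -> M path; OLS of Y on T and M gives the rest.
a = np.linalg.lstsq(np.column_stack([np.ones(n), t]), m, rcond=None)[0][1]
coefs = np.linalg.lstsq(np.column_stack([np.ones(n), t, m]), y, rcond=None)[0]
direct, via_m = coefs[1], a * coefs[2]

print(f"direct effect   = {direct:.2f}")   # about 0.5
print(f"indirect effect = {via_m:.2f}")    # about 0.8 * 1.0 = 0.8
```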

Most facts have many different, possible, alternative explanations, but we want to find the best of all contrastive explanations (since all real explanation takes place relative to a set of alternatives). So which is the best explanation? Many scientists, influenced by statistical reasoning, think that the likeliest explanation is the best explanation. But the likelihood of x is not in itself a strong argument for thinking it explains y. I would rather argue that what makes one explanation better than another are things like aiming for and finding powerful, deep, causal features and mechanisms that we have warranted and justified reasons to believe in. Statistical reasoning — especially the variety based on a Bayesian epistemology — generally has no room for these kinds of explanatory considerations. The only thing that matters is the probabilistic relation between evidence and hypothesis. That is also one of the main reasons I find abduction — inference to the best explanation — a better description and account of what constitutes actual scientific reasoning and inference.

In the social sciences … regression is used to discover relationships or to disentangle cause and effect. However, investigators have only vague ideas as to the relevant variables and their causal order; functional forms are chosen on the basis of convenience or familiarity; serious problems of measurement are often encountered.

Regression may offer useful ways of summarizing the data and making predictions. Investigators may be able to use summaries and predictions to draw substantive conclusions. However, I see no cases in which regression equations, let alone the more complex methods, have succeeded as engines for discovering causal relationships.

David Freedman

Some statisticians and data scientists think that algorithmic formalisms somehow give them access to causality. That is, however, simply not true. Assuming ‘convenient’ things like faithfulness or stability is not to give proofs. It’s to assume what has to be proven. Deductive-axiomatic methods used in statistics do not produce evidence for causal inferences. The real causality we are searching for is the one existing in the real world around us. If there is no warranted connection between axiomatically derived theorems and the real world, well, then we haven’t really obtained the causation we are looking for.

If contributions made by statisticians to the understanding of causation are to be taken over with advantage in any specific field of inquiry, then what is crucial is that the right relationship should exist between statistical and subject-matter concerns …
The idea of causation as consequential manipulation is apt to research that can be undertaken primarily through experimental methods and, especially, to ‘practical science’ where the central concern is indeed with ‘the consequences of performing particular acts’. The development of this idea in the context of medical and agricultural research is as understandable as the development of that of causation as robust dependence within applied econometrics. However, the extension of the manipulative approach into sociology would not appear promising, other than in rather special circumstances … The more fundamental difficulty is that under the — highly anthropocentric — principle of ‘no causation without manipulation’, the recognition that can be given to the action of individuals as having causal force is in fact peculiarly limited.

John H. Goldthorpe

The bigger the hole in the roof, the better the view of the stars …

18 October, 2018 at 11:22 | Posted in Politics & Society | Leave a comment

 

When odds ratios mislead (wonkish)

17 October, 2018 at 19:10 | Posted in Statistics & Econometrics | Leave a comment

A few years ago, some researchers from Georgetown University published in the New England Journal of Medicine a study that demonstrated systematic race and sex bias in the behavior of America’s doctors. Needless to say, this finding was widely reported in the media:

Washington Post: “Physicians said they would refer blacks and women to heart specialists for cardiac catheterization tests only 60 percent as often as they would prescribe the procedure for white male patients.”

N.Y. Times: “Doctors are only 60% as likely to order cardiac catheterization for women and blacks as for men and whites.”

Now let's try a little test of reading comprehension. The study found that the referral rate for white men was 90.6%. What was the referral rate for blacks and women?

If you're like most literate and numerate people, you'll calculate 60% of 90.6%, and come up with .6*.906 = .5436. So, you'll reason, the referral rate for blacks and women was about 54.4%.

But in fact, what the study found was a referral rate for blacks and women of 84.7%.

What’s going on?

It’s simple — the study reported an “odds ratio”. The journalists, being as ignorant as most people are about odds and odds ratios, reported these numbers as if they were ratios of rates rather than ratios of odds.

Let's go through the numbers. If 90.6% of white males were referred, then 9.4% were not referred, and so a white male's odds of being referred were 90.6/9.4, or about 9.6 to 1. Since 84.7% of blacks and women were referred, 15.3% were not referred, and so for these folks, the odds of referral were 84.7/15.3 ≅ 5.5 to 1. The ratio of odds was thus about 5.5/9.6, or about 0.6 to 1. Convert to a percentage, and you've got “60% as likely” or “60 per cent as often”.

The ratio of odds (rounded to the nearest tenth) was truly 0.6 to 1. But when you report this finding by saying that “doctors refer blacks and women to heart specialists 60% as often as they would white male patients”, normal readers will take “60% as often” to describe a ratio of rates — even though in this case the ratio of rates (the “relative risk”) was 84.7/90.6, or (in percentage terms) about 93.5%.

Mark Liberman
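For anyone who wants to check the arithmetic, a few lines of Python (using the referral rates quoted above) reproduce both numbers and show how far apart the odds ratio and the relative risk are:

```python
def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

p_white_men = 0.906   # referral rate for white men
p_other     = 0.847   # referral rate for blacks and women

odds_ratio    = odds(p_other) / odds(p_white_men)
relative_risk = p_other / p_white_men

print(f"odds ratio    = {odds_ratio:.2f}")     # about 0.57, reported as '60% as likely'
print(f"relative risk = {relative_risk:.3f}")  # about 0.935, i.e. 93.5% 'as often'
```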

Ricardian vice

17 October, 2018 at 18:50 | Posted in Economics | 1 Comment

Ricardo’s … interest was in the clear-cut result of direct, practical significance. In order to get this he cut that general system to pieces, bundled up as large parts of it as possible, and put them in cold storage — so that as many things as possible should be frozen and ‘given.’ He then piled one simplifying assumption upon another until, having really settled everything by these assumptions, he was left with only a few aggregative variables between which he set up simple one-way relations so that, in the end, the desired results emerged almost as tautologies … It is an excellent theory that can never be refuted and lacks nothing save sense. The habit of applying results of this character to the solution of practical problems we shall call the Ricardian Vice.

Sounds familiar, doesn’t it?

Only difference is that today it is seen as a virtue rather than a vice …

Ketchup economics

16 October, 2018 at 20:09 | Posted in Economics | Leave a comment


The increasing ascendancy of real business cycle theories of various stripes, with their common view that the economy is best modeled as a floating Walrasian equilibrium, buffeted by productivity shocks, is indicative of the depths of the divisions separating academic macroeconomists …

If these theories are correct, they imply that the macroeconomics developed in the wake of the Keynesian Revolution is well confined to the ashbin of history. And they suggest that most of the work of contemporary macroeconomists is worth little more than that of those pursuing astrological science …

The appearance of Ed Prescott’s stimulating paper, “Theory Ahead of Business Cycle Measurement,” affords an opportunity to assess the current state of real business cycle theory and to consider its prospects as a foundation for macroeconomic analysis …

My view is that business cycle models of the type urged on us by Prescott have nothing to do with the business cycle phenomena observed in the United States or other capitalist economies …

Prescott’s growth model is not an inconceivable representation of reality. But to claim that its parameters are securely tied down by growth and micro observations seems to me a gross overstatement. The image of a big loose tent flapping in the wind comes to mind …

In Prescott’s model, the central driving force behind cyclical fluctuations is technological shocks. The propagation mechanism is intertemporal substitution in employment. As I have argued so far, there is no independent evidence from any source for either of these phenomena …

Imagine an analyst confronting the market for ketchup. Suppose she or he decided to ignore data on the price of ketchup. This would considerably increase the analyst’s freedom in accounting for fluctuations in the quantity of ketchup purchased … It is difficult to believe that any explanation of fluctuations in ketchup sales that did not confront price data would be taken seriously, at least by hard-headed economists.

Yet Prescott offers an exercise in price-free economics … Others have confronted models like Prescott’s to data on prices with what I think can fairly be labeled dismal results. There is simply no evidence to support any of the price effects predicted by the model …

Improvement in the track record of macroeconomics will require the development of theories that can explain why exchange sometimes works and other times breaks down. Nothing could be more counterproductive in this regard than a lengthy professional detour into the analysis of stochastic Robinson Crusoes.

Lawrence Summers, ‘Skeptical Observations on Real Business Cycle Theory’

In the embrace of great sorrow (personal)

15 October, 2018 at 17:48 | Posted in Varia | Leave a comment


Dedicated with warmth and love to Ingrid, Anton and Iskra.

Friend! In the hour of devastation, when your inner self is shrouded in darkness,
When in an abyss memory and presentiment perish,
Thought gropes timidly among shadow-shapes and will-o'-the-wisps,
The heart cannot sigh, the eye cannot weep;
When from your night-clouded soul the wings of fire fall,
And you feel yourself, with dread, sinking into nothingness anew,
Say, who will save you then? Who is the kindly angel
Who gives order and beauty back to your inner self,
Only the holy Word, that cried to the worlds: "Come into being!"
And in whose living power the worlds are moving still.
Therefore rejoice, O friend, and sing in the darkness of affliction:
Night is the mother of day, Chaos is neighbour to God.

Halcyon days (personal)

15 October, 2018 at 09:56 | Posted in Varia | 1 Comment

 

Spending some lovely Indian Summer days at our summer residence in the Karlskrona archipelago. Pure energy for the soul.

Oft Gefragt

14 October, 2018 at 23:13 | Posted in Varia | Leave a comment

 

Home is only ever you
Home is only ever you
You picked me up and took me there
Woke up in the middle of the night because of me
Lately I've been thinking about it so often
I have no homeland, I only have you
You are home, for always, and me

Too much of ‘we controlled for’

14 October, 2018 at 12:12 | Posted in Statistics & Econometrics | Leave a comment

The gender pay gap is a fact that, sad to say, to a non-negligible extent is the result of discrimination. And even though many women are not deliberately discriminated against, but rather self-select into lower-wage jobs, this in no way magically explains away the discrimination gap. As decades of socialization research has shown, women may be ‘structural’ victims of impersonal social mechanisms that in different ways aggrieve them. Wage discrimination is unacceptable. Wage discrimination is a shame.

You see it all the time in studies. “We controlled for…” And then the list starts. The longer the better. Income. Age. Race. Religion. Height. Hair color. Sexual preference. Crossfit attendance. Love of parents. Coke or Pepsi. The more things you can control for, the stronger your study is — or, at least, the stronger your study seems. Controls give the feeling of specificity, of precision. But sometimes, you can control for too much. Sometimes you end up controlling for the thing you’re trying to measure …

An example is research around the gender wage gap, which tries to control for so many things that it ends up controlling for the thing it’s trying to measure. As my colleague Matt Yglesias wrote:

“The commonly cited statistic that American women suffer from a 23 percent wage gap through which they make just 77 cents for every dollar a man earns is much too simplistic. On the other hand, the frequently heard conservative counterargument that we should subject this raw wage gap to a massive list of statistical controls until it nearly vanishes is an enormous oversimplification in the opposite direction. After all, for many purposes gender is itself a standard demographic control to add to studies — and when you control for gender the wage gap disappears entirely!” …

Take hours worked, which is a standard control in some of the more sophisticated wage gap studies. Women tend to work fewer hours than men. If you control for hours worked, then some of the gender wage gap vanishes. As Yglesias wrote, it’s “silly to act like this is just some crazy coincidence. Women work shorter hours because as a society we hold women to a higher standard of housekeeping, and because they tend to be assigned the bulk of childcare responsibilities.”

Controlling for hours worked, in other words, is at least partly controlling for how gender works in our society. It’s controlling for the thing that you’re trying to isolate.

Ezra Klein

Trying to reduce the risk of having established only ‘spurious relations’ when dealing with observational data, statisticians and econometricians standardly add control variables. The hope is that one thereby will be able to make more reliable causal inferences. But — as Keynes already showed back in the 1930s when criticizing statistical-econometric applications of regression analysis — if you do not manage to get hold of all potential confounding factors, the model risks producing estimates of the variable of interest that are even worse than those from models without any control variables at all. Conclusion: think twice before you simply include ‘control variables’ in your models!
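A small simulation (Python with numpy; made-up coefficients) shows the mechanism Klein describes: when part of the gender effect on wages runs through hours worked, 'controlling for' hours absorbs part of the very gap one set out to measure.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

female = rng.binomial(1, 0.5, size=n).astype(float)
# Being assigned the bulk of care work reduces paid hours ...
hours = 40 - 5 * female + rng.normal(scale=3, size=n)
# ... and wages depend on hours plus a direct (discriminatory) penalty.
wage = 100 + 2 * hours - 10 * female + rng.normal(scale=10, size=n)

def ols(y, *regressors):
    """Ordinary least squares with an intercept; returns the coefficients."""
    X = np.column_stack([np.ones(len(y)), *regressors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

print(f"raw gender gap:              {ols(wage, female)[1]:.1f}")         # about -20
print(f"gap 'controlling for' hours: {ols(wage, female, hours)[1]:.1f}")  # about -10
```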

Bayesian networks and causal diagrams

14 October, 2018 at 09:09 | Posted in Theory of Science & Methodology | 3 Comments

Whereas a Bayesian network can only tell us how likely one event is, given that we observed another, causal diagrams can answer interventional and counterfactual questions. For example, the causal fork A <– B –> C tells us in no uncertain terms that wiggling A would have no effect on C, no matter how intense the wiggle. On the other hand, a Bayesian network is not equipped to handle a ‘wiggle,’ or to tell the difference between seeing and doing, or indeed to distinguish a fork from a chain [A –> B –> C]. In other words, both a chain and a fork would predict that observed changes in A are associated with changes in C, making no prediction about the effect of ‘wiggling’ A.
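A rough simulation (Python with numpy; illustrative numbers only) makes the point concrete: under passive observation the fork and the chain both show an association between A and C, but under an intervention ('wiggling' A) they behave completely differently.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

def fork(a_forced=None):
    # A <- B -> C : B causes both A and C.
    b = rng.normal(size=n)
    a = b + rng.normal(size=n) if a_forced is None else np.full(n, a_forced)
    c = b + rng.normal(size=n)
    return a, c

def chain(a_forced=None):
    # A -> B -> C : A causes B, which causes C.
    a = rng.normal(size=n) if a_forced is None else np.full(n, a_forced)
    b = a + rng.normal(size=n)
    c = b + rng.normal(size=n)
    return a, c

for name, graph in [("fork ", fork), ("chain", chain)]:
    a, c = graph()                                              # seeing
    c0, c1 = graph(a_forced=0.0)[1], graph(a_forced=1.0)[1]     # doing
    print(name,
          "observed corr(A, C):", round(float(np.corrcoef(a, c)[0, 1]), 2),
          "| effect of wiggling A on mean C:", round(float(c1.mean() - c0.mean()), 2))
```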

Healing my wounded soul (personal)

13 October, 2018 at 23:11 | Posted in Varia | Leave a comment

 

Mobile detox

13 October, 2018 at 17:10 | Posted in Varia | Leave a comment

Eton College is the latest in a series of schools to crack down on mobile phone use among their pupils. Last year, the £39,000-a-year Brighton College started forcing students to hand in their mobile phones at the beginning of each day in an effort to wean them off their “addiction” to technology.

Students in years seven, eight and nine are now required to hand in their mobile phones at the beginning of the day to teachers, who will lock them away, ready for collection when they are about to go home.

Students in year ten are allowed their phones, but must subscribe to three “detox” days a week where they hand them in, with year elevens having one “detox” day.

At Wimbledon High School, a fee-paying day school in south-west London, all children and parents are given a copy of the school’s digital rules, one of which is “put your phone away at meals and leave your phone downstairs at bedtime – try and be screen free at least an hour before bed”.

The Telegraph

In my day it used to be sex, drugs, and rock ‘n’ roll. And now we have to protect our kids from the dangers of mobile phone addiction …

Does using models really make economics a science?

12 October, 2018 at 17:25 | Posted in Economics | Leave a comment

The model has more and more become the message in modern mainstream economics. Formal models are said to help achieve ‘clarity’ and ‘consistency.’ Dani Rodrik — just to take one prominent example — even says, in his Economics Rules, that “models make economics a science.”

Economics is, more than any other social science, model-oriented. There are many reasons for this — the history of the discipline, having ideals coming from the natural sciences (especially physics), the search for universality (explaining as much as possible with as little as possible), rigour, precision, etc.

Mainstream economists want to explain social phenomena, structures and patterns, based on the assumption that the agents are acting in an optimizing (rational) way to satisfy given, stable and well-defined goals.

The procedure is analytical. The whole is broken down into its constituent parts so as to be able to explain (reduce) the aggregate (macro) as the result of interaction of its parts (micro).

Modern mainstream economists ground their models on a set of core assumptions — basically describing the agents as ‘rational’ actors — and a set of auxiliary assumptions. Together they make up the base model of all mainstream economic models. Based on these two sets of assumptions, they try to explain and predict both individual (micro) and — most importantly — social phenomena (macro).

When describing the actors as rational in these models, the concept of rationality used is instrumental rationality – consistently choosing the preferred alternative, the one judged to have the best consequences for the actor, given the wishes/interests/goals that are exogenously given in the model. How these preferences/wishes/interests/goals are formed is typically not considered to be within the realm of rationality, and a fortiori not constituting part of economics proper.

The picture given by the set of core assumptions (rational choice) is a rational agent with strong cognitive capacity that knows what alternatives she is facing, evaluates them carefully, calculates the consequences and chooses the one — given her preferences — that she believes has the best consequences.

Weighing the different alternatives against each other, the actor makes a consistent optimizing choice and acts accordingly (given the set of auxiliary assumptions that specify the kind of social interaction between ‘rational actors’ that can take place in the model).
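Stripped of the formalism, this core assumption can be written down in a few lines. The toy sketch below (names and numbers purely illustrative) is essentially all the 'psychology' the base model contains: exogenously given preferences and a consistent argmax over the alternatives.

```python
# A 'rational' agent in the instrumental sense: given exogenous preferences
# (here simply a utility number per alternative), it consistently picks the
# alternative judged to have the best consequences for itself.
def choose(alternatives, utility):
    return max(alternatives, key=utility)

# Preferences are taken as given; how they were formed is outside the model.
utility = {"work": 3.0, "leisure": 5.0, "study": 4.0}.get

print(choose(["work", "leisure", "study"], utility))   # -> 'leisure'
```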

So — mainstream economic models basically consist of a general specification of what (axiomatically) constitutes optimizing rational agents and a more specific description of the kind of situations in which these rational actors act. The list of assumptions can never be complete, since there will always be unspecified background assumptions and some (often) silent omissions (like closure, transaction costs, etc). The hope, however, is that the ‘thin’ list of assumptions shall be sufficient to explain and predict ‘thick’ phenomena in the real, complex, world.

Economics — in contradistinction to logic and mathematics — ought to be an empirical science, and empirical testing of ‘axioms’ ought to be self-evidently relevant for such a discipline. For although the mainstream economist himself (implicitly) claims that his axioms are universally accepted as true and in no need of proof, that is in no way a justified reason for the rest of us to simpliciter accept the claim.

When applying deductivist thinking to economics, mainstream economists usually set up ‘as if’ models based on the logic of idealization and a set of tight axiomatic assumptions from which consistent and precise inferences are made. The beauty of this procedure is that if the axiomatic premises are true, the conclusions necessarily follow. But — although the procedure is a marvellous tool in mathematics and axiomatic-deductivist systems, it is a poor guide for the real world.

The way axioms and theorems are formulated in mainstream economics standardly leaves their specification without almost any restrictions whatsoever, safely making every imaginable piece of evidence compatible with the all-embracing ‘theory’ — and a theory without informational content never risks being empirically tested and found falsified. Used in mainstream economics’ ‘thought experimental’ activities, it may, of course, be very ‘handy’, but it is totally void of any empirical value.

Mainstream economic models are nothing but broken-piece models. That kind of model can’t make economics a science.

Paul Romer — a flamboyant and hot-headed economist

12 October, 2018 at 09:12 | Posted in Economics | Leave a comment

The American Paul Romer, who on Monday was awarded the prestigious Nobel Prize in economics alongside one of his compatriots, William Nordhaus, is a flamboyant economist with an eventful career, known for his work measuring the share of innovation in growth.

At 62, he is currently a professor at New York University … In October 2016 he had left academia to take up the post of chief economist at the World Bank. But his barely veiled criticisms of the Washington institution forced him to resign last January, and he returned to his academic work in New York.

At issue were his positions on a flagship World Bank report, “Doing Business”, published every year, which scrutinizes the regulatory framework applying to SMEs in 190 economies in order to assess which countries are the most favourable for starting a business.

Paul Romer suggested that this ranking is influenced by political considerations, citing a change of methodology that penalized, for example, Chile, which since 2013 has been tumbling down the ranking purely through a mechanical effect.

Before that, the fiery economist had already stirred up controversy with a much-discussed article, “The Trouble with Macroeconomics”, in which he criticized his fellow macroeconomists, reproaching them for “running” mathematical models with no connection to reality.

According to him, treating knowledge and information as a resource creates economic growth. Unlike other resources, knowledge is not merely abundant, it is infinite …

Although his name had been mentioned several times among potential Nobel laureates, Paul Romer explained that he had not picked up his phone at the first calls received in the early morning, believing them to be sales calls. It was the Royal Academy of Sciences.

“I didn’t want it, but I accept it,” he then said with a smile.

La Croix

Som sommaren

11 October, 2018 at 23:25 | Posted in Varia | Leave a comment

 

Paul Romer’s endogenous growth theory — a very short introduction

10 October, 2018 at 21:38 | Posted in Economics | 1 Comment

 

The British school system — damaging working-class children

10 October, 2018 at 17:49 | Posted in Education & School | Leave a comment

While on the surface middle and working class children appear to be receiving the same comprehensive education, in some cases while attending the same schools, the entrenchment of policies of choice and excessive testing, assessment, sorting and sifting mean that they are increasingly educated apart as they move through the school system. The divide in English education is not just between state and private, but also within the state sector itself.

In the past the barriers to realising working class educational potential came through failing the 11-plus and being consigned to schools seen to be second-rate, or being relegated to the bottom stream of a grammar school. Now OECD research shows that middle class students tend to be taught in smaller classes and have access to better quality teaching resources than their working class peers. Although nominally receiving the same education as middle class students who attend state schools, working class children are subject to a narrowing of the curriculum and a degree of teaching to the test that is not experienced by their middle class peers. They are also more likely to be taught by inexperienced teachers and to experience a higher turnover of staff. The historical legacy of different education for different classes still overshadows the state sector. But the inequalities it perpetuates have been reinforced by the neoliberal drive towards markets, competition, regulation, and individualism.

Diane Reay

What are we to do now that those who were supposed to show the way have lost their compass?

9 October, 2018 at 19:00 | Posted in Varia | 1 Comment


Yes, that is probably a justified question to ask in this era of epistemic relativism, where many old left-wing radicals have become postmodern social constructivists and believe that the world is to be changed with self-opinionated verbal diarrhoea …

With a posthumanist approach, I illuminate and reflect on how both human and horse transcend their beings, and how this opens up an in-between space with dimensions of subjectivity, corporeality and mutuality.


Paul Romer’s critique of ‘post-real’ economics

9 October, 2018 at 09:13 | Posted in Economics | Leave a comment

In practice, what math does is let macro-economists locate the FWUTVs [facts with unknown truth values] farther away from the discussion of identification … Relying on a micro-foundation lets an author say, “Assume A, assume B, …  blah blah blah … And so we have proven that P is true. Then the model is identified.” …

Distributional assumptions about error terms are a good place to bury things because hardly anyone pays attention to them. Moreover, if a critic does see that this is the identifying assumption, how can she win an argument about the true expected value of the level of aether? If the author can make up an imaginary variable, “because I say so” seems like a pretty convincing answer to any question about its properties.

Paul Romer

Yes, indeed, modern mainstream economics — and especially its mathematical-statistical operationalization in the form of econometrics — fails miserably over and over again. One reason why it does is that the error term in the regression models used is thought of as representing the effect of the variables that were omitted from the models. The error term is somehow thought to be a ‘cover-all’ term representing omitted content in the model and necessary to include to ‘save’ the assumed deterministic relation between the other random variables included in the model. Error terms are usually assumed to be orthogonal (uncorrelated) to the explanatory variables. But since they are unobservable, they are also impossible to test empirically. And without justification of the orthogonality assumption, there is, as a rule, nothing to ensure identifiability.

In mainstream econometrics, the error term is usually portrayed as representing the combined effect of the variables that are omitted from the model. What one does not say — in a way bordering on intellectual dishonesty — is that this assumption only works when (1) the combined effect is independent of each and every variable included in the model, and (2) the expected value of the combined effect equals zero. And that is something almost never fulfilled in real-world settings!
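A short simulation (Python with numpy; made-up numbers) shows what happens when the orthogonality assumption fails: the omitted variable sits in the error term, is correlated with the included regressor, and the estimated coefficient is biased no matter how large the sample.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# True model: y = 1.0*x + 1.0*z + noise, but z is omitted from the regression.
z = rng.normal(size=n)
x = z + rng.normal(size=n)                  # x is correlated with the omitted z ...
y = 1.0 * x + 1.0 * z + rng.normal(size=n)

# ... so z ends up in the error term and orthogonality fails.
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0][1]
print(f"estimated effect of x: {beta:.2f} (true value: 1.00)")   # about 1.50
```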

‘Modern’ mainstream economics is based on the belief that deductive-axiomatic modelling is a sufficient guide to truth. That belief is, however, totally unfounded as long as no proofs are supplied for us to believe in the assumptions on which the model-based deductions and conclusions build. ‘Mathiness’ masquerading as science is often used by mainstream economists to hide the problematic character of the assumptions used in their theories and models. But — without showing the model assumptions to be realistic and relevant, that kind of economics indeed, as Romer puts it, produces nothing but “blah blah blah.”

Without strong evidence, all kinds of absurd claims and nonsense may pretend to be science. Using math can never be a substitute for thinking. Or as Romer has it in his showdown with ‘post-real’ economics:

Math cannot establish the truth value of a fact. Never has. Never will.

At last — Paul Romer got his ‘Nobel prize’

8 October, 2018 at 14:29 | Posted in Economics | 9 Comments

Among Swedish economists, Paul Romer has for many years been the favourite candidate for receiving the ‘Nobel Prize’ in economics. This year the prediction turned out right. Romer got the prize (together with William Nordhaus).

The ‘Nobel prize’ in economics has almost exclusively gone to mainstream economists, and most often to Chicago economists. So how refreshing it is that we for once have a winner who has been brave enough to openly criticize the ‘post-real’ things that emanate from the Chicago ivory tower!

Adam Smith once wrote that a really good explanation is “practically seamless.”

Is there any such theory within one of the most important fields of social sciences — economic growth?

Paul Romer’s theory presented in Endogenous Technological Change (1990) – where knowledge is made the most important driving force of growth – is probably as close as we get.

Knowledge – or ideas – are according to Romer the locomotive of growth. But as Allyn Young, Piero Sraffa and others had shown already in the 1920s, knowledge is also something that has to do with increasing returns to scale and therefore not really compatible with neoclassical economics with its emphasis on decreasing returns to scale.

Increasing returns generated by nonrivalry between ideas is simply not compatible with pure competition and the simplistic invisible hand dogma. That is probably also the reason why neoclassical economists have been so reluctant to embrace the theory whole-heartedly.
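The scale argument can be checked with back-of-the-envelope arithmetic. The sketch below uses a generic Cobb-Douglas production function with an ideas term (not Romer's actual 1990 specification): replicating only the rival inputs doubles output, while scaling the nonrival stock of ideas as well gives increasing returns overall.

```python
def output(A, K, L, alpha=0.3):
    """Cobb-Douglas production with a nonrival ideas term A."""
    return A * K**alpha * L**(1 - alpha)

Y = output(A=1.0, K=100.0, L=100.0)

# Doubling only the rival inputs (replication) doubles output: constant returns.
print(output(1.0, 200.0, 200.0) / Y)   # 2.0

# Doubling the stock of ideas as well more than doubles output: increasing returns.
print(output(2.0, 200.0, 200.0) / Y)   # 4.0
```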

Neoclassical economics has tried to save itself by more or less substituting human capital for knowledge/ideas. But Romer’s pathbreaking ideas should not be confused with human capital. Although some have problems with the distinction between ideas and human capital in modern endogenous growth theory, this passage from Romer’s article The New Kaldor Facts: Ideas, Institutions, Population, and Human Capital gives a succinct and accessible account of the difference:

Of the three state variables that we endogenize, ideas have been the hardest to bring into the applied general equilibrium structure. The difficulty arises because of the defining characteristic of an idea, that it is a pure nonrival good. A given idea is not scarce in the same way that land or capital or other objects are scarce; instead, an idea can be used by any number of people simultaneously without congestion or depletion.

Because they are nonrival goods, ideas force two distinct changes in our thinking about growth, changes that are sometimes conflated but are logically distinct. Ideas introduce scale effects. They also change the feasible and optimal economic institutions. The institutional implications have attracted more attention but the scale effects are more important for understanding the big sweep of human history.

The distinction between rival and nonrival goods is easy to blur at the aggregate level but inescapable in any microeconomic setting. Picture, for example, a house that is under construction. The land on which it sits, capital in the form of a measuring tape, and the human capital of the carpenter are all rival goods. They can be used to build this house but not simultaneously any other. Contrast this with the Pythagorean Theorem, which the carpenter uses implicitly by constructing a triangle with sides in the proportions of 3, 4 and 5. This idea is nonrival. Every carpenter in the world can use it at the same time to create a right angle.

Of course, human capital and ideas are tightly linked in production and use. Just as capital produces output and forgone output can be used to produce capital, human capital produces ideas and ideas are used in the educational process to produce human capital. Yet ideas and human capital are fundamentally distinct. At the micro level, human capital in our triangle example literally consists of new connections between neurons in a carpenter’s head, a rival good. The 3-4-5 triangle is the nonrival idea. At the macro level, one cannot state the assertion that skill-biased technical change is increasing the demand for education without distinguishing between ideas and human capital.

Paul’s idea about ideas is well worth a ‘Nobel Prize.’ Congratulations Paul!

Relative and absolute risks

8 October, 2018 at 09:03 | Posted in Statistics & Econometrics | Leave a comment

The absolute risk of a medical intervention differs from the relative risk. An example makes this clear: suppose that in a trial a drug reduces the number of cases of illness from 10 to 5 per 1000 people. In relative terms that is a 50 per cent reduction in the risk of illness (5 out of 10). In absolute terms, however, it is only 0.5 per cent (5 out of 1000). The first figure is presumably the one the company wanting to market the drug will prefer to use. The second figure is, however, far more informative, since it takes the totality of all cases into account.

Unfortunately, relative risks are reported far too often in media coverage. No wonder, for they generally sound much more spectacular. If we really want to know about dangers or successes, we should look for the absolute figures. Above all, however, we should always be aware of the fact that we cannot readily assess probabilities and risks correctly.

Florian Freistetter
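Put into a few lines of Python (with the example numbers from the quotation), the distinction is easy to keep straight:

```python
treated_cases, control_cases, n = 5, 10, 1000   # cases of illness per 1000 people

relative_risk_reduction = (control_cases - treated_cases) / control_cases
absolute_risk_reduction = (control_cases - treated_cases) / n

print(f"relative risk reduction: {relative_risk_reduction:.0%}")   # 50%
print(f"absolute risk reduction: {absolute_risk_reduction:.1%}")   # 0.5%
```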

Is economic consensus a good thing?

7 October, 2018 at 13:51 | Posted in Theory of Science & Methodology | 1 Comment

No, it is not — and here’s one strong reason why:

The mere existence of consensus is not a useful guide. We should ask: Does a consensus have its origins and its ground in a rational and comprehensive appraisal of substantial evidence? Has the available evidence been open to vigorous challenge, and has it met each challenge? … A consensus that lacks these origins is of little consequence precisely because it lacks these origins. Knowing the current consensus is helpful in forecasting a vote; having substantial evidence is helpful in judging what is true. That something is standardly believed or assumed is not, by itself, a reason to believe or assume it. Error and confusion are standard conditions of the human mind.

Today's humanities — identity-politics mumbo jumbo

7 October, 2018 at 12:29 | Posted in Politics & Society | 1 Comment

Now it has happened again. Three American academics have spent a whole year producing fake articles, which they then got published in leading Cultural Studies journals …

What the fake articles have in common is that they push questions of oppression, power relations and identity politics, but twisted in a way that should not have passed normal scholarly review. Perhaps the most astonishing example is that these undercover researchers managed to get a lightly rewritten version of a chapter from Adolf Hitler's Mein Kampf published as a scholarly article on gender oppression …

How did we end up here? One explanation is that so-called identity politics has come to play an ever greater role in humanities research. Finding hidden power structures and rewriting history from the perspective of oppressed groups has, cheered on by leading politicians, become the height of fashion in academia …

The humanities have over time changed fundamentally. The task of the humanistic disciplines was originally to describe the eternal ideas, the beautiful, the good, the right, the true. These ideas soon became national ideas. But with modernity they came to be broken down by scientific analysis. An analysis that in the end saw itself dissolved by the critique of reason. What remains today is one's own identity, one's networks of friends and a not infrequently pseudo-radical posture as the only foundation. Without belief in truth. But obsessed with the power one has slowly lost.

Håkan Boström/GP

This is an article that academic pseudo-radicals — some of whom are old friends and acquaintances of yours truly — of course do not like.

But Boström is right!

Much of today's humanities research is meaningless nonsense. The reasons for this are many, but one important factor is that within academia nowadays it is so important to produce a lot rather than good and significant research. A really good book or article counts for little against ten more or less irrelevant social-constructivist and identity-political nonsense articles published in some 'scholarly' journal when you apply for positions or try to build your credentials. The result is that at best one article in a hundred is read by more than the nearest mourners and has something genuinely new, interesting and significant to say. The rest is nonsense that belongs in the waste-paper basket.

The American Time Zone

7 October, 2018 at 12:03 | Posted in Varia | 1 Comment

 
