Svenska Dagbladet is lying about the independent schools

31 March, 2013 at 20:25 | Posted in Education & School | 8 Comments

On today’s editorial page of Svenska Dagbladet, its political editor-in-chief writes that “there is much” pointing in the direction that more independent schools lead to better results. Invoking a study by Anders Böhlmark and Mikael Lindahl, he also claims that “[e]ven pupils in municipal schools do better thanks to the competition.”

That sounds just great. The only problem is that it’s a lie!

Let me explain why I consider Svenska Dagbladet to be lying about the independent schools, and at the same time sort out what research and data actually say about the effects of the independent schools on school and pupil performance.

Trickle-down – the USA and Sweden show how it looks in reality

30 March, 2013 at 17:44 | Posted in Economics, Politics & Society | 4 Comments

In a post up today on his blog, Paul Krugman notices that there “doesn’t seem to be much trickle-down going on” in the USA.

Unfortunately we can see the same pattern developing in Sweden. Look at the figure below, which shows how the distribution of mean income and wealth (expressed in 2009 prices) for the top 0.1% and the bottom 90% has changed in Sweden over the last 30 years:


Source: The World Top Incomes Database

A society where we allow inequality of incomes and wealth to increase without bounds sooner or later implodes. The cement that keeps us together erodes, and in the end we are left only with people dipped in the ice-cold water of egoism and greed. It’s high time to put an end to this, the worst Juggernaut of our times!

The independent schools and segregation

26 March, 2013 at 14:12 | Posted in Education & School | 13 Comments

The other day, former editor-in-chief and independent-school zealot Hans Bergström published an exceptionally poorly substantiated article on Newsmill in which he claimed that Swedish independent schools reduce segregation.

Sten Svensson – former editor-in-chief of Lärarnas tidning – responds to this drivel with a well-argued piece in today’s Newsmill:

During his time as editor-in-chief of Dagens Nyheter, Hans Bergström conducted an unparalleled smear campaign against the Swedish school. It was lax, woolly, hostile to knowledge and lousy in every conceivable way …

Now we have exactly the school that Bergström strove for, and still things are not good. Pupil results are falling, and a growing number of research reports show that the Swedish school is heading in entirely the wrong direction …

Bergström tries to prove that the Swedish school is not segregated and that segregation has not increased because of free school choice and the independent schools. He is arguing against his own better judgement.

In Skolverket’s report on equity in the school system, published last year, one can read:

“The spread between schools’ average results has increased markedly. The between-school variation – the measure used to describe how much results differ between schools – has more than doubled since the end of the 1990s, from a level that was low by international standards. In 2011 the between-school variation was over 18 per cent as measured by grades. The international PISA surveys show the same development. The spread in results between pupils has also increased, but not to the same extent as the spread between schools.” (Likvärdig utbildning i svensk grundskola? Rapport 374. 2012.)

According to Skolverket, the increased differences have several causes. Part is explained by pupils being segregated along socioeconomic lines, and another part by pupils being segregated along lines that do not show up in the statistics.

“Schools, on the other hand, seem to be growing more and more segregated by characteristics that do not show up in the ordinary statistics – for example, more study-motivated pupils (regardless of socioeconomic background) tend to a greater extent to make use of free school choice and seek out schools with many other study-motivated pupils. In this way pupils become sorted more by results and hidden characteristics than by conventional measures of socioeconomic background.”

Skolverket’s assessment is that equity in the Swedish compulsory school has deteriorated, and that “the school-choice and decentralization reforms of the early 1990s have in all likelihood contributed to this development, even if other factors may also have played some part.”

After his time as editor-in-chief of Dagens Nyheter, Hans Bergström has been active in the independent-school industry as part-owner of a school company. He thus belongs to the group that makes money from a segregated school. The school companies’ profits derive to a large extent from low teacher density. If the limited-company schools, despite low teacher density, achieve decent pupil results, it is because they have a positively selected student body. Their pupils more often have well-educated parents who can support their children in various ways. And, as Skolverket shows above, they also attract motivated and precocious pupils who do not have well-educated parents. If the company schools had the same student base as an ordinary suburban school, it would be impossible to achieve good pupil results with low teacher density. The school companies’ profits thus require a positively segregated school, and the profits in turn drive school segregation.

Hans Bergström harshly criticized the old comprehensive and equitable school, and his goal was to abolish it. That has now been accomplished: the equitable school is gone. This has been good for Hans Bergström and the independent-school industry, who have been able to make money from the system, but for the school and the pupils the development has been catastrophic.

As for the question of the independent schools’ effect on the quality of the Swedish school, yours truly argues in an article in Skola och samhälle (longer version here) that most of the research indicates that no positive effect of the independent schools can be established.

So segregation increases and quality does not – what, then, are the independent schools for? That the owners and directors of the school companies line their pockets with our tax money is not a sufficient argument.

Of what use are RCTs?

25 March, 2013 at 13:00 | Posted in Theory of Science & Methodology | 1 Comment

In this video, philosopher of science Nancy Cartwright explains why randomized controlled trials (RCTs) are not at all the “gold standard” they have lately often been portrayed as. As yours truly has repeatedly argued on this blog (e.g. here and here), RCTs usually do not provide evidence that their results are exportable to other target systems. The almost religious belief with which their propagators portray them cannot hide the fact that RCTs cannot be taken for granted to give generalizable results. That something works somewhere is no warranty that it will work for us, or even that it works generally.

Why the notion of austerity is a massive failure

25 March, 2013 at 10:07 | Posted in Economics | 1 Comment

 

(h/t Jan Milch)

On the importance of distinguishing between risk and uncertainty

24 March, 2013 at 18:07 | Posted in Economics | 2 Comments

On one point it is particularly urgent to upgrade our financial knowledge, and that is in the understanding of uncertainty …

The textbook says, in effect, that we can sweep uncertainty under the rug. When it claims that we can “specify a probability distribution,” it is trying to convert uncertainty into risk …

It is time to realize how dangerous this view is. Financial markets are uncertain. The uncertainty is not quantifiable. Trying to paper over the uncertainty and talk about risk as a statistical measure is irresponsible, because the second you put a number on risk, a lot of people start believing in it.

Perhaps no one expresses better than the grand old man of post-Keynesianism – Paul Davidson – the importance of distinguishing, as John Maynard Keynes did (not to mention Frank Knight and Gunnar Myrdal), between probabilistic risk and genuine uncertainty:

Unfortunately as we have all learned in the world of experience, little is known with certainty about future payoffs of investment decisions made today. If the return on economic decisions made today is never known with certainty, then how can financial managers make optimal decisions on where to put their firm’s money and householders on where to put their savings today?

If theorists invent a world remote from reality and then live in it consistently, then Keynes [1936, p.16] argued these economic thinkers were “like Euclidean geometers in a non-Euclidean world who discover that apparent parallel lines collide, rebuke these lines for not keeping straight. Yet, in truth there is no remedy except to throw over the axiom of parallels and to work out a non-Euclidean geometry. Something similar is required to-day in economics” …

As any statistician will tell you, in order to draw any statistical (probabilistic risk) inferences regarding the values of any population universe, one should draw and statistically analyze a sample from that universe. Drawing a sample from the future economic universe of financial markets, however, is impossible. Simply stated the ergodic axiom presumes that the future is already predetermined by an unchanging probability distribution and therefore a sample from the past is equivalent to drawing a sample from the future … Assuming ergodicity permits one to believe one can calculate an actuarial certainty about future events from past data.

Efficient market theorists must implicitly presume decision makers can reliably calculate the future. The economy, therefore, must be governed by an ergodic stochastic process, so that calculating a probability distribution from past statistical data samples is the same as calculating the risks from a sample drawn from the future. If financial markets are governed by the ergodic axiom, then we might ask why do mutual funds that advertise their wonderful past earnings record always note in the advertisement that past performance does not guarantee future results …

This ergodic axiom is an essential foundation for all the complex risk management computer models developed by the “quants” on Wall Street. It is also the foundation for econometricians who believe that their econometric models will correctly predict the future GDP, employment, inflation rate, etc. If, however, the economy is governed by a non-ergodic stochastic process, then econometric estimates generated from past market data are not reliable estimates that would be obtained if one could draw a sample from the future …

In sum, the ergodic axiom underlying the typical risk management and efficient market models represents, in a Keynes view, a model remote from an economic reality that is truly governed by non-ergodic conditions. Keynes, his Post Keynesian followers, and George Soros all reject the assumption that people can know the economic future since it is not predetermined. Instead they assert that people “know” they cannot know the future outcome of crucial economic decisions made today. The future is truly uncertain and not just probabilistic risky.

Paul Davidson

How to be more European

24 March, 2013 at 10:25 | Posted in Varia | 1 Comment

 
[Image: “We could be unemployed”]

Sunday morning and The Telegraph obits

24 March, 2013 at 09:34 | Posted in Varia | Comments Off on Sunday morning and The Telegraph obits

One of yours truly’s Sunday morning rituals is reading the obituary column of The Telegraph. Today this obit caught my eye:

Peter Scott, who has died aged 82, was a highly accomplished cat burglar, and as Britain’s most prolific plunderer of the great and good took particular pains to select his victims from the ranks of aristocrats, film stars and even royalty.

According to a list of 100 names he supplied to The Daily Telegraph, he targeted figures such as Soraya Khashoggi, Shirley MacLaine, the Shah of Iran, Judy Garland and even Queen Elizabeth the Queen Mother — although he added apologetically that, in her case, the authorities had covered up by issuing a “D-notice”.

In 1994 Scott wrote to the newspaper to say that he would consider it “a massive disappointment if I were not to get a mention in [its] illustrious obituary column”. He explained that he derived much pleasure from reading accounts of the exploits of war heroes, adding: “I would like to think I would have fronted the Hun with the same enthusiasm as I did the fleshpots in Mayfair.” He added that he had been a Telegraph reader since 1957, when newspapers were first allowed in prisons, “on account of its broad coverage on crime”.

In the course of thieving jewellery and artworks from Mayfair mansions, Bond Street shops and stately homes, Scott also served Fleet Street as handy headline fodder, being variously hailed the “King of the Cat Burglars”, “Burglar to the Stars” or the “Human Fly”. He identified a Robin Hood streak in himself, too, asserting in his memoirs that he had been “sent by God to take back some of the wealth that the outrageously rich had taken from the rest of us”.

“I felt like a missionary seeing his flock for the first time,” he explained when he recalled casing Dropmore House, the country house of the press baron Viscount Kemsley, on a rainy night in 1956 and squinting through the window at the well-heeled guests sitting down to dinner. “I decided these people were my life’s work.”

Always a meticulous planner, Scott bought a new suit before each job, so that he would not look out of place in the premises he was burgling. Fear, the possibility of capture, excited him.

During one break-in “a titled lady appeared at the top of the stairs. ‘Everything’s all right, madam,’ I shouted up, and she went off to bed thinking I was the butler.” On other occasions, if disturbed by the occupier, he would shout reassuringly: “It’s only me!”

In all, by his own reckoning, Scott stole jewels, furs and artworks worth more than £30 million. He held none of his victims in great esteem (“upper-class prats chattering in monosyllables”). The roll-call of “marks” from whom he claimed to have stolen valuables included Zsa Zsa Gabor, Lauren Bacall, Elizabeth Taylor, Vivien Leigh, Sophia Loren, Maria Callas and the gambling club and zoo owner John Aspinall. “Robbing that bastard Aspinall was one of my favourites,” he recollected. “Sophia Loren got what she deserved too.”

Scott stole a £200,000 necklace from the Italian star when she was in Britain filming The Millionairess in 1960. Billed in the newspapers as Britain’s biggest jewellery theft, it yielded Scott £30,000 from a “fence”. After Miss Loren had pointed at him on television saying: “I come from a long line of gipsies. You will have no luck,” Scott lost every penny in the Palm Beach Casino at Cannes.

In the 1950s and 1960s he pinpointed his targets by perusing the society columns in the Daily Mail and Daily Express. Nor did he ease up with the approach of middle-age; in the 1980s he was still scaling walls and drainpipes. In one Bond Street caper alone he stole jewellery worth £1.5 million, and in 1985 he was jailed for four years. On his release he expanded his social horizons by becoming a celebrity “tennis bum”, a racquet for hire at a smart London club where — as he put it in his autobiography — he coached still more potential “rich prats”.

By the mid-1990s, Scott had served 12 years in prison in the course of half a dozen separate stretches, and claimed to have laid down his “cane” [jemmy] and retired from a life of crime.

But in 1998 he was jailed for another three and a half years for handling, following the theft of Picasso’s Tête de Femme from the Lefevre Gallery in Mayfair the year before. To the impassive detectives who arrested him, Scott quoted a line from WE Henley: “Under the bludgeonings of chance, my head is bloody but unbowed.” He often drew on literary allusions, quoting Confucius, Oscar Wilde and Proust.

Scott was also a past-master in self-justification of his crimes and misdemeanours: “The people I burgled got rich by greed and skulduggery. They indulged in the mechanics of ostentation — they deserved me and I deserved them. If I rob Ivana Trump, it is just a meeting of two different kinds of degeneracy on a dark rooftop.”

In his memoirs, Gentleman Thief (1995), Scott admitted to an even stronger motivation than fear as he contemplated another “job”: “Even now, after 30 years, it was a sexual thrill.” There was the additional satisfaction in his assumption that the millions reading about his exploits in the papers were silently cheering him on.

John Maynard Keynes on graphs and statistics

23 March, 2013 at 17:49 | Posted in Statistics & Econometrics | Comments Off on John Maynard Keynes on graphs and statistics

In his 1938 review for The Economic Journal of Historical Development of the Graphical Representation of Statistical Data, by H. Gray Funkhouser, the great economist writes:

“Perhaps the most striking outcome of Mr. Funkhouser’s researches is the fact of the very slow progress which graphical methods made until quite recently … In the first fifty volumes of the Statistical Journal, 1837-87, only fourteen graphs are printed altogether. It is surprising to be told that Laplace never drew a graph of the normal law of error … Edgeworth made no use of statistical charts as distinct from mathematical diagrams.

Apart from Quetelet and Jevons, the most important influences were probably those of Galton and of Mulhall’s Dictionary, first published in 1884. Galton was indeed following his father and grandfather in this field, but his pioneer work was mainly restricted to meteorological maps, and he did not contribute to the development of the graphical representation of economic statistics.”

So far so good. But then comes the kicker:

“Mr. Funkhouser has made an extremely interesting and valuable contribution to the history of statistical method. I wish, however, that he could have added a warning, supported by horrid examples, of the evils of the graphical method unsupported by tables of figures. Both for accurate understanding, and particularly to facilitate the use of the same material by other people, it is essential that graphs should not be published by themselves, but only when supported by the tables which lead up to them. It would be an exceedingly good rule to forbid in any scientific periodical the publication of graphs unsupported by tables.”

I’m ok with that—if they also forbid the publication of all tables unsupported by graphs. Also if they allow graphs by themselves. Then I’m totally on board.

Andrew Gelman

Well, actually, I think Keynes was more right than wrong, considering how difficult it was back in the 1930s to get hold of other people’s data. Including tables made the data available to other researchers and thereby made it easier to evaluate the validity of the statistical analysis.

Cyprus as a systemic risk to the euro

23 March, 2013 at 12:12 | Posted in Economics | 1 Comment

But it is not about the direct economic blow that a Cypriot crash would cause. The entire Cypriot banking system is half the size of Sweden’s SEB.

The danger lies elsewhere. One conceivable scenario, evidently discussed in EU circles over the past few days, is that Cyprus actually leaves the currency union. The walled-up emergency exit of the euro bunker – the one a united army of politicians has declared must stay shut even if hell itself opens a skating school – would suddenly stand banging wide open. The populations of Spain, Portugal and other crisis countries might start getting ideas.

Were they to try to flee the euro, the consequences for Europe would be all the greater …

That is how it is with the entire euro crisis. Interests are pitted against one another. Sound principles collide with an uncompromising reality. What seems wise in Berlin sounds insane in Madrid, London or Nicosia. For every problem solved, two new ones arise. And that is why seemingly small decisions take on disproportionately great importance.

Andreas Cervenka

In other words, it is about time for Carl Bastiat Hamilton to once again propose that Sweden join the euro …

Inflation targeting – an unmitigated failure

22 March, 2013 at 16:18 | Posted in Economics | 4 Comments

The Riksbank in 1993 announced an official target for CPI inflation of 2 percent. Over the last 15 years, average CPI inflation has equaled 1.4 percent and has thus fallen short of the target by 0.6 percentage points. Has this undershooting of the inflation target had any costs in terms of higher average unemployment? This depends on whether the long-run Phillips curve in Sweden is vertical or not. During the last 15 years, inflation expectations in Sweden have become anchored to the inflation target in the sense that average inflation expectations have been close to the target. The inflation target has thus become credible. If inflation expectations are anchored to the target also when average inflation deviates from the target, the long-run Phillips curve is no longer vertical but downward-sloping. Then average inflation below the credible target means that average unemployment is higher than the rational expectations steady-state (RESS) unemployment rate. The data indicate that the average unemployment rate has been 0.8 percentage points higher than the RESS rate over the last 15 years. This is a large unemployment cost of undershooting the inflation target. Some simple robustness tests indicate that the estimate of the unemployment cost is rather robust, but the estimate is preliminary and further scrutiny is needed to assess its robustness.

During 1997-2011, average CPI inflation has fallen short of the inflation target of 2 percent by 0.6 percentage points. But average inflation expectations according to the TNS Sifo Prospera survey have been close to the target. Thus, average inflation expectations have been anchored to the target and the target has become credible. If average inflation expectations are anchored to the target when average inflation differs from the target, the long-run Phillips curve is not vertical. Then lower average inflation means higher average unemployment. The data indicate that average inflation below target has been associated with average unemployment being 0.8 percentage points higher over the last 15 years than would have been the case if average inflation had been equal to the target. This is a large unemployment cost of average inflation below a credible target. Some simple robustness tests indicate that the estimate of the unemployment cost is rather robust, but the estimate is preliminary and further scrutiny is needed to assess its robustness.

The difference between average inflation and average inflation expectations and the apparent existence of a downward-sloping long-run Phillips curve raises several urgent questions that I believe need to be addressed. Why have average inflation expectations exceeded average inflation for 15 years? Why has average inflation fallen below the target for 15 years? Could average inflation have fallen below average inflation expectations and the inflation target without the large unemployment cost estimated here? Could the large unemployment cost have been avoided with a different monetary policy? What are the policy implications for the future? Do these findings make price-level targeting or the targeting of average inflation over a longer period relatively more attractive, since they would better ensure that average inflation over longer periods equals the target?

Lars E.O. Svensson, The Possible Unemployment Cost of Average Inflation below a Credible Target

According to Lars E. O. Svensson – deputy governor of the Riksbank – the Riksbank has pursued a policy during the years 1997–2011 that in reality has made inflation on average 0.6 percentage points lower than the target the Riksbank itself has set. The Phillips curve he estimates shows that, as a result of this overly “austere” inflation level, unemployment has been almost one percentage point higher than it would have been had the 2 percent inflation target actually been met.
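
As a back-of-envelope reading of Svensson’s headline numbers – the implied slope below is my own inference, not a figure reported in the paper – the estimates amount to saying that each percentage point of inflation below the credible target has cost well over a percentage point of unemployment:

```python
# Svensson's headline numbers (1997-2011); the slope is my own back-of-envelope
# inference, not an estimate reported in the paper.
target, avg_inflation = 2.0, 1.4       # per cent
shortfall = target - avg_inflation     # 0.6 percentage points below target
unemployment_cost = 0.8                # percentage points, Svensson's estimate

# Implied slope of a downward-sloping long-run Phillips curve:
slope = unemployment_cost / shortfall
print(f"~{slope:.2f} pp extra unemployment per pp of inflation shortfall")  # ~1.33
```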

What Svensson is saying, without spelling it out, is that the Swedish Fed has, for no good reason at all, made people unemployed. As a consequence of a faulty monetary policy, unemployment is considerably higher than it would have been if the Swedish Fed had done its job adequately.

From a more methodological point of view it is of course also interesting to consider the use made of the rational expectations hypothesis (REH) in these model-based calculations (and in models of the same ilk that abound in “modern” macroeconomics). When data tell us that “average inflation expectations exceeded average inflation for 15 years” – wouldn’t it be high time to put REH where it belongs: in the dustbin of history?

To me Svensson’s paper basically confirms what I wrote a couple of months ago:

Models based on REH impute beliefs to the agents that are not based on any real informational considerations, but are simply stipulated to make the models mathematically-statistically tractable.

Of course you can make assumptions based on tractability, but then you also have to take into account the necessary trade-off in terms of the ability to make relevant and valid statements about the intended target system.

Mathematical tractability cannot be the ultimate arbiter in science when it comes to modeling real-world target systems. Of course, one could perhaps accept REH if it had produced lots of verified predictions and good explanations. But it has done nothing of the kind. Therefore the burden of proof is on those who still want to use models built on ridiculously unreal assumptions – models devoid of all empirical interest.

In reality, REH is a rather harmful modeling assumption, since it contributes to perpetuating the ongoing transformation of economics into a kind of science-fiction economics. If economics is to guide us, help us make forecasts, and explain or help us better understand real-world phenomena, REH is in fact next to worthless.

Data snooping

22 March, 2013 at 10:10 | Posted in Statistics & Econometrics | Comments Off on Data snooping

Naturally, you cannot in the least prove that a die is unbalanced by sitting and throwing it until you manage to get two 6s in a row. In principle, though, this is a common error. For statistical tests on a data set to be valid, the data must be the result of a random sampling procedure that is in no way influenced by what has already been observed in the data. If, for example, you are playing a board game with dice, such as Fia med knuff (the Swedish version of Ludo), and after eight throws you note that you apparently got a 3 in four of those eight throws, this cannot be used as evidence that the die is unbalanced. Calculation shows that the probability of getting at least four 3s in a sequence of eight random throws of a balanced die is indeed lower than 5%, but an event that has already taken place is no random event. What happened has already happened, with probability 100%. This kind of fallacious reasoning, deliberate or not, is called data snooping …

Nor can you turn the analysis around and claim that the experiment would have proved the die to be well balanced if the number of 6s in the experiment had come out lower than two … Suppose, for example, that the die is actually so unbalanced that on average it shows a 6 every second throw. The probability of getting two 6s in two throws is then 1/2 × 1/2 = 1/4, i.e. 25%. The probability of getting fewer than two 6s by pure chance is thus a full 75% – even though the die is in fact seriously unbalanced!
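
The two calculations in the quoted passage are easy to verify – here is a minimal sketch in Python (my own illustration of the textbook’s numbers):

```python
from math import comb

# P(at least four 3s in eight throws of a balanced die)
p = 1 / 6
p_at_least_four = sum(comb(8, k) * p**k * (1 - p)**(8 - k) for k in range(4, 9))
print(f"P(>= four 3s in 8 throws) = {p_at_least_four:.4f}")  # ~0.026, i.e. below 5%

# A die so unbalanced that it shows a 6 every second throw, on average:
p_two_sixes = 0.5 * 0.5                      # P(two 6s in two throws) = 25%
print(f"P(fewer than two 6s) = {1 - p_two_sixes:.2f}")  # 0.75, despite the bias
```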

Randomization is no panacea

21 March, 2013 at 16:49 | Posted in Statistics & Econometrics, Theory of Science & Methodology | Comments Off on Randomization is no panacea

When it comes to questions of causality, randomized controlled trials (RCTs) are nowadays considered some kind of “gold standard” in the social sciences and in policy making. Everything has to be “evidence based,” and the evidence preferably has to come from randomized experiments.

But randomization is basically – just like e.g. econometrics – a deductive method. Given warranted assumptions (manipulability, transitivity, separability, additivity, linearity, etc.), this method delivers deductive inferences. The problem, of course, is that we will never completely know when the assumptions are warranted, and a fortiori we are never fully able to justify our causal conclusions. Although randomization may contribute to controlling for “confounding,” it does not guarantee it, since genuine randomness presupposes infinite experimentation, and we know all real experimentation is finite. Even if randomization may help to establish average causal effects, it says nothing about individual effects unless homogeneity is added to the list of assumptions. Real target systems are seldom epistemically isomorphic to our axiomatic-deductive models/systems, and even if they were, we would still have to argue for the external validity of the conclusions reached from within these epistemically convenient models/systems. Causal evidence generated by randomization procedures may be valid in “closed” models, but what we usually are interested in is causal evidence in the real target system we happen to live in.
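
To make the point about average versus individual effects concrete, here is a toy simulation – entirely my own illustration, with made-up numbers: randomization recovers the average treatment effect, yet that average licenses no claim about any single individual when effects are heterogeneous.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
effect = rng.normal(loc=1.0, scale=5.0, size=n)  # heterogeneous individual effects
y0 = rng.normal(size=n)                          # potential outcome if untreated
y1 = y0 + effect                                 # potential outcome if treated

treated = rng.random(n) < 0.5                    # randomized assignment
# We only ever observe y1 for the treated and y0 for the controls:
ate_hat = y1[treated].mean() - y0[~treated].mean()

print(f"estimated average effect: {ate_hat:.2f} (true average: 1.0)")
print(f"share of individuals actually harmed: {(effect < 0).mean():.0%}")  # ~42%
```

The difference in means lands close to the true average effect of 1.0, while roughly four in ten individuals are in fact harmed by the treatment – and nothing in the RCT estimate itself reveals this.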

So RCTs are not at all the “gold standard” they have lately often been portrayed as. RCTs usually do not provide evidence that their results are exportable to other target systems. RCTs cannot be taken for granted to give generalizable results. That something works somewhere is no warranty that it will work for us, or even that it works generally.

Even though I can present evidence of being able to sharpen my pencils with Rube Goldberg’s ingenious construction – mainly because flying kites in my windy hometown (Lund, Sweden) is no match – it does not come with a warranted export license. Most people would probably find ordinary pencil sharpeners more efficacious.

When bad forecasts are better than good ones

21 March, 2013 at 10:59 | Posted in Economics | 1 Comment

What is worse, a bad forecast or no forecast? The answer is simple. The moment you are exposed to a forecast, you are in a worse position than you were before …

Expert forecasts in all likelihood do more harm than good. That is why it pays to quickly flip past newspaper articles with headlines like “How the stock market will perform this year” …

Imagine that your job is to handle your company’s currency exchanges … You must decide either to hedge the exchange rate right away, or to wait until the amount arrives and exchange at whatever rate prevails then … Luckily, you have the analysts’ dollar forecasts to help you. They do not make it one bit easier to predict the dollar rate. But they can help you all the same.

If you happen to get it right, the analyses won’t matter much. But if the dollar falls like a stone and you have chosen not to hedge the exchange rate, the company management will want to know why you have squandered the company’s money … You can spin a long tale about historical currency trends, economic growth, the balance of payments and interest-rate differentials. In the end, everyone will agree that you acted correctly in light of the information you had beforehand.

The analyses let you off the hook. Especially those that were most wrong … The forecasts have no economic value, either for the company or for society. Their value is that they save your skin.

Fund manager Alf Riple – formerly chief analyst at Nordea and adviser at the Norwegian Ministry of Finance – has written a superb book. Read it!

The solitary human being

21 March, 2013 at 09:54 | Posted in Varia | Comments Off on The solitary human being

I believe in the solitary human being,
in her who wanders alone,
who does not dog-like run to the scent,
who does not wolf-like flee from human scent:
At once human and anti-human.

How to reach community?
Flee the upper and outer road:
What is herd in others is herd also in you.
Take the lower and inner road:
What is bottom in you is bottom also in others.

Hard to get accustomed to oneself.
Hard to get unaccustomed to oneself.

Whoever does so shall nevertheless never be abandoned.
Whoever does so shall nevertheless always remain in solidarity.
The impractical is the only practical thing
in the long run.

Gunnar Ekelöf

IS-LM is bad economics no matter what Krugman says

20 March, 2013 at 14:26 | Posted in Economics | 13 Comments

Paul Krugman has a post up on his blog once again defending “the whole enterprise of Keynes/Hicks macroeconomic theory” and especially his own somewhat idiosyncratic version of IS-LM.

The main problem is simpliciter that there is no such thing as a Keynes-Hicks macroeconomic theory!

So, let us get some things straight.

There is nothing in the post-General Theory writings of Keynes to suggest that he considered Hicks’s IS-LM anywhere near a faithful rendering of his thought. In Keynes’s canonical statement of the essence of his theory, in the 1937 QJE article, there is nothing to even hint that Keynes would have regarded a Keynes-Hicks IS-LM theory as anything but pure nonsense. So of course there can’t be any “vindication for the whole enterprise of Keynes/Hicks macroeconomic theory” – simply because “Keynes/Hicks” never existed.

And it gets even worse!

John Hicks, the man who invented IS-LM in his 1937 Econometrica review of Keynes’s General Theory – Mr. Keynes and the “Classics”: A Suggested Interpretation – returned to it in a 1980 article in the Journal of Post Keynesian Economics – IS-LM: An Explanation. Self-critically he wrote:

I accordingly conclude that the only way in which IS-LM analysis usefully survives — as anything more than a classroom gadget, to be superseded, later on, by something better – is in application to a particular kind of causal analysis, where the use of equilibrium methods, even a drastic use of equilibrium methods, is not inappropriate. I have deliberately interpreted the equilibrium concept, to be used in such analysis, in a very stringent manner (some would say a pedantic manner) not because I want to tell the applied economist, who uses such methods, that he is in fact committing himself to anything which must appear to him to be so ridiculous, but because I want to ask him to try to assure himself that the divergences between reality and the theoretical model, which he is using to explain it, are no more than divergences which he is entitled to overlook. I am quite prepared to believe that there are cases where he is entitled to overlook them. But the issue is one which needs to be faced in each case.

When one turns to questions of policy, looking toward the future instead of the past, the use of equilibrium methods is still more suspect. For one cannot prescribe policy without considering at least the possibility that policy may be changed. There can be no change of policy if everything is to go on as expected – if the economy is to remain in what (however approximately) may be regarded as its existing equilibrium. It may be hoped that, after the change in policy, the economy will somehow, at some time in the future, settle into what may be regarded, in the same sense, as a new equilibrium; but there must necessarily be a stage before that equilibrium is reached …

I have paid no attention, in this article, to another weakness of IS-LM analysis, of which I am fully aware; for it is a weakness which it shares with General Theory itself. It is well known that in later developments of Keynesian theory, the long-term rate of interest (which does figure, excessively, in Keynes’ own presentation and is presumably represented by the r of the diagram) has been taken down a peg from the position it appeared to occupy in Keynes. We now know that it is not enough to think of the rate of interest as the single link between the financial and industrial sectors of the economy; for that really implies that a borrower can borrow as much as he likes at the rate of interest charged, no attention being paid to the security offered. As soon as one attends to questions of security, and to the financial intermediation that arises out of them, it becomes apparent that the dichotomy between the two curves of the IS-LM diagram must not be pressed too hard.

 
The editor of JPKE, Paul Davidson, gives the background to Hicks’s article:

I originally published an article about Keynes’s finance motive — which in 1937 Keynes added to his other liquidity preference motives (transactions, precautionary, speculative motives). I showed that adding this finance motive required Hicks’s IS and LM curves to be interdependent — and thus when the IS curve shifted, so would the LM curve.
Hicks and I then discussed this when we met several times.
When I first started to think about the ergodic vs. nonergodic dichotomy, I sent Hicks some preliminary drafts of articles I would be writing about nonergodic processes. Then John and I met several times to discuss this matter further, and I finally convinced him to write the article — which I published in the Journal of Post Keynesian Economics — in which he renounces the IS-LM apparatus. Hicks then wrote me a letter in which he said he thought the word nonergodic was wonderful and that he wanted to label his approach to macroeconomics as nonergodic!

So – back in 1937 John Hicks said that he was building a model of John Maynard Keynes’ General Theory. In 1980 he openly admits he wasn’t.

What Hicks acknowledges in 1980 is basically that his original review totally ignored the very core of Keynes’s theory – uncertainty. In doing so he actually set the train of macroeconomics on the wrong track for decades. It’s about time that neoclassical economists – Krugman, Mankiw, or whoever – set the record straight and stop promoting something that its creator himself admitted was a failure. Why not study the real thing itself – General Theory – in full, without looking the other way when it comes to non-ergodicity and uncertainty?

Paul Krugman persists in talking about a Keynes-Hicks IS-LM model that never really existed. It’s deeply disappointing. You would expect more from a Nobel prize winner.

The Swedish school – one of the most segregated in the world

20 March, 2013 at 10:29 | Posted in Education & School | Comments Off on The Swedish school – one of the most segregated in the world

International research points to the dangers of unrestricted freedom of choice in the Swedish school system. Since the independent-school reform in the early 1990s, the Swedish tax-funded school system has developed into the most market-oriented and competition-exposed in the world. According to Henry Levin, professor of the economics of education at Teachers College, Columbia University, in the USA, this carries great dangers, and he warns that the current Swedish school system is pushing segregation even further.

Listen to today’s P1 interview with Levin here.

Misunderstanding the p-value – here we go again

19 March, 2013 at 20:59 | Posted in Statistics & Econometrics | 6 Comments

A non-trivial part of teaching statistics consists of teaching students how to perform significance tests. A problem I have noticed repeatedly over the years, however, is that no matter how carefully you explicate what the probabilities generated by these statistical tests – p-values – really are, most students still misinterpret them.

Giving a statistics course for the Swedish National Research School in History, I asked the students at the exam to explain how one should correctly interpret p-values. Although the correct definition is p(data|null hypothesis), a majority of the students either misinterpreted the p-value as the likelihood of a sampling error (which of course is wrong, since the very computation of the p-value is based on the assumption that sampling error is what makes the sample statistic deviate from the null hypothesis) or as the probability of the null hypothesis being true, given the data (which of course is also wrong, since that is p(null hypothesis|data) rather than the correct p(data|null hypothesis)).

This is not to be blamed on the students’ ignorance, but rather on significance testing not being particularly transparent (conditional probability inference is difficult even for those of us who teach and practice it). A lot of researchers fall prey to the same mistakes. So – given that it is anyway very unlikely that any population parameter is exactly zero, and that, contrary to assumption, most samples in social science and economics are not random or do not have the right distributional shape – why continue to press students and researchers to do null hypothesis significance testing, testing that relies on a weird backward logic that students and researchers usually don’t understand?
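
The backward logic is easy to demonstrate with a toy simulation – my own construction, with an arbitrarily assumed 80% share of true nulls and an arbitrary effect size – showing that p(data|null hypothesis) can sit at 0.05 while p(null hypothesis|data) is something else entirely:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_tests, n_obs = 10_000, 20
null_true = rng.random(n_tests) < 0.8     # assume 80% of tested effects are truly zero
mu = np.where(null_true, 0.0, 0.5)        # true mean: 0 under the null, 0.5 otherwise

data = rng.normal(loc=mu[:, None], scale=1.0, size=(n_tests, n_obs))
t, p = stats.ttest_1samp(data, popmean=0.0, axis=1)  # p = p(data this extreme | null)

significant = p < 0.05
print(f"p(null true | p < 0.05) = {null_true[significant].mean():.2f}")  # ~0.26, not 0.05
```

Even with a conventional 5% threshold, roughly a quarter of the “significant” results in this setup come from true nulls – which is exactly why the p-value must not be read as the probability that the null hypothesis is true.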

That media often misunderstand what p-values and significance testing are all about is well-known. Andrew Gelman gives a recent telling example:

The New York Times has a feature in its Tuesday science section, Take a Number … Today’s column, by Nicholas Bakalar, is in error. The column begins:

“When medical researchers report their findings, they need to know whether their result is a real effect of what they are testing, or just a random occurrence. To figure this out, they most commonly use the p-value.”

This is wrong on two counts. First, whatever researchers might feel, this is something they’ll never know. Second, results are a combination of real effects and chance, it’s not either/or.

Perhaps the above is a forgivable simplification, but I don’t think so; I think it’s a simplification that destroys the reason for writing the article in the first place. But in any case I think there’s no excuse for this, later on:

“By convention, a p-value higher than 0.05 usually indicates that the results of the study, however good or bad, were probably due only to chance.”

This is the old, old error of confusing p(A|B) with p(B|A). I’m too rushed right now to explain this one, but it’s in just about every introductory statistics textbook ever written. For more on the topic, I recommend my recent paper, P Values and Statistical Practice, which begins:

“The casual view of the P value as posterior probability of the truth of the null hypothesis is false and not even close to valid under any reasonable model, yet this misunderstanding persists even in high-stakes settings … The formal view of the P value as a probability conditional on the null is mathematically correct but typically irrelevant to research goals (hence, the popularity of alternative—if wrong—interpretations) …”

I can’t get too annoyed at science writer Bakalar for garbling the point—it confuses lots and lots of people—but, still, I hate to see this error in the newspaper.

On the plus side, if a newspaper column runs 20 times, I guess it’s ok for it to be wrong once—we still have 95% confidence in it, right?

Statistical significance doesn’t say that something is important or true. And since there are already far better and more relevant tests that can be done (see e.g. here and here), it is high time to give up on this statistical fetish.

The limits to probabilistic reasoning

19 March, 2013 at 17:40 | Posted in Statistics & Econometrics, Theory of Science & Methodology | Comments Off on The limits to probabilistic reasoning

Probabilistic reasoning in science – especially Bayesianism – reduces questions of rationality to questions of internal consistency (coherence) of beliefs, but – even granting this questionable reductionism – it is not self-evident that rational agents really have to be probabilistically consistent. There is no strong warrant for believing so. Rather, there is strong evidence that we run into huge problems if we let probabilistic reasoning become the dominant method for doing research in the social sciences on problems that involve risk and uncertainty.

In many of the situations that are relevant to economics, one could argue that there is simply not enough adequate and relevant information to ground beliefs of a probabilistic kind, and that in those situations it is not really possible, in any relevant way, to represent an individual’s beliefs in a single probability measure.

Say you have come to learn (based on your own experience and tons of data) that the probability of you becoming unemployed in Sweden is 10%. Having moved to another country (where you have no experience of your own and no data), you have no information on unemployment and a fortiori nothing on which to build any probability estimate. A Bayesian would, however, argue that you would have to assign probabilities to the mutually exclusive alternative outcomes, and that these have to add up to 1 if you are rational. That is, in this case – and based on symmetry – a rational individual would have to assign a probability of 10% to becoming unemployed and 90% to becoming employed.

That feels intuitively wrong, though, and I guess most people would agree. Bayesianism cannot distinguish between symmetry-based probabilities grounded in information and symmetry-based probabilities grounded in an absence of information. In these kinds of situations most of us would rather say that it is simply irrational to be a Bayesian, and better instead to admit that we “simply do not know” or that we feel ambiguous and undecided. Arbitrary and ungrounded probability claims are more irrational than being undecided in the face of genuine uncertainty, so if there is not sufficient information to ground a probability distribution it is better to acknowledge that simpliciter, rather than pretending to possess a certitude that we simply do not possess.

I think this critique of Bayesianism is in accordance with the views of Keynes’s A Treatise on Probability (1921) and General Theory (1937). According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but “rational expectations.” Sometimes we “simply do not know.” Keynes would not have accepted the view of Bayesian economists, according to whom expectations “tend to be distributed, for the same information set, about the prediction of the theory.” Keynes, rather, thinks that we base our expectations on the confidence or “weight” we put on different events and alternatives. To Keynes, expectations are a question of weighing probabilities by “degrees of belief” – beliefs that have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents modeled by probabilistically reasoning Bayesian economists.
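
To illustrate the “weight” point with a deliberately simple sketch – my own, with invented numbers: two Bayesian beliefs can assign the very same 10% probability to becoming unemployed while resting on utterly different amounts of evidence, and the single probability number cannot register the difference.

```python
from scipy import stats

# Same mean probability (10%), very different evidential weight behind it
informed = stats.beta(10, 90)    # as if grounded in roughly 100 observations
ignorant = stats.beta(0.1, 0.9)  # same mean, almost no underlying evidence

for name, belief in [("informed", informed), ("ignorant", ignorant)]:
    lo, hi = belief.ppf(0.05), belief.ppf(0.95)
    print(f"{name}: mean = {belief.mean():.2f}, 90% interval = ({lo:.3f}, {hi:.3f})")
```

Both beliefs would act identically in a single forced bet, which is precisely why coherence alone cannot capture Keynes’s distinction between a probability and the weight of the evidence behind it.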

In an interesting article on his blog, John Kay shows that these strictures on probabilistic-reductionist reasoning apply not only to everyday life and science, but also to the law:

English law recognises two principal standards of proof. The criminal test is that a charge must be “beyond reasonable doubt”, while civil cases are decided on “the balance of probabilities”.

The meaning of these terms would seem obvious to anyone trained in basic statistics. Scientists think in terms of confidence intervals – they are inclined to accept a hypothesis if the probability that it is true exceeds 95 per cent. “Beyond reasonable doubt” appears to be a claim that there is a high probability that the hypothesis – the defendant’s guilt – is true. Perhaps criminal conviction requires a higher standard than the scientific norm – 99 per cent or even 99.9 per cent confidence is required to throw you in jail. “On the balance of probabilities” must surely mean that the probability the claim is well founded exceeds 50 per cent.

And yet a brief conversation with experienced lawyers establishes that they do not interpret the terms in these ways. One famous illustration supposes you are knocked down by a bus, which you did not see (that is why it knocked you down). Say Company A operates more than half the buses in the town. Absent other evidence, the probability that your injuries were caused by a bus belonging to Company A is more than one half. But no court would determine that Company A was liable on that basis.

A court approaches the issue in a different way. You must tell a story about yourself and the bus. Legal reasoning uses a narrative rather than a probabilistic approach, and when the courts are faced with probabilistic reasoning the result is often a damaging muddle …

When I have raised these issues with people with scientific training, they tend to reply that lawyers are mostly innumerate and with better education would learn to think in the same way as statisticians. Probabilistic reasoning has become the dominant method of structured thinking about problems involving risk and uncertainty – to such an extent that people who do not think this way are derided as incompetent and irrational …

It is possible – common, even – to believe something is true without being confident in that belief. Or to be sure that, say, a housing bubble will burst without being able to attach a high probability to any specific event, such as “house prices will fall 20 per cent in the next year”. A court is concerned to establish the degree of confidence in a narrative, not to measure a probability in a model.

Such narrative reasoning is the most effective means humans have developed of handling complex and ill-defined problems … Probabilistic thinking … often fails when we try to apply it to idiosyncratic events and open-ended problems. We cope with these situations by telling stories, and we base decisions on their persuasiveness. Not because we are stupid, but because experience has told us it is the best way to cope. That is why novels sell better than statistics texts.

Guess I must be doing something right

19 March, 2013 at 11:31 | Posted in Varia | Comments Off on Guess I must be doing something right

Yours truly launched this blog two years ago. The number of visitors has increased steadily. From only a couple of hundred visits per month at the start, I now get almost 60,000 visits per month. A blog is surely not a beauty contest, but given the rather “wonkish” character of the blog – with posts mostly on economic theory, statistics, econometrics, theory of science and methodology – it’s rather gobsmacking that so many are interested and take the time to read and comment on it. I am – of course – truly awed, honoured and delighted!

Insupportable equilibrium

19 March, 2013 at 10:57 | Posted in Economics, Theory of Science & Methodology | Comments Off on Insupportable equilibrium

Theoretical physicist Mark Buchanan has some interesting reflections on equilibrium thought in economics in his upcoming book Forecast: What Physics, Meteorology and the Natural Sciences Can Teach Us About Economics:

For several decades, academics have assumed that the economy is in a stable equilibrium. Distilled into a few elegant lines of mathematics by the economists Kenneth Arrow and Gerard Debreu back in the 1950s, the assumption has driven most thinking about business cycles and financial markets ever since. It informs the idea, still prevalent on Wall Street, that markets are efficient — that the greedy efforts of millions of individuals will inevitably push prices toward some true fundamental value.

Problem is, all efforts to show that a realistic economy might actually reach something like the Arrow-Debreu equilibrium have met with failure. Theorists haven’t been able to prove that even trivial, childlike models of economies with only a few commodities have stable equilibria. There is no reason to think that the equilibrium so prized by economists is anything more than a curiosity.

It’s as if mathematical meteorologists found beautiful equations for a glorious atmospheric state with no clouds or winds, no annoying rain or fog, just peaceful sunshine everywhere. In principle, such an atmospheric state might exist, but it tells us nothing about the reality we care about: our own weather …

We’ll never understand economies and markets until we get over the nutty idea that they alone — unlike almost every other complex system in the world — are inherently stable and have no internal weather. It’s time we began learning about the socioeconomic weather, categorizing its storms, and learning either how to prevent them or how to see them coming and protect ourselves against them.

Cuts – the wrong cure

19 March, 2013 at 08:28 | Posted in Economics, Politics & Society | Comments Off on Cuts – the wrong cure

 

Inequality continues to grow – even in Sweden

18 March, 2013 at 19:21 | Posted in Economics, Politics & Society | 4 Comments

Inequality continues to grow all over the world.
And in case you think it’s different in, e.g., Sweden, you should take a look at some new data from Statistics Sweden.

The Gini coefficient is a measure of inequality (a higher number signifies greater inequality), and graphing with Gretl we get this for the disposable income distribution:
[Figure: Gini coefficient for Swedish disposable income, 1980–2011]

Sometimes a graph says more than a thousand words …
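
For anyone who wants to compute this kind of series themselves, here is a minimal sketch – in Python rather than Gretl, and with invented income vectors (the actual series above is built from Statistics Sweden’s data):

```python
import numpy as np

def gini(incomes):
    """Gini coefficient computed from the sorted income vector."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = x.size
    # G = 2*sum(i * x_i)/(n * sum(x)) - (n + 1)/n, with ranks i = 1..n
    return 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

print(gini([10, 10, 10, 10]))  # 0.0  -- perfect equality
print(gini([1, 2, 3, 94]))     # 0.70 -- most income held by one person
```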

I would say that what we see happening in Sweden is deeply disturbing. The rising inequality is outrageous – not least since it is to a large extent a matter of income and wealth increasingly being concentrated in the hands of a very small and privileged elite.

Societies where we allow inequality of incomes and wealth to increase without bounds sooner or later implode. The cement that keeps us together erodes, and in the end we are left only with people dipped in the ice-cold water of egoism and greed. It’s high time to put an end to this, the worst Juggernaut of our time!

EconTalk transmogrifies Keynes

18 March, 2013 at 14:20 | Posted in Economics | Comments Off on EconTalk transmogrifies Keynes

Yesterday, on my way home by train after a conference in Stockholm, I tried to beguile the journey by listening to an episode of EconTalk in which Garett Jones of George Mason University talked with EconTalk host Russ Roberts about Irving Fisher’s ideas on debt and deflation.

Jones’s thoughts on Fisher were thought-provoking and interesting, but in the middle of the discussion Roberts started asking questions about the relation between Fisher’s ideas and those of Keynes, saying more or less: “Keynes generated a lot of interest in his idea that the labour market doesn’t clear … because the price of labour does not adjust, i.e. wages are ‘sticky’ or ‘inflexible’.”

This is of course pure nonsense. For although Keynes devoted substantial attention in General Theory to the subject of wage rigidities, he certainly did not hold the view that wage rigidity was the reason behind high unemployment and other macroeconomic problems. To Keynes, recessions, depressions and faltering labour markets were not fundamentally a problem of “sticky wages.”

Since unions/workers, contrary to classical assumptions, strike wage bargains in nominal terms, they will – according to Keynes – accept lower real wages caused by higher prices, but resist lower real wages caused by lower nominal wages. However, Keynes held it incorrect to attribute “cyclical” unemployment to this asymmetric behaviour. During the depression money wages fell significantly and – as Keynes noted – unemployment still grew. Thus, even when nominal wages are lowered, they do not generally lower unemployment.

In any specific labour market, lower wages could, of course, raise the demand for labour. But a general reduction in money wages would leave real wages more or less unchanged. The reasoning of the classical economists was, according to Keynes, a flagrant example of the “fallacy of composition.” By assuming that, because unions/workers in one specific labour market could negotiate real-wage reductions by lowering nominal wages, unions/workers in general could do the same, the classics confused micro with macro.
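
A stylized numerical sketch of that composition point – my own illustration, on the assumption that prices move one-for-one with economy-wide unit labour costs, which is the mechanism the argument appeals to:

```python
# My own illustrative numbers; assumes prices fall in step with economy-wide
# unit labour costs when all money wages are cut together.
w, P = 100.0, 1.0                    # nominal wage and price level
real_wage_before = w / P

w_new = 0.9 * w                      # all money wages cut by 10%
P_new = 0.9 * P                      # prices follow unit labour costs down
real_wage_after = w_new / P_new

print(real_wage_after == real_wage_before)  # True: the real wage is unchanged
```

A single firm or sector that cuts nominal wages does lower its relative real wage; the whole economy cutting together does not – which is why the inference from micro to macro fails.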

Lowering nominal wages could not – according to Keynes – clear the labour market. Lowering wages – and possibly prices – could perhaps lower interest rates and increase investment. But to Keynes it would be much easier to achieve that effect by increasing the money supply. In any case, wage reductions were not seen by Keynes as a general substitute for an expansionary monetary or fiscal policy.

Even if lowering wages could have some positive effects, the negative impacts weigh more heavily – deteriorating management-union relations, expectations of continued wage cuts leading to postponed investment, debt deflation, et cetera.

So what Keynes actually argued in General Theory was that the classical proposition – that lowering wages would lower unemployment and ultimately lift economies out of depression – was ill-founded and basically wrong.

To Keynes, flexible wages would only make things worse, by leading to erratic price fluctuations. The basic explanation for unemployment is insufficient aggregate demand, and that is mostly determined outside the labour market.

To mainstream neoclassical theory the kind of unemployment that occurs is voluntary, since it consists only of the adjustments of hours of work that optimizing agents make to maximize their utility. Keynes, on the other hand, writes in General Theory:

The classical school [maintains that] while the demand for labour at the existing money-wage may be satisfied before everyone willing to work at this wage is employed, this situation is due to an open or tacit agreement amongst workers not to work for less, and that if labour as a whole would agree to a reduction of money-wages more employment would be forthcoming. If this is the case, such unemployment, though apparently involuntary, is not strictly so, and ought to be included under the above category of ‘voluntary’ unemployment due to the effects of collective bargaining, etc …

The classical theory … is best regarded as a theory of distribution in conditions of full employment. So long as the classical postulates hold good, unemployment, which is in the above sense involuntary, cannot occur. Apparent unemployment must, therefore, be the result either of temporary loss of work of the ‘between jobs’ type or of intermittent demand for highly specialised resources or of the effect of a trade union ‘closed shop’ on the employment of free labour. Thus writers in the classical tradition, overlooking the special assumption underlying their theory, have been driven inevitably to the conclusion, perfectly logical on their assumption, that apparent unemployment (apart from the admitted exceptions) must be due at bottom to a refusal by the unemployed factors to accept a reward which corresponds to their marginal productivity …

Obviously, however, if the classical theory is only applicable to the case of full employment, it is fallacious to apply it to the problems of involuntary unemployment – if there be such a thing (and who will deny it?). The classical theorists resemble Euclidean geometers in a non-Euclidean world who, discovering that in experience straight lines apparently parallel often meet, rebuke the lines for not keeping straight – as the only remedy for the unfortunate collisions which are occurring. Yet, in truth, there is no remedy except to throw over the axiom of parallels and to work out a non-Euclidean geometry. Something similar is required to-day in economics. We need to throw over the second postulate of the classical doctrine and to work out the behaviour of a system in which involuntary unemployment in the strict sense is possible.

Unfortunately, Roberts’s statement is not the only example of this kind of utter nonsense on Keynes. Similar distortions of Keynes’s views can be found in, e.g., the economics textbooks of the “New Keynesian” – a grotesque misnomer – Greg Mankiw. How is this possible? Probably because these economists have but a very superficial acquaintance with Keynes’s own works, and rather depend on second-hand sources like Hansen, Samuelson, Hicks and the like.

Fortunately there is a solution to the problem. Keynes’s books are still in print. Read them!!

Inequality and well-being

16 March, 2013 at 11:42 | Posted in Economics, Politics & Society | Comments Off on Inequality and well-being

[Figure: inequality and well-being (Pickett). Source: see the original post.]

How to argue with economists

15 March, 2013 at 08:57 | Posted in Economics | 1 Comment

In the increasingly contentious world of pop economics, you … may find yourself in an argument with an economist. And when this happens, you should be prepared, because many of the arguments that may seem at first blush to be very powerful and devastating are, in fact, pretty weak tea …
Principle 1: Credentials are not an argument.

Example: “You say Theory X is wrong…but don’t you know that Theory X is supported by Nobel Prize winners A, B, and C, not to mention famous and distinguished professors D, E, F, G, and H?”

Suggested Retort: Loud, barking laughter.

Alternative Suggested Retort: “Richard Feynman said that ‘Science is the belief in the ignorance of experts.’ And you’re not going to argue with HIM, are you?”

Reason You’re Right: Credentials? Gimme a break. Nobody accepts received wisdom from sages these days. Show me the argument!

Principle 2: “All theories are wrong” is false.

Example: “Sure, Theory X fails to forecast any variable of interest or match important features of the data. But don’t you know that all models are wrong? I mean, look at Newton’s Laws…THOSE ended up turning out to be wrong, ha ha ha.”

Suggested Retort: Empty an entire can of Silly String onto anyone who says this. (I carry Silly String expressly for this purpose.)

Alternative Suggested Retort: “Yeah, well, when your theory is anywhere near as useful as Newton’s Laws, come back and see me, K?”

Reason You’re Right: To say models are “wrong” is fatuous semantics; philosophically, models can only have degrees of predictive power within domains of validity. Newton’s Laws are only “wrong” if you are studying something very small or moving very fast. For most everyday applications, Newton’s Laws are very, very right.

Principle 3: “We have theories for that” is not good enough.

Example: “How can you say that macroeconomists have ignored Phenomenon X? We have theories in which X plays a role! Several, in fact!”

Suggested Retort: “Then how come no one was paying attention to those theories before Phenomenon X emerged and slapped us upside the head?”

Reason You’re Right: Actually, there are two reasons. Reason 1 is that it is possible to make many models to describe any phenomenon, and thus there is no guarantee that Phenomenon X is correctly described by Theory Y rather than some other theory, unless there is good solid evidence that Theory Y is right, in which case economists should be paying a lot more attention to Theory Y. Reason 2 is that if the profession doesn’t have a good way to choose which theories to apply and when, then simply having a bunch of theories sitting around gathering dust is a little pointless.

Principle 4: Argument by accounting identity almost never works.

Example: “But your theory is wrong, because Y = C + I + G!”

Suggested Retort: “If my theory violates an accounting identity, wouldn’t people have noticed that before? Wouldn’t this fact be common knowledge?”

Reason You’re Right: Accounting identities are mostly just definitions. Very rarely do definitions tell us anything useful about the behavior of variables in the real world. The only exception is when you have a very good understanding of the behavior of all but one of the variables in an accounting identity, in which case of course it is useful. But that is a very rare situation indeed.

Principle 5: The Efficient Markets Hypothesis does not automatically render all models useless.

Example: “But if your model could predict financial crises, then people could use it to conduct a riskless arbitrage; therefore, by the EMH, your model cannot predict financial crises.”

Suggested Retort: “By your logic, astrophysics can never predict when an asteroid is going to hit the Earth.”

Reason You’re Right: Conditional predictions are different than unconditional predictions. A macro model that is useful for making policy will not say “Tomorrow X will happen.” It will say “Tomorrow X will happen unless you do something to stop it.” If policy is taken to be exogenous to a model (a “shock”), then the EMH does not say anything about whether you can see an event coming and do something about it.

Principle 6: Models that only fit one piece of the data are not very good models.

Example: “Sure, this model doesn’t fit facts A, B, and C, but it does fit fact D, and therefore it is a ‘laboratory’ that we can use to study the impact of changes in the factors that affect D.”

Suggested Retort: “Nope!”

Reason You’re Right: Suppose you make a different model to fit each phenomenon. Only if all your models don’t interact will you be able to use each different model to study its own phenomenon. And this is highly unlikely to happen. Also, it’s generally pretty easy to make a large number of different models that fit any one given fact, but very hard to make models that fit a whole bunch of facts at once. For these reasons, many philosophers of science claim that science theories should explain a whole bunch of phenomena in terms of some smaller, simpler subset of underlying phenomena. Or, in other words, wrong theories are wrong.

Principle 7: The message is not the messenger.

Example: “Well, that argument is being made by Person X, who is obviously just angry/a political hack/ignorant/not a real economist/a commie/stupid/corrupt.”

Suggested Retort: “Well, now it’s me making the argument! So what are you going to say about me?”

Reason You’re Right: This should be fairly obvious, but people seem to forget it. Even angry hackish ignorant stupid communist corrupt non-economists can make good cogent correct arguments (or, at least, repeat them from some more reputable source!). Arguments should be argued on the merits. This is the converse of Principle 1.

There are, of course, a lot more principles than these … The set of silly things that people can and will say to try to beat an interlocutor down is, well, very large. But I think these seven principles will guard you against much of the worst of the silliness.

Noah Smith
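
Principle 4 is perhaps worth a concrete illustration. Here is a trivial sketch of my own (toy numbers, nothing from Smith's post) of why an accounting identity, being true by construction, cannot discriminate between behavioural theories:

    # Toy sketch: Y = C + I + G is a definition, so it holds in any
    # economy whatsoever and carries no behavioural information.
    def national_income(C, I, G):
        Y = C + I + G           # the identity defines Y...
        assert Y == C + I + G   # ...so this check can never fail
        return Y

    # Two toy economies with completely different structure satisfy
    # the identity equally well (all numbers hypothetical):
    print(national_income(C=80, I=10, G=10))   # consumption-driven: Y = 100
    print(national_income(C=10, I=80, G=10))   # investment-driven:  Y = 100

Whatever behavioural theory you hold, the identity is satisfied – which is precisely why invoking it settles nothing.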

Lönesänkarna

13 March, 2013 at 16:11 | Posted in Economics, Politics & Society | Comments Off on Lönesänkarna

After SVT's much-discussed documentary Lönesänkarna ("The Wage Cutters" – see here), one of Svenska Dagbladet's editorial writers, Ivar Arpi, wrote that in the "1970s the capital share, compared with the wage share, was almost zero" and that since the 1970s "the great majority have become better off – including those with the lowest incomes."

And this sort of drivel we are expected to read in the year 2013.

And as if that were not enough, Per Krusell – member of the Royal Swedish Academy of Sciences and, since 2004, a member of the Committee for the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel – writes that he watched the programme and that the horror he felt was "the feeling of, as an economist, having missed something in our analysis of the economy." After pondering for a while, however, the professor felt considerably better, having realized that "there is very little to the programme's main thesis, i.e. that an 'agreement' was reached to lower the wage share, perhaps with the aim of raising employment, but that the rise in employment then failed to materialize."

Talk is cheap, but it is better to know what you are talking about and to be able to back up your claims. For this is what the data actually look like:

Source: SWIID 3.0

(The lower the Gini coefficient, the more equal the income distribution. Since 1981 income inequality in Sweden has trended steeply upwards.)
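
For readers who want to see the mechanics behind the measure, here is a minimal sketch of how a Gini coefficient can be computed from individual incomes (toy numbers of my own, not the SWIID series):

    # Minimal sketch: population Gini coefficient from a list of incomes.
    def gini(incomes):
        xs = sorted(incomes)
        n = len(xs)
        # Standard formula: sum over i of (2i - n - 1) * x_i, normalized.
        weighted = sum((2 * i - n - 1) * x for i, x in enumerate(xs, start=1))
        return weighted / (n * sum(xs))

    print(gini([10, 10, 10, 10]))   # 0.0  -> perfectly equal incomes
    print(gini([0, 0, 0, 100]))     # 0.75 -> one person has everything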

Source: IFN, Roine and Waldenström 2008
Graphics: Idégrafik

In the economic-policy debate one often hears the champions of market fundamentalism – such as Svenska Dagbladet's editorial writers – claim that inequality is not a problem. Essentially two reasons are given.

Pro primo – the horse-shit theorem – according to which tax cuts and increased wealth for the rich will eventually trickle down to the poor. Feed the horse and the birds can eat their fill from the droppings.

Pro secundo – that as long as everyone has the same chance of becoming rich, inequality is unproblematic.

As for the horse-shit theorem (the "trickle-down effect"), extensive research showed already in the Thatcher-Reagan era that it belongs to the world of myth.

And a few years ago Alan Krueger – professor of economics at Princeton University – showed with his Gatsby curve that the second defence of inequality also belongs in the world of fairy tales:

[The vertical axis shows how much a one per cent increase in your father's income affects your expected income (the higher the number, the lower the expected social mobility); the horizontal axis shows the Gini coefficient, which measures inequality (the higher the number, the greater the inequality).]
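
The number on the vertical axis is the so-called intergenerational income elasticity – the slope obtained when regressing log child income on log father income. A minimal sketch of how such an elasticity is estimated, on simulated toy data rather than Krueger's, could look like this:

    # Sketch: estimating an intergenerational elasticity (IGE) by
    # regressing log child income on log father income. Toy data only.
    import numpy as np

    rng = np.random.default_rng(0)
    log_father = rng.normal(10.0, 0.5, size=1000)
    # Simulated "sticky" society: half of (log) income is inherited.
    log_child = 0.5 * log_father + rng.normal(5.0, 0.4, size=1000)

    ige, _intercept = np.polyfit(log_father, log_child, deg=1)
    print(round(ige, 2))  # ~0.5: a 1% higher father's income predicts
                          # roughly 0.5% higher expected child income

The higher this slope, the more a child's economic fate is tied to the parental home – that is, the lower the social mobility.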

It could hardly be made clearer that the egalitarian countries are also those with the greatest social mobility – and that it is therefore high time to do something about the widening income and wealth gaps. This holds for Sweden too, where newly revised data from SCB show how disposable income per consumption unit (excluding capital gains, by decile, all persons 1995-2010, means in thousands of SEK per consumption unit at 2010 prices) has developed in recent years:

Source: SCB and own calculations

And the picture is even bleaker if one looks at the development of wealth.

Inequality is growing in Sweden. To a large extent this is the result of political decisions – such as the cuts in unemployment and sickness insurance. But it is also the expression of an ideological shift that over thirty years has turned Sweden from a pioneer of equality into one of the countries where income and wealth gaps are growing fastest in the world.

For those who, like Ivar Arpi and some more or less prominent economists, do not believe that things are that bad in Sweden when it comes to the distribution of income and wealth, I suggest a closer look at the diagram below, showing how mean incomes (expressed at 2009 prices) of the top 0.1% and the bottom 90% have developed in Sweden since 1980:


Source: The World Top Incomes Database

It is high time to put a stop to the new class society. It is high time to ensure that the gaps stop growing in Swedish society. It is high time for the neoliberal regime shift in Sweden to come to an end!

Mästarens återkomst

12 March, 2013 at 22:40 | Posted in Economics | 4 Comments

For anyone who wants to understand and be able to explain financial and economic crises, Robert Skidelsky's Mästarens återkomst (Karnevals förlag, 2011) – the Swedish edition of Keynes: The Return of the Master – is an excellent starting point. By bringing out Keynes's theories of the role of uncertainty and expectations, Skidelsky provides a salutary counter-image to the unworldly, model-bound picture of the economy painted by the ruling economic theory.
Skidelsky shows convincingly that the standard-bearers of neoclassical macro theory – whether a Robert Lucas, a Thomas Sargent or a Greg Mankiw – have not only failed to predict or explain the current crisis. With their theories and models they have in fact actively contributed to it. In his preface the author writes:

Some reviewers have accused me of giving a vulgar version of the orthodox theories that I regard as the very origin of the crisis. It has been claimed that I have not taken sufficient account of the qualifications and exceptions to the efficient-markets theory acknowledged by its own academic champions, nor of the many varied opinions that exist within the economics profession. My defence against the latter accusation is that it is the Chicago school's theories that have dominated during the past 30 years, and that those who have thought otherwise have been marginalized within the profession. As for the first charge, it is always in their vulgar version that theories are applied, and it ought to be a test of good economic theory that its vulgarization does not lead to lousy policy.

Sporadic blogging

12 March, 2013 at 11:48 | Posted in Varia | Comments Off on Sporadic blogging

Touring again. This time conferencing in the most beautiful capital in the world – Stockholm.
Regular blogging will be resumed early next week.
[Photo: Södermalm, Stockholm]

Naked and self-revealing

11 March, 2013 at 17:54 | Posted in Varia | Comments Off on Naked and self-revealing

Naked and self-revealing music.
A straight right to the solar plexus.
The darkness of the heart.
Peter LeMarc impressed twenty-five years ago. He still does.

