Anders Borg barks – but who cares?

29 February, 2012 at 12:19 | Posted in Varia | Comments Off on Anders Borg barks – but who cares?


The ECB declares Greece bankrupt!

28 February, 2012 at 12:01 | Posted in Economics | Comments Off on The ECB declares Greece bankrupt!

A press release from the ECB now makes clear that Greece has, in effect, been declared bankrupt. This comes after the latest downgrade by the rating agency Standard & Poor’s, which announced last night that Greece’s credit rating is being lowered to “selective default” from the previous CC long-term and C short-term ratings:

28 February 2012 – Eligibility of Greek bonds used as collateral in Eurosystem monetary policy operations
The Governing Council of the European Central Bank (ECB) has decided to temporarily suspend the eligibility of marketable debt instruments issued or fully guaranteed by the Hellenic Republic for use as collateral in Eurosystem monetary policy operations. This decision takes into account the rating of the Hellenic Republic as a result of the launch of the private sector involvement offer.

At the same time, the Governing Council decided that the liquidity needs of affected Eurosystem counterparties can be satisfied by the relevant national central banks, in line with relevant Eurosystem arrangements (emergency liquidity assistance).

Private equity and welfare

28 February, 2012 at 10:31 | Posted in Politics & Society | Comments Off on Private equity and welfare


What is satire and what is reality? Sometimes one wonders. One person who certainly has mastered satire is Max Gustafson. Brilliant!

Today’s scarce commodity – competent ministers

27 February, 2012 at 20:58 | Posted in Economics | 8 Comments

New figures from the Swedish Public Employment Service (Arbetsförmedlingen) show that today fewer than four in ten unemployed receive unemployment insurance benefits. In a more than usually confused interview on Ekot the other day, Minister for Employment Hillevi Engström commented on the figures. The always readable blog Storstad took the time to check the facts – unlike the minister, who offered little more than a steady stream of empty talk:

Hillevi Engström: “My big task is to make sure that people can enter the labour market and support themselves.”

The facts: Unemployment in September 2006, when the centre-right government took over: 6.1 per cent. In January 2012: 8.0 per cent.

Over the same period the employment rate has fallen from 65.8 per cent to 63.7 per cent. Source: Statistics Sweden (SCB/AKU).

Hillevi Engström: “Ultimately we have a safety net in society that works, and that is how it should be.”

The facts: In 2006, 70 per cent of the unemployed received unemployment insurance benefits. Today the figure is 36 per cent, DN reports.

The unemployment and sickness insurance systems no longer work as safety nets. Instead people have to turn to the last outpost, social assistance: “Four in ten of those receiving social assistance are job seekers […] People who are on sick leave or receive sickness or activity compensation and need social assistance make up 14 per cent,” SKL wrote in a report in October 2010. “Those who cannot support themselves because of social problems make up only eleven per cent.”

Hillevi Engström: “But the unemployment insurance itself is an income-loss insurance, meant to insure your income between two jobs.”

The facts: Since 2006, the share of benefit recipients who actually receive 80 per cent of their previous income has fallen from roughly 40 per cent to 12 per cent.

Isn’t it just great to live in a country governed by such competent and well-read ministers? Or is it?

Freedom of choice

27 February, 2012 at 17:14 | Posted in Politics & Society | Comments Off on Freedom of choice

Real freedom is not getting to choose.

Real freedom is being able to choose which choices are to be made.

Wash your ears with Aurora

26 February, 2012 at 17:57 | Posted in Varia | Comments Off on Wash your ears with Aurora

In times like these – when the soundscape is drowned in the blathering verbal diarrhoea of commercial radio and the utterly vacuous drivel of Melodifestivalen – one has almost given up. But there is light in the darkness! Every morning on Swedish Radio’s P2 runs Aurora, a programme of refreshment and serious music. Wonderful. So take the chance to start the day with a musical ear-wash and clear the ear canals of any lingering musical slag. On Aurora you can, for example, listen to Arvo Pärt’s “Spiegel im Spiegel” from his “Portrait”. Listening to such music for eight minutes gives the mind peace and makes hope return. Thank you, public service radio! Thank you, Arvo!

Of course we looooove Melodifestivalen

25 February, 2012 at 23:20 | Posted in Varia | Comments Off on Of course we looooove Melodifestivalen

“Musical entertainment” is for many a way of escaping everyday life through synthetic daydreams. With programmes like Melodifestivalen, culture has renounced its autonomy in order – as Adorno and Horkheimer put it – to “proudly take its place among the consumption goods.” This kind of “music programme” – where nowadays even tone-deaf authors can make their fortune – is one of the most visible signs of the barbaric decay of culture under the market’s colonisation of our lifeworld. Here the non plus ultra of superficiality parades as the undifferentiated mush it is.

Three million flies can’t be wrong – eat shit!

Fun with Statistics

25 February, 2012 at 18:35 | Posted in Statistics & Econometrics | Comments Off on Fun with Statistics

When giving courses in statistics and econometrics I usually – especially on introductory and intermediate levels – encourage my students to use the web as a complement to the textbooks. A fun way to help you learn statistics is to use YouTube. Check out, for example, StatisticsFun, where you can watch instructive videos on regression analysis, hypothesis testing, ANOVA and a lot more in the statistical toolbox.
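If you want to try the tools out yourself rather than just watch, a few lines of Python go a long way. The sketch below is only an illustration on made-up data – it estimates a simple linear regression with scipy’s linregress and reports the usual test statistics:

```python
# A minimal sketch: ordinary least squares on made-up data,
# of the kind covered in an introductory statistics course.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=x.size)  # true intercept 2, slope 0.5

res = stats.linregress(x, y)
print(f"intercept = {res.intercept:.2f}, slope = {res.slope:.2f}")
print(f"R^2 = {res.rvalue**2:.2f}, p-value for the slope = {res.pvalue:.3g}")
```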

Leibniz vs. Newton

25 February, 2012 at 10:08 | Posted in Theory of Science & Methodology | 3 Comments

Profits in welfare – redux

24 February, 2012 at 16:24 | Posted in Politics & Society | Comments Off on Profits in welfare – redux

The Social Democratic journal of ideas and debate, Tiden, has long been dull, uninteresting and lacking in intellectual vigour.
But with the new editor Daniel Suhonen something seems to have happened. Good! And this picture surely shows that the journal really is prepared to take up the fight with the gravediggers of the welfare state …

Welfare cannot be bought

24 February, 2012 at 14:01 | Posted in Politics & Society | Comments Off on Welfare cannot be bought

Lena Sommestad, chair of the Social Democratic Women’s Federation, the other day published a blog post that is interesting from a principled point of view, on whether welfare is something that can – or should be able to – be bought on a “welfare market”:

I share the view that better transparency is needed in the welfare companies. It would be excellent if we had both the principle of public access to official documents and whistle-blower protection for employees. But I do not share the view that increased transparency is a sufficient measure to solve the problems created in the competitive welfare markets.

My question is: should the quality of welfare services be determined by the individual’s ability to monitor the providers and their quality? Is it really the person who is best at choosing who should get the best school and the best care?

My view is that citizens in Sweden should be offered education, health care and social care of equal and good quality, regardless of the resources and competence the individual citizen possesses. Welfare is a social right, not a service whose quality should depend on how skilled and well-informed you are as a customer on the welfare market.

Probabilistic econometrics – science without foundations (part I)

21 February, 2012 at 15:31 | Posted in Statistics & Econometrics, Theory of Science & Methodology | 5 Comments

Modern probabilistic econometrics relies on the notion of probability. To be amenable to econometric analysis at all, economic observations allegedly have to be conceived of as random events.

But is it really necessary to model the economic system as a system where randomness can only be analyzed and understood when based on an a priori notion of probability?

 

Where do probabilities come from?

In probabilistic econometrics, events and observations are as a rule interpreted as random variables as if generated by an underlying probability density function, and a fortiori – since probability density functions are only definable in a probability context – consistent with a probability. As Haavelmo (1944:iii) has it:

For no tool developed in the theory of statistics has any meaning – except, perhaps, for descriptive purposes – without being referred to some stochastic scheme.

When attempting to convince us of the necessity of founding empirical economic analysis on probability models, Haavelmo – building largely on the earlier Fisherian paradigm – actually forces econometrics to (implicitly) interpret events as random variables generated by an underlying probability density function.

This is at odds with reality. Randomness obviously is a fact of the real world. Probability, on the other hand, attaches to the world via intellectually constructed models, and a fortiori is only a fact of a probability-generating machine, a well-constructed experimental arrangement, or a “chance set-up”.

Just as there is no such thing as a “free lunch,” there is no such thing as a “free probability.” To be able to talk about probabilities at all, you have to specify a model. If there is no chance set-up or model that generates the probabilistic outcomes or events – in statistics, any process you observe or measure is referred to as an experiment (rolling a die) and the results obtained as the outcomes or events of that experiment (the number of points rolled with the die, e.g. 3 or 5) – then, strictly speaking, there is no event at all.
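A minimal sketch (in Python, with a fair six-sided die as the assumed chance set-up) of what it means for an event to live inside a specified model: only because the model – six equally likely faces – has been specified can we assign a probability to the event “a 3 or a 5 is rolled” and compare it with simulated relative frequencies:

```python
# Sketch: probabilities presuppose a specified chance set-up.
# The assumed model here is a fair six-sided die.
import random

random.seed(42)
n = 100_000
rolls = [random.randint(1, 6) for _ in range(n)]

# Event: a 3 or a 5 is rolled. Under the specified model, P = 2/6.
freq = sum(r in (3, 5) for r in rolls) / n
print(f"simulated relative frequency = {freq:.4f}, model probability = {2/6:.4f}")
```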

Probability is a relational element. It always must come with a specification of the model from which it is calculated. And then to be of any empirical scientific value it has to be shown to coincide with (or at least converge to) real data generating processes or structures – something seldom or never done!

And this is the basic problem with economic data. If you have a fair roulette-wheel, you can arguably specify probabilities and probability density distributions. But how do you conceive of the analogous nomological machines for prices, gross domestic product, income distribution etc? Only by a leap of faith. And that does not suffice. You have to come up with some really good arguments if you want to persuade people into believing in the existence of socio-economic structures that generate data with characteristics conceivable as stochastic events portrayed by probabilistic density distributions!

From a realistic point of view we really have to admit that the socio-economic states of nature that we talk of in most social sciences – and certainly in econometrics – are not amenable to analysis in terms of probabilities, simply because in the real-world open systems that the social sciences – including econometrics – analyze, there are no probabilities to be had!

The processes that generate socio-economic data in the real world cannot just be assumed to always be adequately captured by a probability measure. And, so, it cannot really be maintained – as in the Haavelmo paradigm of probabilistic econometrics – that it even should be mandatory to treat observations and data – whether cross-section, time series or panel data – as events generated by some probability model. The important activities of most economic agents do not usually include throwing dice or spinning roulette-wheels. Data generating processes – at least outside of nomological machines like dice and roulette-wheels – are not self-evidently best modeled with probability measures.

If we agree on this, we also have to admit that probabilistic econometrics lacks a sound justification. I would even go further and argue that there really is no justifiable rationale at all for this belief that all economically relevant data can be adequately captured by a probability measure. In most real world contexts one has to argue one’s case. And that is obviously something seldom or never done by practitioners of probabilistic econometrics.

 

What is randomness?

Econometrics and probability are intermingled with randomness. But what is randomness?

In probabilistic econometrics it is often defined with the help of independent trials – two events are said to be independent if the occurrence or non-occurrence of either one has no effect on the probability of the occurrence of the other – such as drawing cards from a deck, picking balls from an urn, spinning a roulette wheel or tossing coins – trials which are only definable if somehow set in a probabilistic context.

But if we pick a sequence of prices – say 2, 4, 3, 8, 5, 6, 6 – that we want to use in an econometric regression analysis, how do we know that the sequence of prices is random, and hence can be treated as generated by an underlying probability density function? How can we argue that the sequence is a sequence of probabilistically independent random prices? And are they really random in the sense most often applied in probabilistic econometrics – where X is called a random variable only if there is a sample space S with a probability measure and X is a real-valued function over the elements of S?
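For reference, the two textbook definitions invoked above can be stated compactly (this is the standard notation, nothing specific to econometrics): two events are independent when the probability of their joint occurrence factorises, and a random variable is a real-valued function on a sample space equipped with a probability measure:

```latex
% Independence of two events A and B under a probability measure P:
P(A \cap B) \;=\; P(A)\,P(B)

% Random variable: a real-valued function on a sample space S
% that carries a probability measure P:
X : S \to \mathbb{R}
```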

Bypassing the scientific challenge of going from describable randomness to calculable probability by just assuming it is of course not an acceptable procedure. Since a probability density function is a “Gedanken” object that does not exist in a natural sense, it has to come with an export license to our real target system if it is to be considered usable.

Among those who at least honestly try to face the problem, the usual procedure is to refer to some artificial mechanism – operating in some “game of chance” of the kind mentioned above – that generates the sequence. But then we still have to show that the real sequence somehow coincides with the ideal sequence that defines independence and randomness within our – to speak with philosopher of science Nancy Cartwright (1999) – “nomological machine”, our chance set-up, our probabilistic model.

As the originator of the Kalman filter, Rudolf Kalman (1994:143), notes:

Not being able to test a sequence for ‘independent randomness’ (without being told how it was generated) is the same thing as accepting that reasoning about an “independent random sequence” is not operationally useful.

So why should we define randomness with probability? If we do, we have to accept that to speak of randomness we also have to presuppose the existence of nomological probability machines, since probabilities cannot be spoken of – and actually, to be strict, do not exist at all – without specifying such system-contexts (how many sides do the dice have, are the cards unmarked, etc.).

If we do adhere to the Fisher-Haavelmo paradigm of probabilistic econometrics we also have to assume that all noise in our data is probabilistic and that errors are well-behaved – something that is hard to argue for as a real phenomenon, rather than just an operationally and pragmatically tractable assumption.

Maybe Kalman’s (1994:147) verdict that

Haavelmo’s error that randomness = (conventional) probability is just another example of scientific prejudice

is, seen from this perspective, not far-fetched.

Accepting Haavelmo’s domain of probability theory and sample space of infinite populations – just as Fisher’s (1922:311) “hypothetical infinite population, of which the actual data are regarded as constituting a random sample”, von Mises’ “collective” or Gibbs’ “ensemble” – also implies that judgments are made on the basis of observations that are actually never made!
Infinitely repeated trials or samplings never take place in the real world. So that cannot be a sound inductive basis for a science with aspirations of explaining real-world socio-economic processes, structures or events. It’s not tenable.

As David Salsburg (2001:146) notes on probability theory:

[W]e assume there is an abstract space of elementary things called ‘events’ … If a measure on the abstract space of events fulfills certain axioms, then it is a probability. To use probability in real life, we have to identify this space of events and do so with sufficient specificity to allow us to actually calculate probability measurements on that space … Unless we can identify [this] abstract space, the probability statements that emerge from statistical analyses will have many different and sometimes contrary meanings.

Just like e.g. Keynes (1921) and Georgescu-Roegen (1971), Salsburg (2001:301f) is very critical of the way social scientists – including economists and econometricians – have uncritically and without argument come to simply assume that one can apply probability distributions from statistical theory to their own areas of research:

Probability is a measure of sets in an abstract space of events. All the mathematical properties of probability can be derived from this definition. When we wish to apply probability to real life, we need to identify that abstract space of events for the particular problem at hand … It is not well established when statistical methods are used for observational studies … If we cannot identify the space of events that generate the probabilities being calculated, then one model is no more valid than another … As statistical models are used more and more for observational studies to assist in social decisions by government and advocacy groups, this fundamental failure to be able to derive probabilities without ambiguity will cast doubt on the usefulness of these methods.

Some wise words that ought to be taken seriously by probabilistic econometricians are also given by the mathematical statistician Gunnar Blom (2004:389):

If the demands for randomness are not at all fulfilled, you only bring damage to your analysis using statistical methods. The analysis gets an air of science around it, that it does not at all deserve.

Richard von Mises (1957:103) noted that

Probabilities exist only in collectives … This idea, which is a deliberate restriction of the calculus of probabilities to the investigation of relations between distributions, has not been clearly carried through in any of the former theories of probability.

And obviously not in Haavelmo’s paradigm of probabilistic econometrics either. It would have been better if one had heeded von Mises’ warning (1957:172) that

the field of application of the theory of errors should not be extended too far.

Importantly, this also means that if you cannot show that the data satisfy all the conditions of the probabilistic nomological machine – including e.g. the distribution of the deviations corresponding to a normal curve – then the statistical inferences used lack sound foundations! And this really is the basis of the argument put forward in this essay – probabilistic econometrics lacks sound foundations.


References

Blom, Gunnar et al (2004), Sannolikhetsteori och statistikteori med tillämpningar. Lund: Studentlitteratur.

Cartwright, Nancy (1999), The Dappled World. Cambridge: Cambridge University Press.

Fisher, Ronald (1922), On the mathematical foundations of theoretical statistics. Philosophical Transactions of The Royal Society A, 222.

Georgescu-Roegen, Nicholas (1971), The Entropy Law and the Economic Process. Harvard University Press.

Haavelmo, Trygve (1944), The probability approach in econometrics. Supplement to Econometrica 12:1-115.

Kalman, Rudolf (1994), Randomness Reexamined. Modeling, Identification and Control 3:141-151.

Keynes, John Maynard (1973 (1921)), A Treatise on Probability. Volume VIII of The Collected Writings of John Maynard Keynes, London: Macmillan.

Pålsson Syll, Lars (2007), John Maynard Keynes. Stockholm: SNS Förlag.

Salsburg, David (2001), The Lady Tasting Tea. Henry Holt.

von Mises, Richard (1957), Probability, Statistics and Truth. New York: Dover Publications.

Of course she was a “damn hag”

19 February, 2012 at 10:47 | Posted in Varia | 2 Comments

Mikael Wiehe could not quite contain himself on Nyhetsmorgon today. At the thought of the Iron Lady’s infamous “There is no such thing as society” it boiled over, and he simply had to add – “Damn hag”.

And of course he is also right on the substance. For what else can one think of Margaret Thatcher, the embracer of dictators? Possibly Wiehe also had other things in mind. Such as Thatcher’s undisguised readiness to look the other way when it came to apartheid in South Africa and Pinochet’s Chile. Or that she advocated the reintroduction of capital punishment. Or the introduction of legislation forbidding public authorities from portraying homosexuality in a positive light. Or that during her eleven years in power the super-lady appointed a single female cabinet minister. Or her hate-filled fight against the unions.

Wiehe has never really been a musical favourite of yours truly, but today I am happy to listen to him for a while:

Randomness and ergodic theory in economics – what went wrong?

18 February, 2012 at 12:23 | Posted in Statistics & Econometrics, Theory of Science & Methodology | 6 Comments

Ergodicity is a difficult concept that many students of economics have problems understanding. In the very instructive video below, Ole Peters – from the Department of Mathematics at Imperial College London – gives an admirably simplified and pedagogical exposition of what it means for probability structures of stationary processes and ensembles to be ergodic. Using a progression of simulated coin flips, his example shows the all-important difference between time averages and ensemble averages for these kinds of processes:
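For readers who prefer formulas, the two averages can be written down explicitly (this is the standard notation for a stochastic process, not Peters’ own): the ensemble average runs over many realisations at a fixed time, the time average follows one realisation over a long time span, and the process is called ergodic when the two coincide:

```latex
% Ensemble average: average over N parallel realisations x_i at a fixed time t
\langle x(t) \rangle \;=\; \lim_{N \to \infty} \frac{1}{N} \sum_{i=1}^{N} x_i(t)

% Time average: average of a single realisation x_i over a long time span T
\bar{x}_i \;=\; \lim_{T \to \infty} \frac{1}{T} \int_{0}^{T} x_i(t)\, dt

% Ergodicity: the two averages coincide
\langle x \rangle \;=\; \bar{x}
```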

To understand real world ”non-routine” decisions and unforeseeable changes in behaviour, ergodic probability distributions are of no avail. In a world full of genuine uncertainty – where real historical time rules the roost – the probabilities that ruled the past are not those that will rule the future.

When we cannot accept that the observations, along the time-series available to us, are independent … we have, in strict logic, no more than one observation, all of the separate items having to be taken together. For the analysis of that the probability calculus is useless; it does not apply … I am bold enough to conclude, from these considerations that the usefulness of ‘statistical’ or ‘stochastic’ methods in economics is a good deal less than is now conventionally supposed … We should always ask ourselves, before we apply them, whether they are appropriate to the problem in hand. Very often they are not … The probability calculus is no excuse for forgetfulness.

John Hicks, Causality in Economics, 1979:121

Time is what prevents everything from happening at once. To simply assume that economic processes are ergodic and concentrate on ensemble averages – and a fortiori, in any relevant sense, timeless – is not a sensible way of dealing with the kind of genuine uncertainty that permeates open systems such as economies.

Added I: Some of my readers have asked why the difference between ensemble and time averages is of importance. Well, basically, because when you assume the processes to be ergodic, ensemble and time averages are identical. Let me give an example even simpler than the one Peters gives:

Assume we have a market with an asset priced at 100 €. Then imagine the price first goes up by 50% and then later falls by 50%. The ensemble average for this asset would be 100 € – because we here envision two parallel universes (markets) where the asset price falls in one universe (market) by 50% to 50 €, and in another universe (market) goes up by 50% to 150 €, giving an average of 100 € ((150+50)/2). The time average for this asset would be 75 € – because we here envision one universe (market) where the asset price first rises by 50% to 150 €, and then falls by 50% to 75 € (0.5*150).

From the ensemble perspective nothing really, on average, happens. From the time perspective lots of things really, on average, happen.

Assuming ergodicity there would have been no difference at all.
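A few lines of Python make the difference tangible. The sketch below simulates the ±50% gamble from the example above – it is only an illustration of the idea, not Peters’ own code: averaging the gross return across many parallel markets gives about 1.0 per period (the ensemble perspective), while the growth factor along one long price path is the geometric mean, roughly 0.87 per period (the time perspective):

```python
# Sketch: ensemble vs. time perspective for an asset that each period
# either rises 50% or falls 50% with equal probability.
import numpy as np

rng = np.random.default_rng(0)

# Ensemble perspective: average gross return across many parallel markets.
n_markets = 1_000_000
one_period = rng.choice([1.5, 0.5], size=n_markets)
print(f"ensemble average gross return per period: {one_period.mean():.3f}")  # about 1.0

# Time perspective: growth factor per period along one long price path
# (the geometric mean of the returns, about sqrt(1.5 * 0.5) = 0.866).
n_periods = 100_000
path = rng.choice([1.5, 0.5], size=n_periods)
growth = np.exp(np.log(path).mean())
print(f"time-average growth factor along one path:  {growth:.3f}")
```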

Added II: Just in case you think this is just an academic quibble without repercussions for our real lives, let me quote from an article by Peters in the Santa Fe Institute Bulletin from 2009 – On Time and Risk – which makes it perfectly clear that the flaw of thinking about uncertainty in terms of “rational expectations” and ensemble averages has had real repercussions for the functioning of the financial system:

In an investment context, the difference between ensemble averages and time averages is often small. It becomes important, however, when risks increase, when correlation hinders diversification, when leverage pumps up fluctuations, when money is made cheap, when capital requirements are relaxed. If reward structures—such as bonuses that reward gains but don’t punish losses, and also certain commission schemes—provide incentives for excessive risk, problems arise. This is especially true if the only limits to risk-taking derive from utility functions that express risk preference, instead of the objective argument of time irreversibility. In other words, using the ensemble average without sufficiently restrictive utility functions will lead to excessive risk-taking and eventual collapse. Sound familiar?

Added III: Still having problems understanding the ergodicity concept? Let me cite one last example that hopefully will make the concept more accessible on an intuitive level:

Why are election polls often inaccurate? Why is racism wrong? Why are your assumptions often mistaken? The answers to all these questions and to many others have a lot to do with the non-ergodicity of human ensembles. Many scientists agree that ergodicity is one of the most important concepts in statistics. So, what is it?

Suppose you are concerned with determining what the most visited parks in a city are. One idea is to take a momentary snapshot: to see how many people are this moment in park A, how many are in park B and so on. Another idea is to look at one individual (or few of them) and to follow him for a certain period of time, e.g. a year. Then, you observe how often the individual is going to park A, how often he is going to park B and so on.

Thus, you obtain two different results: one statistical analysis over the entire ensemble of people at a certain moment in time, and one statistical analysis for one person over a certain period of time. The first one may not be representative for a longer period of time, while the second one may not be representative for all the people. The idea is that an ensemble is ergodic if the two types of statistics give the same result. Many ensembles, like the human populations, are not ergodic.

Yours truly goes international

18 February, 2012 at 09:55 | Posted in Economics, Theory of Science & Methodology | Comments Off on Yours truly goes international

Last Tuesday yours truly posted an article here on the blog about one of the cornerstones of modern neoclassical macroeconomic theory – the rational expectations hypothesis. The article – David K. Levine is totally wrong on the rational expectations hypothesis – attracted international attention and was published the day before yesterday in the Real-World Economics Review. For those with an interest in economic theory (yes, believe it or not, such people exist) the ensuing discussion can be followed here.

Social Democratic own goal after own goal in the welfare debate

18 February, 2012 at 09:05 | Posted in Politics & Society | Comments Off on Social Democratic own goal after own goal in the welfare debate

For-profit companies in the health care and school sectors have been much discussed over the past six months. Many are rightly upset, and one might think that the Social Democratic opposition – not least against the background of the Carema scandals – had a golden opportunity here to state clearly and plainly that it now wants to eliminate the possibility for profit-driven companies to operate in health care, elderly care and schools.

But that is not what happens. The party keeps wobbling instead. At a press conference yesterday, party secretary Carin Jämtin said:

We want to give the municipalities the option of favouring non-profit company forms. We want to give the municipalities that power.

So municipalities and county councils are themselves to decide whether welfare services are to be run with or without profits on our tax money.

How competent the municipalities are at handling that kind of question we have seen all too clearly in the school system, where municipal responsibility has had utterly devastating effects.

Tax-financed for-profit companies will thus be allowed to remain in these sectors. It would have been better if the party, instead of this loose and non-committal talk, had clearly and plainly declared that profit-driven companies in health care, elderly care and schools have no place in a Swedish welfare society.

Many who work in schools or in the health care sector have found it hard to understand the Social Democrats’ attitude to privatisation and profit extraction in the soft welfare sector. For some unfathomable reason, leading Social Democrats have for many years argued that profits should be allowed in schools and care companies. The argument has often been that the form of operation does not matter. That is not the case. The form of operation, and allowing profits in welfare, certainly do matter. And the effect is negative.

The Social Democrats are admittedly far from alone in their dithering. From the other side, the Confederation of Swedish Enterprise and the country’s editorial writers supply a steady stream of demands for more control, tougher scrutiny and inspections.

But wait a minute! When the system shift in the welfare sector was launched in the 1990s, wasn’t one of the most common arguments for privatisation precisely that we would be spared the costs of bureaucratic logic in the form of regulations, controls and follow-ups? Competition – that panacea of market fundamentalism – was supposed to make operations more efficient and raise the quality of the services. Market logic would force out the “bureaucratic” and cumbersome public providers, leaving only the good companies that “freedom of choice” had made possible.

And now, when the Panglossian privatisation pipe dream turns out to be a nightmare, the very thing we wanted to get rid of – regulations and “bureaucratic” supervision and control – is supposed to be the solution?

You clutch your forehead – and for many reasons!

Because if the proposed packages of measures are to be implemented, one wonders what happens to that efficiency gain. Controls, contract specifications, inspections and so on cost money, and how much surplus will the privatisations then yield once these costs are also included in the cost-benefit analysis? And how much is that “freedom of choice” worth when we see, time and again, that it only results in operations where profits are generated through cost-cutting and lower quality?

All forms of economic activity are based on, or involve, some form of delegation. One party (the principal, the commissioner, the purchaser) wants another party (the agent, the contractor, the provider) to perform a certain task. The basic problem is how the purchaser gets the provider to carry out the assignment in the way the purchaser wishes …

There is an obvious danger in basing remuneration systems on simple objective measures when what we want to reward in fact has several complex dimensions – for example, payment per discharged patient, or teachers’ pay tied to grades. Municipal services often have this “multi-task” character, and then incentive contracts or commissions often do not work. In such cases “bureaucracies” may be more fit for purpose than markets …

Efficient use of resources can never be a goal in itself. It can, however, be a necessary means of achieving stated goals. The existence of the welfare state is therefore fundamentally not just a question of economic efficiency, but also of our conceptions of a dignified life, justice and equal treatment.

Lars Pålsson Syll et al, Vad bör kommunerna göra? (Jönköping University Press, 2002)

So the fundamental question is not whether tax-financed private companies should be allowed to extract profits, or whether tougher measures in the form of control and inspection are needed. The fundamental question is whether it is the logic of the market and privatisation that should govern our welfare institutions, or the logic of “bureaucracy”. The fundamental question is whether the common welfare sector should be governed by democracy and politics, or by the market.

Why Jan Björklund’s school policy does not deliver!

17 February, 2012 at 15:06 | Posted in Education & School | 5 Comments

Utredarna.nu offers some thought-provoking reflections on why Jan Björklund’s school policy does not yet seem to have produced any obvious positive results:

Minister for Education Jan Björklund has been responsible for school policy for almost six years and thus bears considerable responsibility for the consequences of the school reforms. But according to the minister it is too early to evaluate the reforms. He likes to say that “it is like turning an ocean liner”. But according to [the McKinsey report] it is possible to achieve changes in school results at the national level in six years or less. In other words, it will soon be time to deliver.

What we know at present is that the central ideas of the government’s school policy do not rest on any scientific or experiential foundation. We know that Sweden is falling in international comparisons and that the differences between Swedish schools and pupils are increasing. We know that pupils’ socio-economic background has an ever greater impact on school results and school choice. We know that the skewed recruitment to higher education persists and may even be growing stronger. We know that teachers’ administrative burden has increased and that the number of work-environment alarms has doubled in five years. We know that fewer and fewer new students are admitted to teacher training.

Nothing, in other words, suggests that the reforms are producing the desired results. School policy is fumbling in the dark. If it does not take off its blinkers, it will hit the wall.

To this one can add that a teacher-training system where the number of first-choice applicants is as catastrophically low as 1.2 per place – as it now is in Sweden – is hardly optimal for finding the best teachers. Rather, it shows that Sweden is approaching a national teacher crisis. The main underlying reason is the shamefully low teacher salaries, which have helped make interest among the best students in entering teacher training today close to zero.

If the teaching profession is to become attractive to the best students, salaries must rise substantially. And teachers must be allowed to concentrate on doing what they are primarily meant to do – teach! Without these changes, all the reforms the Swedish school system is now undergoing risk being completely ineffective.

The (Euro)pean unity

17 February, 2012 at 13:46 | Posted in Economics | Comments Off on The (Euro)pean unity

Rational expectations and ergodicity

17 February, 2012 at 11:10 | Posted in Statistics & Econometrics, Theory of Science & Methodology | 1 Comment

Ergodicity is a difficult concept that many students of econom(etr)ics have problems understanding. Trying to explain it, you often find yourself getting lost in mathematical-statistical subtleties difficult for most students to grasp.

In What is ergodicity? Vlad Tarko has made an admirably simplified and pedagogical exposition of what it means for probability structures of stationary processes and ensembles to be ergodic.

After reading that you can go to – tougher reading, but still fully accessible – Paul Davidson’s superb articles Rational expectations: a fallacious foundation for studying crucial decision-making processes and Can future systemic financial risks be quantified? Ergodic vs nonergodic stochastic processes.

Then ask yourself: how can anyone take the rational expectations hypothesis seriously?

As Sir John Hicks had it already in Economic Perspectives (1977, vii):

One must assume that people in one’s models do not know what is going to happen, and know that they do not know what is going to happen. As in history.

And yet neoclassical economists still abstract from historical time and uncertainty, willfully ignoring that agents have different ways of approaching and coping with problems of non-routine decision-making and change in the absence of perfect knowledge, and that their heterogeneous expectations lead them to make different choices than the ergodic-rational-expectations models suggest.

The conclusion has to be – as in Roman Frydman’s and Michael Goldberg’s The Imperfect Knowledge Imperative in Modern Macroeconomics and Finance Theory – that the rational expectations narrative

has no foundations.

If macroeconomics has to have microeconomic foundations, they have to be found elsewhere!

Mortgage rates do not follow the cuts – the banks cash in on the Riksbank’s rate reductions

17 February, 2012 at 08:37 | Posted in Economics | Comments Off on Mortgage rates do not follow the cuts – the banks cash in on the Riksbank’s rate reductions

The other day we could read in SvD Näringsliv how the banks are sharply increasing payouts to shareholders while mortgage customers pay more as the banks widen their mortgage margins. In a comment in today’s SvD Näringsliv, Swedbank’s head of communications says that

We have to charge a fair price

Hmm … 

Let us look at the interest rate gap (the difference between the Riksbank’s repo rate and Swedbank’s 3-month mortgage rate):

The banks have thus seen their costs fall. The rate gap has exploded. And Swedbank’s head of communications talks about fair pricing. You clutch your forehead! How stupid do these bank people really think we are?

A gloomy future for the Swedish labour market – redundancy notices up 65%

16 February, 2012 at 13:19 | Posted in Economics | Comments Off on A gloomy future for the Swedish labour market – redundancy notices up 65%

Redundancy notices now come thick and fast on the Swedish labour market. In January, 6,600 people were given notice – an increase of 65% compared with the same month last year.

As if this were not enough, fresh statistics also show that the employment rate is falling and that the number of new registrations at the Public Employment Service is rising. The number of people who are unemployed or in labour market programmes will soon reach half a million.

With unemployment well on its way towards 8% and youth unemployment parked above 20%, it seems that soon the government will be the only group not at risk of unemployment in the near future. If it fails to change this, however, its members will probably have to look for new jobs after the 2014 election.

No housing price surge? Nordea’s chief economist Annika Winsth seems to have no clue!

15 February, 2012 at 08:10 | Posted in Economics | 7 Comments

In today’s Svenska Dagbladet, Nordea’s chief economist Annika Winsth claims that the rise in housing prices over the past decades is not at all worrying:

One should remember that the price increase comes from very low levels and after a sharp fall at the beginning of the 1990s.

Hmm … 

After looking at the chart below, it is hard, to say the least, to believe that this is the case:

Source: Statistics Sweden (SCB) and own calculations

Interestingly, a few pages further on in SvD Näringsliv we can read about how the banks are sharply increasing payouts to shareholders while mortgage customers pay more as the banks widen their mortgage margins. Doesn’t one get an uneasy feeling that this kind of disinformation is about certain market actors having vested interests in inflated housing bubbles?

David K Levine is totally wrong on the rational expectations hypothesis

14 February, 2012 at 17:04 | Posted in Economics | 1 Comment

In the wake of the latest financial crisis many people have come to wonder why economists never have been able to predict these manias, panics and crashes that haunt our economies.

In responding to these warranted wonderings, some economists – like renowned theoretical economist David K Levine in the article Why Economists Are Right: Rational Expectations and the Uncertainty Principle in Economics in the Huffington Post – have maintained that

it is a fundamental principle that there can be no reliable way of predicting a crisis.

To me this is a totally inadequate answer. And trying to make a virtue of one’s own science’s inability to answer legitimate questions is proof of a rather arrogant and insulting attitude.

The main reason Levine gives for his view is what he calls “the uncertainty principle in economics” and the “theory of rational expectations”:

In simple language what rational expectations means is ‘if people believe this forecast it will be true.’ By contrast if a theory is not one of rational expectations it means ‘if people believe this forecast it will not be true.’ Obviously such a theory has limited usefulness. Or put differently: if there is a correct theory, eventually most people will believe it, so it must necessarily be rational expectations. Any other theory has the property that people must forever disbelieve the theory regardless of overwhelming evidence — for as soon as the theory is believed it is wrong.

So does the crisis prove that rational expectations and rational behavior are bad assumptions for formulating economic policy? Perhaps we should turn to behavioral models of irrationality in understanding how to deal with the housing market crash or the Greek economic crisis? Such an alternative would have us build on foundations of sand. It would have us create economic policies and institutions with the property that as soon as they were properly understood they would cease to function.

These are rather preposterous allegations. To my knowledge, there is nobody among us economists who really advocates constructing models based on irrational expectations. And very few of us are unaware of the effects that economic theory can have on the behaviour of economic actors.

So, to put it bluntly, Levine has totally failed to give a fair view of the state of play among contemporary economists on the issue of rational expectations. Let me try to sort it out just a little.

 

Rational expectations – a concept with a history

The concept of rational expectations was first developed by John Muth (1961) and later applied to macroeconomics by Robert Lucas (1972). In this way the concept of uncertainty as developed by Keynes (1921) and Knight (1921) was turned into a concept of quantifiable risk in the hands of neoclassical economics.

Muth (1961:316) framed his rational expectations hypothesis (REH) in terms of probability distributions:

Expectations of firms (or, more generally, the subjective probability distribution of outcomes) tend to be distributed, for the same information set, about the prediction of the theory (or the “objective” probability distributions of outcomes).
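In more formal terms, the hypothesis is often summarised (this is the standard textbook rendering, not Muth’s own notation) as the requirement that agents’ subjective expectations coincide with the objective conditional expectation implied by the model itself:

```latex
% Standard statement of the rational expectations hypothesis:
% the subjectively expected value of a variable P_t (e.g. a price),
% formed on the information set I_{t-1}, equals the model's
% objective conditional expectation.
P^{e}_{t} \;=\; \mathrm{E}\left[\, P_{t} \mid I_{t-1} \,\right]
```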

But Muth (1961:317) was also very open with the non-descriptive character of his concept:

[The hypothesis of rational expectations] does not assert that the scratch work of entrepreneurs resembles the system of equations in any way; nor does it state that predictions of entrepreneurs are perfect or that their expectations are all the same.

To Muth its main usefulness was its generality and ability to be applicable to all sorts of situations irrespective of the concrete and contingent circumstances at hand.

While Muth’s concept was later picked up by new classical macroeconomics in the hands of people like Robert Lucas and Eugene Fama, most of us thought it such a patently ridiculous idea that we had problems really taking it seriously.

It is noteworthy that Lucas (1972) did not give any further justifications for REH, but simply applied it to macroeconomics. In the hands of Lucas and Sargent it was used to argue that government could not really influence the behavior of economic agents in any systematic way. In the 1980s it became a dominant model-assumption in the New Classical Macroeconomic models and has continued to be a standard assumption made in many neoclassical (macro)economic models – most notably in the fields of (real) business cycles and finance (being a cornerstone in the “efficient market hypothesis”).

 

Keynes, genuine uncertainty and ergodicity

REH basically says that people on the average hold expectations that will be fulfilled. This makes the economist’s analysis enormously simplistic, since it means that the model used by the economist is the same as the one people use to make decisions and forecasts of the future.

The REH view is very different from the one we connect with John Maynard Keynes. According to Keynes (1937:113) we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but rational expectations. Sometimes we “simply do not know.”

Keynes would not have accepted Muth’s view that expectations “tend to be distributed, for the same information set, about the prediction of the theory.” Keynes, rather, thinks that we base our expectations on the confidence or “weight” we put on different events and alternatives. To Keynes expectations are a question of weighing probabilities by “degrees of belief”, beliefs that have precious little to do with the kind of stochastic probabilistic calculations made by the rational expectations agents modeled by Lucas et consortes.

In the real world, set in non-ergodic historical time, the future is to a large extent unknowable and uncertain.

REH only applies to ergodic – stable and stationary stochastic – processes. Economies in the real world are nothing of the kind. If the world were ruled by ergodic processes – a possibility utterly incompatible with the views of Keynes – people could perhaps have rational expectations, but no convincing arguments have ever been put forward for this assumption being realistic – and this goes for Levine too.

REH holds the view that people, on average, have the same expectations. Keynes, on the other hand, argued convincingly that people often have different expectations and information, which constitutes the basic rationale behind the macroeconomic need for coordination. This is rather swept under the rug by the extreme simple-mindedness of assuming rational expectations in representative-actor models, so much in vogue in New Classical economics. But if all actors are alike, why do they transact? Who do they transact with? The very reason for markets and exchange seems to slip away with the sister assumptions of representative actors and rational expectations.

 

Mathematical tractability is not enough

In my view it is an enormous waste of intellectual power to build these kinds of models based on useless theories. Their marginal utility has long since turned negative. That people still, more or less mindlessly, keep doing this is a sign of incredible intellectual hubris.

It would be far better to admit that we simply do not know about lots of different things, and that we should try to do as well as possible given this, rather than look the other way and pretend that we are all-knowing rational calculators.

Models based on REH impute beliefs to the agents that are not based on any real informational considerations, but are simply stipulated to make the models mathematically and statistically tractable.

Of course you can make assumptions based on tractability, but then you do also have to take into account the necessary trade-off in terms of the ability to make relevant and valid statements on the intended target system.

Mathematical tractability cannot be the ultimate arbiter in science when it comes to modeling real world target systems. Of course, one could perhaps accept REH if it had produced lots of verified predictions and good explanations. But it has done nothing of the kind. Therefore the burden of proof is on those, like Levine, who still want to use models built on ridiculously unreal assumptions – models devoid of all empirical interest.

In reality, REH is a rather harmful modeling assumption, since it contributes to perpetuating the ongoing transformation of economics into a kind of science-fiction economics. If economics is to guide us, help us make forecasts, and explain or help us better understand real-world phenomena, REH-based models are in fact next to worthless.

 

Learning and information

REH presupposes – basically for reasons of consistency – that agents have complete knowledge of all of the relevant probability distribution functions. And when trying to incorporate learning in these models – trying to take the heat off some of the criticism launched against them to date – it is always a very restricted kind of learning that is considered: a learning where truly unanticipated, surprising, new things never take place, but only rather mechanical updatings – increasing the precision of already existing information sets – of existing probability functions.

Nothing really new happens in these ergodic models, where the statistical representation of learning and information is nothing more than a caricature of what takes place in the real world target system. This follows from taking for granted that people’s decisions can be portrayed as based on an existing probability distribution, which by definition implies the knowledge of every possible event (otherwise it is in a strict mathematical-statistically sense not really a probability distribution) that can be thought of taking place.

But in the real world it is – as shown again and again by behavioural and experimental economics – common to mistake a conditional distribution for a probability distribution. These are mistakes that are impossible to make in the kinds of economic analysis – built on REH – of which Levine is such an adamant advocate. On average REH agents are always correct. But truly new information will not only reduce the estimation error but actually change the entire estimation and hence possibly the decisions made. To be truly new, information has to be unexpected. If not, it would simply be inferred from the already existing information set.

In REH models new information is typically presented as something only reducing the variance of the parameter estimated. But if new information means truly new information it actually could increase our uncertainty and variance (information set (A, B) => (A, B, C)).
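A tiny numerical sketch of the point (the payoff numbers are invented purely for illustration): start from an information set containing two outcomes A and B, then let a genuinely new outcome C enter – the variance of the outcome variable increases rather than decreases:

```python
# Sketch: genuinely new information can increase variance.
# Hypothetical payoffs, equally likely within each information set.
import numpy as np

outcomes_AB = np.array([10.0, 20.0])           # information set {A, B}
outcomes_ABC = np.array([10.0, 20.0, 100.0])   # a genuinely new outcome C appears

print("variance with {A, B}:   ", np.var(outcomes_AB))    # 25.0
print("variance with {A, B, C}:", np.var(outcomes_ABC))   # about 1622
```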

Truly new information gives birth to new probabilities, revised plans and decisions – something REH cannot account for with its finite sampling representation of incomplete information.

In the world of REH, learning is like getting better and better at reciting the complete works of Shakespeare by heart – or at hitting the bull’s eye when playing darts. It presupposes that we have a complete list of the possible states of the world and that by definition mistakes are non-systematic (which, strictly seen, follows from the assumption of “subjective” probability distributions being equal to the “objective” probability distribution). This is a rather uninteresting and trivial kind of learning. It is a closed-world kind of learning, synonymous with improving one’s adaptation to a world which is fundamentally unchanging. But in real, open-world situations, learning is more often about adapting and trying to cope with genuinely new phenomena.

REH presumes consistent behaviour, where expectations do not display any persistent errors. In the world of REH we are always, on average, hitting the bull’s eye. In the more realistic, open systems view, there is always the possibility (danger) of making mistakes that may turn out to be systematic. It is because of this, presumably, that we put so much emphasis on learning in our modern knowledge societies.

 

On risk, uncertainty and probability distributions

REH assumes that the expectations based on “objective” probabilities are the same as the “subjective” probabilities that agents themselves form on uncertain events. It treats risk and uncertainty as equivalent entities.

But in the real world, it is not possible to just assume that probability distributions are the right way to characterize, understand or explain acts and decisions made under uncertainty. When we simply do not know, when we have not got a clue, when genuine uncertainty prevails, REH simply will not do. In those circumstances it is not a useful assumption. The reason is that under those circumstances the future is not like the past, and hence we cannot use the same probability distribution – if it exists at all – to describe both the past and the future.

There simply is no guarantee that probabilities at time 1 are the same as those at time 2. So when REH assumes that the parameter values on average are the same for the future and the past, one is – as Roman Frydman and Michael Goldberg (2007) forcefully argue – not really talking about uncertainty, but rather knowledge. But this implies that what we observe are realizations of pure stochastic processes, something, if we want to maintain this view, we really have to argue for.

In physics it may possibly not be straining credulity too much to model processes as ergodic – where time and history do not really matter – but in the social and historical sciences it is obviously ridiculous. If societies and economies were ergodic worlds, why do econometricians fervently discuss things such as structural breaks and regime shifts? That they do is an indication of how unrealistic it is to treat open systems as analyzable with ergodic concepts.

The future is not reducible to a known set of prospects. It is not like sitting at the roulette table and calculating what the future outcomes of spinning the wheel will be.

We have to surpass REH and try to build economics on a more realistic foundation. A foundation that encompasses both ergodic and non-ergodic processes, both risk and genuine uncertainty. Reading Levine one comes to think of Robert Clower’s (1989:23) apt remark that

much economics is so far removed from anything that remotely resembles the real world that it’s often difficult for economists to take their own subject seriously.

 

Where is the evidence?

Instead of assuming REH to be right, one ought to confront the hypothesis with the available evidence. It is not enough to construct models. Anyone can construct models. To be seriously interesting, models have to come with an aim. They have to have an intended use. If the intention of REH is to help us explain real economies, it has to be evaluated from that perspective. A model or hypothesis without a specific applicability is not really deserving of our interest.

To say, as Prescott (1977:30) does, that

one can only test if some theory, whether it incorporates rational expectations or, for that matter, irrational expectations, is or is not consistent with observations

is not enough. Without strong evidence all kinds of absurd claims and nonsense may pretend to be science. We have to demand more of a justification than this rather watered-down version of “anything goes” when it comes to rationality postulates. If one proposes REH one also has to support its underlying assumptions. None is given, which makes it rather puzzling how REH has become the standard modeling assumption made in much of modern macroeconomics. Perhaps the reason is, as Paul Krugman (2009) has it, that economists often mistake

beauty, clad in impressive looking mathematics, for truth.

But I think Prescott’s view is also the reason why REH economists are not particularly interested in empirical examinations of how real choices (cf Levine’s rather derogatory remarks on experimental and behavioural economics) and decisions are made in real economies. In the hands of Lucas et consortes REH has been transformed from an – in principle – testable hypothesis to an irrefutable proposition.

 

Rational expectations, the future and the end of history

REH basically assumes that all learning has already taken place. This is extremely difficult to envision in reality, because it means that history has come to an end. When did that happen? It is indeed a remarkable assumption, since in our daily lives most of us experience continuing learning. It may be a tractable assumption, yes. But helpful for understanding real-world economies? I’ll be dipped! REH models are not useful as-if representations of real-world target systems.

REH builds on Savage’s (1954) “sure thing principle,” according to which people never make systematic mistakes. They may “tremble” now and then, but on average, they always make the right, the rational, decision.

In REH agents know all possible outcomes. In reality, many of those outcomes are yet to be originated. The future is not about known probability distributions. It is not about picking the right ball from an urn. It is about new possibilities. It is about inventing new balls and new urns to put them in. If so, even if we learn, uncertainty does not go away. As G L S Shackle (1972:102) argued, the future

waits, not for its contents to be discovered, but for that content to be originated.

As shown already by Davidson (1983), REH implies that relevant distributions have to be time independent (which follows from the ergodicity implied by REH). But this amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality it strains credulity to try to represent economies as outcomes of stochastic processes (cf my critique of probabilistic econometrics in the tradition of Haavelmo in Pålsson Syll (2010)). An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time.
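For readers who want the condition spelled out, a minimal statement of ergodicity in the mean is that the time average of a single realization converges to the ensemble average:

```latex
% Ergodicity in the mean: the time average of one realization converges
% to the ensemble (cross-realization) expectation,
\lim_{T \to \infty} \frac{1}{T}\sum_{t=1}^{T} x_t \;=\; \mathrm{E}[x_t].
% A simple counterexample is the random walk x_t = x_{t-1} + \varepsilon_t,
% whose variance grows with t, so there is no time-invariant ensemble moment
% for the time average to converge to.
```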

In REH we are never disappointed in any other way than when we lose at the roulette wheel, since “averages of expectations are accurate” (Muth 1961:316). But real life is not an urn or a roulette wheel, so REH is a vastly misleading analogy of real-world situations. It may be a useful assumption – but only for non-crucial and non-important decisions that are possible to replicate perfectly (a throw of dice, a spin of the roulette wheel, etc.).
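For reference, Muth’s (1961) hypothesis can be written as the condition that subjective expectations coincide with the model’s own conditional mathematical expectation, so that forecast errors are unsystematic:

```latex
% Muth's rational expectations condition: the subjective expectation equals
% the objective conditional expectation given the information set \Omega_{t-1},
P_t^{e} = \mathrm{E}\!\left[P_t \mid \Omega_{t-1}\right],
\qquad
\mathrm{E}\!\left[P_t - P_t^{e} \mid \Omega_{t-1}\right] = 0 .
% Forecast errors then have mean zero and are orthogonal to everything known
% at t-1, which is the precise sense in which "averages of expectations are
% accurate" and agents never err systematically.
```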

 

REH and modeling aspirations of Nirvana

REH comes from the belief that to be scientific, economics has to be able to model individuals and markets in a stochastic-deterministic way. It’s like treating individuals and markets as the celestial bodies studied by astronomers with the help of gravitational laws. Unfortunately, individuals, markets and entire economies are not planets moving in predetermined orbits in the sky.

To deliver, REH has to constrain expectations on the individual and the aggregate level to be the same. If revisions of expectations take place in REH models, they typically have to take place in a known and pre-specified way. This squares badly with what we know to be true in the real world, where fully specified trajectories of future expectations revisions are non-existent.

Most REH models are time-invariant and so leave no room for any changes in expectations and their revisions. The only imperfection of knowledge they admit is relegated to the error terms – error terms that are assumed to be additive and to have a given and known frequency distribution, so that the REH models can still fully pre-specify the future even when incorporating these stochastic variables into the models.
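A stylised example of what such a fully pre-specified model looks like – a generic linear expectational difference equation, not any particular author’s model – is the following:

```latex
% A generic linear REH model: today's outcome depends on the rationally
% expected outcome tomorrow plus an additive shock with a known distribution,
x_t = a\,\mathrm{E}_t[x_{t+1}] + \varepsilon_t,
\qquad \varepsilon_t \sim \text{i.i.d.}\ (0,\sigma^2),\ \ |a|<1 .
% Solving forward, the unique bounded solution is x_t = \varepsilon_t
% (the future shocks are unforecastable), so once the time-invariant
% distribution of \varepsilon_t is specified the whole future is pinned down.
```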

 

Aggregation and representative-actor models

In the real world there are many different expectations, and these cannot be aggregated in REH models without giving rise to inconsistency (acknowledged by Lucas (1995:225) himself). This is one of the main reasons why REH models are cast as representative-actor models. But this is far from being a harmless approximation to reality (cf Pålsson Syll (2010)). Even the smallest differences of expectations between agents would make REH models inconsistent, so when they still show up they have to be branded “irrational”.

It is not possible to adequately represent individuals and markets as having one single overarching probability distribution. Accepting that does not imply – as Levine seems to think – that we have to end all theoretical endeavours and assume that all agents always act totally irrationally and are only analyzable within behavioural economics. Far from it. It means we acknowledge diversity and imperfection, and that economic theory has to be able to incorporate these empirical facts in its models.

Incompatibility between actual behavior and REH behavior is not a symptom of “irrationality”. It rather shows the futility of trying to represent real-world target systems with models flagrantly at odds with reality.

 

Conclusion

Levine maintains that “the only robust policies and institutions – ones that we may hope to withstand the test of time – are those based on rational expectations – those that once understood will continue to function.” As I hope I have been able to show, there is really no support for this conviction at all. On the contrary. If we want to have anything of interest to say on real economies, financial crisis and the decisions and choices real people make, it is high time to place the rational expectations hypothesis where it belongs – in the dustbin of history.

Interestingly enough, Robert Lucas – the main developer of REH – has had second thoughts on the validity of REH. In an interview with Kevin Hoover (http://econ.duke.edu/~kdh9/), he says:

Kevin Hoover: The Great Recession and the recent financial crisis have been widely viewed in both popular and professional commentary as a challenge to rational expectations and to efficient markets … I’m asking you whether you accept any of the blame … there’s been a lot of talk about whether rational expectations and the efficient-markets hypotheses is where we should locate the analytical problems that made us blind.

Robert Lucas: You know, people had no trouble having financial meltdowns in their economies before all this stuff we’ve been talking about came on board. We didn’t help, though; there’s no question about that. We may have focused attention on the wrong things, I don’t know.

I’m looking forward to seeing some second thoughts on the subject from Levine too. Better late than never.

 

References

Clower, Robert (1989), The State of Economics: Hopeless but not Serious, in The Spread of Economic Ideas, eds. D Colander and A W Coats, Cambridge University Press.

Davidson, Paul (1983), Rational expectations: a fallacious foundation for studying crucial decision-making processes. Journal of Post Keynesian Economics 5:182-198.

Haavelmo, Trygve (1944), The probability approach in econometrics. Supplement to Econometrica 12:1-115.

Frydman, Roman and Michael Goldberg (2007), Imperfect Knowledge Economics, Princeton: Princeton University Press.

Hicks, John (1979), Causality in Economics, New York: Basic Books.

Hoover, Kevin (1988), The New Classical Macroeconomics. Oxford: Basil Blackwell.

Keynes, John Maynard (1937), The General Theory of Employment. Quarterly Journal of Economics 51:209-23.

– (1964 (1936)), The General Theory of Employment, Interest, and Money. London: Harcourt Brace Jovanovich.

– (1973 (1921)), A Treatise on Probability. Volume VIII of The Collected Writings of John Maynard Keynes, London: Macmillan.

Knight, Frank (1921), Risk, Uncertainty and Profit, Boston: Houghton Mifflin.

Krugman, Paul (2000), How complicated does the model have to be? Oxford Review of Economic Policy 16:33-42.

– (2009), How Did Economists get It So Wrong? The New York Times September 6.

Lucas, Robert (1972), Expectations and the Neutrality of Money, Journal of Economic Theory 4:103-124.

– (1981), Studies in Business-Cycle Theory. Oxford: Basil Blackwell.

– (1995), The Monetary Neutrality, The Nobel Lecture, Stockholm: The Nobel Foundation.

Muth, John (1961), Rational expectations and the theory of price movements, Econometrica 29:315-335.

Prescott, Edward (1977), Should Control Theory be Used for Economic Stabilization?, in K Brunner and A H Meltzer (eds), Optimal Policies, Control Theory and Technology Exports, Carnegie-Rochester Conference Series on Public Policy, volume 7, Amsterdam: North Holland.

Pålsson Syll, Lars (2007), John Maynard Keynes. Stockholm: SNS Förlag.

– (2010), What is (wrong with) economic theory? http://www.paecon.net/PAEReview/issue55/Syll55.pdf

Savage, L J (1954), The Foundations of Statistics. New York: John Wiley and Sons.

Shackle, G L S (1972), Epistemics & Economics: A Critique of Economic Doctrines. Cambridge: Cambridge University Press.

German Finance Minister supports Greek default

14 February, 2012 at 08:38 | Posted in Economics, Politics & Society | Comments Off on German Finance Minister supports Greek default

In an interview with Welt am Sonntag, German Finance Minister Wolfgang Schäuble this weekend made it quite clear that he supports a Greek default: 

Wolfgang Schäuble: The Greeks are a special case. The Portuguese government is doing a decent job. Their problem is that they need more growth. So it is good that you have your shirts made there. That way you are helping to boost the economy there.

Herbert Seckler: I would certainly be pleased if Greece were the big exception in southern Europe. From your lips to God’s ear.

Wolfgang Schäuble: Greece will be caught in one form or another. But the country has to do its homework in order to become competitive. Whether that happens within the framework of a new aid programme, or in some other way that we do not really want …

Herbert Seckler: What do you mean by that? Leaving the euro?

Wolfgang Schäuble: That is entirely in the Greeks’ own hands. But even in the case that the vast majority do not expect, they would remain in Europe. Still, we want to avoid that path. We are happy to help, but we should not give others the feeling that they do not have to make an effort. Everyone is also responsible for themselves.

It will be an interesting euro-spring this year. German Finance Minister Schäuble supports a Greek bankruptcy, while German Chancellor Angela Merkel is just as adamant in trying to avoid it. Maybe they should have a talk with each other?

Annie Lööf’s neoliberalism is sinking the Centre Party

13 February, 2012 at 19:48 | Posted in Politics & Society | 1 Comment


If the election were held today, the Centre Party would end up outside the Riksdag. In the latest poll from United Minds the party gets only 3.9 per cent. That is the lowest figure in several years.

Even if Annie Lööf is not solely responsible for the Centre Party’s decline, it is hard not to read the figures as the voters giving a failing grade to the Stureplan mafia and its continuation of Maud Olofsson’s transformation of the Centre Party into a neoliberal right-wing party.

The Annie Lööf effect has failed to materialise. And rightly so. If the Centre Party is to have any chance of staying in the Riksdag, it is high time to turn the ship back towards the party’s origins as a rural and environmental party. Neoliberal arrogance, with idols and role models like Ayn Rand and Margaret Thatcher, wins no votes in twenty-first-century Sweden.

The housing bubble

12 February, 2012 at 13:45 | Posted in Economics | 1 Comment

In the reports on the Swedish housing market that the Riksbank has presented over the past year, a picture is painted in which the rise in Swedish house prices can be explained by “fundamentals” such as higher disposable incomes, low housing construction and low capital costs. Looking at the chart below, it is hard, to say the least, to believe that this is the case. And do not forget that when corresponding analyses were made in the United States around 2005-2006, roughly the same thing was claimed. And we all know how that turned out …

[Chart: Swedish house prices] Source: Statistics Sweden (SCB) and own calculations

The Gatsby curve and the horse-manure theorem

12 February, 2012 at 10:14 | Posted in Economics, Politics & Society | 4 Comments

In the economic-policy debate one often hears the advocates of market fundamentalism say that inequality is not a problem. Two main reasons are usually given.

Pro primo – the horse-manure theorem – according to which tax cuts and increased welfare for the rich will eventually trickle down to the poor anyway. Feed the horse and the birds can eat their fill from the droppings.

Pro secundo – that as long as everyone has the same chance of getting rich, inequality is unproblematic.

Extensive research already during the Thatcher-Reagan era showed that the horse-manure theorem (the “trickle-down effect”) belongs to the world of myths.

And now Alan Krueger – professor of economics at Princeton University – has shown with his Gatsby curve that the second attempted defence of inequality also belongs in the world of fairy tales:

[The vertical axis shows how much a one per cent increase in your father’s income affects your expected income (the higher the number, the lower the expected social mobility), and the horizontal axis shows the Gini coefficient, which measures inequality (the higher the number, the greater the inequality)]
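For the technically minded: the quantity on the vertical axis is the intergenerational earnings elasticity, i.e. the slope coefficient in a regression of log child income on log parent income. Here is a minimal sketch of how such an elasticity is estimated, on made-up illustrative data (not Krueger’s actual data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: log parental income and log child income for n families.
n = 1000
log_parent = rng.normal(10.0, 0.5, size=n)
true_beta = 0.4                              # illustrative "low mobility" value
log_child = 2.0 + true_beta * log_parent + rng.normal(0.0, 0.4, size=n)

# OLS slope = cov(log_parent, log_child) / var(log_parent):
# the intergenerational earnings elasticity.
beta = np.cov(log_parent, log_child)[0, 1] / np.var(log_parent, ddof=1)
print(f"estimated elasticity: {beta:.2f}")
# The higher beta is, the more a 1 % rise in the father's income raises the
# child's expected income, i.e. the lower the social mobility. The Gatsby
# curve plots this elasticity against the Gini coefficient across countries.
```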

It could hardly be clearer that egalitarian countries are also those with the greatest social mobility – and that it is therefore high time to tackle the widening income and wealth gaps. This applies to Sweden too, where newly revised data from Statistics Sweden (SCB) show how disposable income per consumption unit (excluding capital gains, by decile, all persons 1995-2010, means in thousands of SEK per consumption unit at 2010 prices) has developed in recent years:

[Chart: disposable income per consumption unit by decile, 1995-2010] Source: Statistics Sweden (SCB) and own calculations

And things look even worse if one considers the development of wealth. One wonders what role tax cuts for top earners and reduced benefits for low-income earners, the sick and the unemployed have played …

Sometimes two graphs say more than a thousand words.

Carl B Hamilton is wrong about Finland and Sweden

11 February, 2012 at 21:30 | Posted in Economics | Comments Off on Carl B Hamilton is wrong about Finland and Sweden

The Liberal Party’s (Folkpartiet) economic spokesperson Carl Bastiat Hamilton argues, as is well known, that Sweden should join the euro – despite the monumental economic problems in the euro area today.

On the whole, unfortunately, one has to question the entire Hamiltonian line of argument (something yours truly has had reason to return to on several occasions over the past year). One example may illustrate its all too obvious shortcomings.

Hamilton often compares Sweden and Finland and likes to note that although one country has taken part in the euro and the other has not, the two have developed economically in roughly the same way over the past ten years. From this he concludes that the debt problems in Europe are “thus not due to asymmetric shocks or an economically sub-optimal currency area.” Here one obviously has to ask how things stand with both logic and scientific rigour. How can one draw such sweeping conclusions about the whole population of euro countries from a sample of two countries’ economic development? Obviously one cannot, and the conclusion is therefore left hanging entirely in the air.

On top of this, Hamilton is also wrong about the actual development in Finland and Sweden. Someone who clearly has a much better grasp of the situation than Hamilton is Göran Zettergren at utredarna.nu:

Sweden and Finland are relatively similar countries whose macroeconomic development has largely moved in step since the late 1990s. But in recent years something has happened and their paths have diverged. Since 2007 Sweden’s GDP has grown by just over 4 per cent more than Finland’s. Swedish employment has grown more than 3 per cent more than the Finnish. Partly the Finnish downturn was deeper, but above all the recovery was considerably slower. Swedish employment was already back at its pre-crisis level a year ago. Finnish employment is unlikely to reach that level until some time during the next boom.

One important explanation is of course the exchange rate. The krona weakened by almost 20 per cent against the euro at the start of the crisis. This softened the crisis’s impact in Sweden and saved quite a few Swedish jobs. At the expense of Finland and other euro countries, one should perhaps add.

Another important factor is that Sweden – by keeping its own currency – has managed to stay out of the financial crisis in the euro zone. Swedish and Finnish long-term interest rates have essentially moved together since 1999, and the differences that have arisen have mainly been explained by different policy-rate levels. Over the past year, however, Swedish long-term rates have fallen relative to the Finnish ones, even though the Swedish policy rate has been considerably higher than the euro zone’s. In January the Swedish 10-year rate was 0.6 percentage points lower than the Finnish one, while the Swedish policy rate was 0.75 percentage points higher. One should perhaps add that Sweden and Finland, together with Luxembourg, are the only EU countries that currently meet both the Stability and Growth Pact’s and the new Euro Pact’s rules on public finances.

It therefore seems as if Finland is paying a certain economic price for being on the inside.

“Looking good” on the housing market? Like hell it does!

11 February, 2012 at 10:29 | Posted in Economics, Politics & Society | 20 Comments

According to fresh statistics, Swedes have increased their indebtedness by more than SEK 400 billion over the past three years. Over the same period the volume of mortgage lending has grown by 32 per cent on average (in Stockholm by 38 per cent). Sweden is thus one of the OECD countries where mortgage lending has grown the most since 2008.

Every level-headed observer realises that this is a problem that has to be solved before the housing bubble bursts. Otherwise there is a high risk that the laissez-faire policy will come back to bite – and then it is the unemployed, the homeless and the indebted who will take the blows, as usual when the neoliberal playground closes.

But at Finansinspektionen (the Swedish Financial Supervisory Authority) they keep looking the other way, as usual. “Things look good for the average Swede,” the authority’s chief economist claims in an interview in today’s Svenska Dagbladet. One can only shake one’s head!

The development of real house prices since 1986 looks like this:

[Chart: real house prices since 1986] Source: Statistics Sweden (SCB) and own calculations

Real house prices (CPI-deflated nominal prices) have doubled over the time span in my graph! “Looks good for the average Swede”? I’ll be dipped!
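For clarity, “CPI-deflated” simply means dividing the nominal price index by the consumer price index, rebased to a common year. A minimal sketch with made-up numbers (not the actual SCB series behind my graph):

```python
import numpy as np

# Hypothetical nominal house price index and CPI, both indexed to 100 in 1986.
# (Illustrative numbers only – the real series come from SCB.)
years = np.array([1986, 1996, 2006, 2011])
nominal_house_prices = np.array([100.0, 130.0, 280.0, 360.0])
cpi = np.array([100.0, 140.0, 160.0, 175.0])

# Real (CPI-deflated) prices: nominal index divided by CPI, rebased to 1986 = 100.
real_house_prices = nominal_house_prices / cpi * 100.0

for y, r in zip(years, real_house_prices):
    print(f"{y}: real index {r:.0f}")
# If the real index goes from 100 to roughly 200, real house prices have doubled.
```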

Since at least the mid-1990s, Sweden has had a clearly rising trend in asset prices. This is also visible in the household sector’s indebtedness, which is extremely high by international standards:

[Chart: household sector debt] Source: Statistics Sweden (SCB) and own calculations

Household indebtedness is rooted mainly in the increase in asset values driven by expanded lending to households and the resulting housing bubble. In the long run this trend obviously cannot be sustained. Asset prices fundamentally reflect expectations about the future return on investments. If asset prices keep rising faster than incomes, the effect will be higher inflation and a corresponding downward adjustment of the assets’ real value.

Given the debt ratio that households have taken on – with the blessing of the financial sector, and strongly tied to a housing bubble that neither the government, the Riksbank nor Finansinspektionen has really wanted to take seriously – the debt-deflation crisis that the government’s passivity is now paving the way for risks hitting Swedish households extremely hard.

Traditional economic theory assumes that people are rational, utility-maximising egoists. Reality is different. If anything, all these tulip, subprime and housing bubbles show how irrational, and how driven by psychological impulses – animal spirits, to speak with Keynes – our behaviour really is.

The dream of a home of one’s own may very well turn into a nightmare. But by then – as usual – estate agents, banks and all the “responsible” politicians and heads of public agencies will have jumped off the carousel and sworn themselves free of all responsibility!

Wall Street Journal on the devastating effects of austerity measures in Greece

10 February, 2012 at 16:53 | Posted in Economics, Politics & Society | Comments Off on Wall Street Journal on the devastating effects of austerity measures in Greece

The Wall Street Journal reports today on the state of the Greek economy:

The Greek economy continued to show signs of erosion under the pressure of government austerity toward the end of last year, marked by an accelerating rise in unemployment and a deepening slump in industrial production.

Greece’s unemployment rate soared in November to 20.9% compared with an 18.2% rate just a month earlier and up sharply from one year ago. The total number of unemployed reached 1.029 million, up by 126,062 from October, the Hellenic Statistical Authority, or Elstat, reported Thursday.

Greece also reported that industrial output fell 11.3% in December compared with the year-earlier period, after declining by 7.8% in November. Austerity measures introduced last year as part of a €110 billion ($145.87 billion) bailout plan have taken a heavy toll on Greek economic activity, weighing on consumption and investments and leading to Greece’s fifth year of economic recession in 2012.


Greece is now in the fifth year of a recession that has led to soaring unemployment and rising business bankruptcies, made worse by tough austerity measures aimed at narrowing the government budget gap. Compared with a year earlier, Greece’s unemployment situation has deteriorated sharply.

In November 2010, the unemployment rate was just 13.9% and the number of jobless at 692,577. In its 2012 budget, the Greek government estimates that unemployment will have averaged 15.4% in 2011 and rise to 17.1% this year.

According to the Elstat data, young people remain the hardest hit by Greece’s deepening recession. A staggering 48% of those aged between 15 and 24 were without a job in November, a sharp increase from the 35.6% rate recorded a year earlier.
