Statistical vs. Practical Significance

10 February 2013, 11:22 | Published in Statistics & Econometrics | Comments disabled for Statistical vs. Practical Significance



Awesome Swedish Winter’s Tale

9 February 2013, 17:31 | Published in Varia | 2 comments

Absolutely fabulous video by Ted Ström.

Why Friedman’s methodology has ”jumped the shark”

9 February 2013, 11:02 | Published in Economics, Theory of Science & Methodology | 2 comments

The basic argument of [Milton Friedman’s infamous 1953 essay ‘The Methodology of Positive Economics’] is that the unrealism of a theory’s assumptions should not matter; what matters are the predictions made by the theory. A truly realistic economic theory would have to incorporate so many aspects of humanity that it would be impractical or computationally impossible to do so. Hence, we must make simplifications, and cross-check the models against the evidence to see if we are close enough to the truth. The internal details of the models, as long as they are consistent, are of little importance.

The essay, or some variant of it, is a fallback for economists when questioned about the assumptions of their models. Even though most economists would not endorse a strong interpretation of Friedman’s essay, I often come across the defence ’it’s just an abstraction, all models are wrong’ if I question, say, perfect competition, utility, or equilibrium. I summarise the arguments against Friedman’s position below.

The first problem with Friedman’s stance is that it requires a rigorous, empirically driven methodology that is willing to abandon theories as soon as they are shown to be inaccurate enough. Is this really possible in economics? …

The second problem with a ‘pure prediction’ approach to modelling is that, at any time, different theories or systems might exhibit the same behaviour, despite different underlying mechanics. That is: two different models might make the same predictions, and Friedman’s methodology has no way of dealing with this …

The third problem is the one I initially homed in on: the vagueness of Friedman’s definition of ‘assumptions,’ and how it compares to the definitions used in science …

The fourth problem is related to the above: Friedman is misunderstanding the purpose of science. The task of science is not merely to create a ‘black box’ that gives rise to a set of predictions, but to explain phenomena: how they arise; what role each component of a system fills; how these components interact with each other …

The fifth problem is one that is specific to the social sciences, one that I touched on recently: different institutional contexts can mean economies behave differently. Without an understanding of this context, and whether it matches up with the mechanics of our models, we cannot know if the model applies or not. Just because a model has proven useful in one situation or location, that doesn’t guarantee it will be useful elsewhere, as institutional differences might render it obsolete.

The final problem, less general but important, is that certain assumptions can preclude the study of certain areas. If I suggested a model of planetary collision that had one planet, you would rightly reject the model outright. Similarly, in a world with perfect information, the function of many services that rely on knowledge – data entry, lawyers and financial advisors, for example – is nullified …

Friedman’s essay has economists occupying a strange methodological purgatory, where they seem unreceptive to both internal critiques of their theories, and their testable predictions. This follows directly from Friedman’s ambiguous position. My position, on the other hand, is that the use and abuse of assumptions is always something of a judgment call. Part of learning how to develop, inform and reject theories is having an eye for when your model, or another’s, has done the scientific equivalent of jumping the shark.

Unlearning Economics

On the irreversibility of time and economics

8 February 2013, 19:24 | Published in Economics, Statistics & Econometrics | 2 comments


As yours truly has argued – e.g. here, here and here – this is an extremely important issue for everyone wanting to understand the deep fundamental flaws of mainstream neoclassical economics.

Added 9/2: And as an almost immediate testimony to how wrong things may go when you do not understand the importance of distinguishing between real, non-ergodic time averages and unreal, hypothetical, ergodic ensemble averages, Noah Smith yesterday posted a piece defending the Efficient Market Hypothesis (falsely intimating that it is a popular target of critique only among ”lay critics of the econ profession”), basically referring to the same Paul Samuelson that Ole Peters so rightly criticises in his lecture.

On probabilism and statistics

8 February 2013, 12:45 | Published in Statistics & Econometrics | 3 comments

‘Mr Brown has exactly two children. At least one of them is a boy. What is the probability that the other is a girl?’ What could be simpler than that? After all, the other child either is or is not a girl. I regularly use this example on the statistics courses I give to life scientists working in the pharmaceutical industry. They all agree that the probability is one-half.

So they are all wrong. I haven’t said that the older child is a boy. The child I mentioned, the boy, could be the older or the younger child. This means that Mr Brown can have one of three possible combinations of two children: both boys, elder boy and younger girl, elder girl and younger boy, the fourth combination of two girls being excluded by what I have stated. But of the three combinations, in two cases the other child is a girl so that the requisite probability is 2/3 …

This example is typical of many simple paradoxes in probability: the answer is easy to explain but nobody believes the explanation. However, the solution I have given is correct.

Or is it? That was spoken like a probabilist. A probabilist is a sort of mathematician. He or she deals with artificial examples and logical connections but feels no obligation to say anything about the real world. My demonstration, however, relied on the assumption that the three combinations boy–boy, boy–girl and girl–boy are equally likely and this may not be true. The difference between a statistician and a probabilist is that the latter will define the problem so that this is true, whereas the former will consider whether it is true and obtain data to test its truth.
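The probabilist’s 2/3 answer is easy to check with a quick simulation – a minimal sketch, assuming each child is independently a boy or a girl with probability 1/2, which is precisely the assumption the statistician would want to test against data:

```python
import random

random.seed(42)

trials = 100_000
families_with_a_boy = 0
other_child_is_girl = 0

for _ in range(trials):
    # Two children, each independently 'B' or 'G' with probability 1/2
    children = [random.choice("BG") for _ in range(2)]
    if "B" in children:                # condition: at least one boy
        families_with_a_boy += 1
        if "G" in children:            # the other child is a girl
            other_child_is_girl += 1

print(round(other_child_is_girl / families_with_a_boy, 3))  # ≈ 2/3, not 1/2
```

Conditioning on “at least one boy” keeps three of the four equally likely combinations (BB, BG, GB), and the other child is a girl in two of them.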

Models in economics

8 February 2013, 09:45 | Published in Economics | Comments disabled for Models in economics

Remember that a model is not the truth. It is a lie to help you get your point across. And in the case of modeling economic risk, your model is a lie about others, who are probably lying themselves. And what’s worse than a simple lie? A complicated lie.

Sam L. Savage, The Flaw of Averages

Gunilla von Bahr (1941-2013)

7 February 2013, 17:37 | Published in Varia | Comments disabled for Gunilla von Bahr (1941-2013)


On inflation targeting and rational expectations

6 February 2013, 17:50 | Published in Economics | 4 comments


The Riksbank in 1993 announced an official target for CPI inflation of 2 percent. Over the last 15 years, average CPI inflation has equaled 1.4 percent and has thus fallen short of the target by 0.6 percentage points. Has this undershooting of the inflation target had any costs in terms of higher average unemployment? This depends on whether the long-run Phillips curve in Sweden is vertical or not. During the last 15 years, inflation expectations in Sweden have become anchored to the inflation target in the sense that average inflation expectations have been close to the target. The inflation target has thus become credible. If inflation expectations are anchored to the target also when average inflation deviates from the target, the long-run Phillips curve is no longer vertical but downward-sloping. Then average inflation below the credible target means that average unemployment is higher than the rational-expectations steady-state (RESS) unemployment rate. The data indicate that the average unemployment rate has been 0.8 percentage points higher than the RESS rate over the last 15 years. This is a large unemployment cost of undershooting the inflation target. Some simple robustness tests indicate that the estimate of the unemployment cost is rather robust, but the estimate is preliminary and further scrutiny is needed to assess its robustness.

During 1997-2011, average CPI inflation has fallen short of the inflation target of 2 percent by 0.6 percentage points. But average inflation expectations according to the TNS Sifo Prospera survey have been close to the target. Thus, average inflation expectations have been anchored to the target and the target has become credible. If average inflation expectations are anchored to the target when average inflation differs from the target, the long-run Phillips curve is not vertical. Then lower average inflation means higher average unemployment. The data indicate that average inflation below target has been associated with average unemployment being 0.8 percentage points higher over the last 15 years than would have been the case if average inflation had been equal to the target. This is a large unemployment cost of average inflation below a credible target. Some simple robustness tests indicate that the estimate of the unemployment cost is rather robust, but the estimate is preliminary and further scrutiny is needed to assess its robustness.

The difference between average inflation and average inflation expectations and the apparent existence of a downward-sloping long-run Phillips curve raises several urgent questions that I believe need to be addressed. Why have average inflation expectations exceeded average inflation for 15 years? Why has average inflation fallen below the target for 15 years? Could average inflation have fallen below average inflation expectations and the inflation target without the large unemployment cost estimated here? Could the large unemployment cost have been avoided with a different monetary policy? What are the policy implications for the future? Do these findings make price-level targeting or the targeting of average inflation over a longer period relatively more attractive, since they would better ensure that average inflation over longer periods equals the target?

Lars E.O. Svensson, The Possible Unemployment Cost of Average Inflation below a Credible Target

According to Lars E. O. Svensson – deputy governor of the Riksbank – the Swedish Riksbank has been pursuing a policy during the years 1998-2011 that in reality has made inflation on average 0.6 percentage points lower than the goal set by the Riksbank. The Phillips curve he estimates shows that unemployment, as a result of this overly “austere” inflation level, has been almost 1 percentage point higher than if one had stuck to the set inflation goal of 2%.
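The arithmetic behind the estimate can be sketched in a few lines. With anchored expectations the long-run Phillips curve is pi = pi_expected − beta·(u − u*), so an inflation shortfall of 0.6 percentage points maps into an unemployment gap of 0.6/beta. The slope beta = 0.75 below is purely illustrative – chosen to reproduce the numbers quoted in the abstract, not Svensson’s estimated value:

```python
# Hedged illustration of the back-of-the-envelope logic; beta is assumed.
pi_target = 2.0      # Riksbank CPI inflation target (%)
pi_average = 1.4     # average CPI inflation 1997-2011 (%)
beta = 0.75          # illustrative long-run Phillips-curve slope

shortfall = pi_target - pi_average      # 0.6 percentage points below target
unemployment_gap = shortfall / beta     # excess unemployment above the RESS rate
print(round(unemployment_gap, 1))       # ≈ 0.8 percentage points
```

Note that the calculation only makes sense if expectations stay anchored while inflation undershoots – which is exactly why a vertical long-run Phillips curve would make the cost vanish.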

What Svensson is saying, without so many words, is that the Swedish Fed for no reason at all has made people unemployed. As a consequence of faulty monetary policy, unemployment is considerably higher than it would have been if the Swedish Fed had done its job adequately.

From a more methodological point of view it is of course also interesting to consider the use made of the rational expectations hypothesis in these model-based calculations (and in models of the same ilk that abound in ”modern” macroeconomics). When the data tell us that ”average inflation expectations exceeded average inflation for 15 years” – wouldn’t it be high time to put the REH where it belongs: in the dustbin of history?

To me Svensson’s paper basically confirms what I wrote a couple of months ago:

Models based on REH impute beliefs to the agents that are not based on any real informational considerations, but simply stipulated to make the models mathematically-statistically tractable.

Of course you can make assumptions based on tractability, but then you also have to take into account the necessary trade-off in terms of the ability to make relevant and valid statements on the intended target system.

Mathematical tractability cannot be the ultimate arbiter in science when it comes to modeling real world target systems. Of course, one could perhaps accept REH if it had produced lots of verified predictions and good explanations. But it has done nothing of the kind. Therefore the burden of proof is on those who still want to use models built on ridiculously unreal assumptions – models devoid of all empirical interest.

In reality, REH is a rather harmful modeling assumption, since it contributes to perpetuating the ongoing transformation of economics into a kind of science-fiction-economics. If economics is to guide us, help us make forecasts, or explain and better understand real-world phenomena, REH-based modeling is in fact next to worthless.

On the non-equivalence of Keynesian and Knightian uncertainty (wonkish)

5 February 2013, 22:30 | Published in Economics, Theory of Science & Methodology | 2 comments

Last year the Bank of England’s Andrew G. Haldane and Benjamin Nelson presented a paper with the title Tails of the unexpected. The main message of the paper was that we should not let ourselves be fooled by randomness:

For almost a century, the world of economics and finance has been dominated by randomness. Much of modern economic theory describes behaviour by a random walk, whether financial behaviour such as asset prices (Cochrane (2001)) or economic behaviour such as consumption (Hall (1978)). Much of modern econometric theory is likewise underpinned by the assumption of randomness in variables and estimated error terms (Hayashi (2000)).

But as Nassim Taleb reminded us, it is possible to be Fooled by Randomness (Taleb (2001)). For Taleb, the origin of this mistake was the ubiquity in economics and finance of a particular way of describing the distribution of possible real world outcomes. For non-nerds, this distribution is often called the bell-curve. For nerds, it is the normal distribution. For nerds who like to show-off, the distribution is Gaussian.

The normal distribution provides a beguilingly simple description of the world. Outcomes lie symmetrically around the mean, with a probability that steadily decays. It is well-known that repeated games of chance deliver random outcomes in line with this distribution: tosses of a fair coin, sampling of coloured balls from a jam-jar, bets on a lottery number, games of paper/scissors/stone. Or have you been fooled by randomness?

In 2005, Takashi Hashiyama faced a dilemma. As CEO of Japanese electronics corporation Maspro Denkoh, he was selling the company’s collection of Impressionist paintings, including pieces by Cézanne and van Gogh. But he was undecided between the two leading houses vying to host the auction, Christie’s and Sotheby’s. He left the decision to chance: the two houses would engage in a winner-takes-all game of paper/scissors/stone.

Recognising it as a game of chance, Sotheby’s randomly played “paper”. Christie’s took a different tack. They employed two strategic game-theorists – the 11-year old twin daughters of their international director Nicholas Maclean. The girls played “scissors”. This was no random choice. Knowing “stone” was the most obvious move, the girls expected their opponents to play “paper”. “Scissors” earned Christie’s millions of dollars in commission.

As the girls recognised, paper/scissors/stone is no game of chance. Played repeatedly, its outcomes are far from normal. That is why many hundreds of complex algorithms have been developed by nerds (who like to show off) over the past twenty years. They aim to capture regularities in strategic decision-making, just like the twins. It is why, since 2002, there has been an annual international world championship organised by the World Rock-Paper-Scissors Society.

The interactions which generate non-normalities in children’s games repeat themselves in real world systems – natural, social, economic, financial. Where there is interaction, there is non-normality. But risks in real-world systems are no game. They can wreak havoc, from earthquakes and power outages, to depressions and financial crises. Failing to recognise those tail events – being fooled by randomness – risks catastrophic policy error.

So is economics and finance being fooled by randomness? And if so, how did that happen?

Normality has been an accepted wisdom in economics and finance for a century or more. Yet in real-world systems, nothing could be less normal than normality. Tails should not be unexpected, for they are the rule. As the world becomes increasingly integrated – financially, economically, socially – interactions among the moving parts may make for potentially fatter tails. Catastrophe risk may be on the rise.

If public policy treats economic and financial systems as though they behave like a lottery – random, normal – then public policy risks itself becoming a lottery. Preventing public policy catastrophe requires that we better understand and plot the contours of systemic risk, fat tails and all. It also means putting in place robust fail-safes to stop chaos emerging, the sand pile collapsing, the forest fire spreading. Until then, normal service is unlikely to resume.
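Haldane and Nelson’s point that “tails should not be unexpected” is easy to illustrate numerically. In this stdlib-only sketch a Student-t distribution with 3 degrees of freedom stands in for a generic fat-tailed process – my choice of stand-in, not a distribution from their paper – and its four-sigma tail mass dwarfs the Gaussian’s:

```python
import math
import random

random.seed(0)

def student_t(df):
    """Draw from a Student-t: a normal divided by sqrt(chi-square/df).
    For small df the tails are far heavier than the normal's."""
    z = random.gauss(0, 1)
    chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(df))
    return z / math.sqrt(chi2 / df)

n = 200_000
normal_tail = sum(abs(random.gauss(0, 1)) > 4 for _ in range(n)) / n
fat_tail = sum(abs(student_t(3)) > 4 for _ in range(n)) / n

# The normal puts roughly 6e-5 of its mass beyond four sigma;
# the t(3) puts on the order of a few percent there.
print(normal_tail, fat_tail)
```

A risk model calibrated on the Gaussian would thus treat four-sigma days as once-in-decades events when, under the fat-tailed alternative, they are routine.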

Since I think this is a great paper, it merits a couple of comments.

To understand real world ”non-routine” decisions and unforeseeable changes in behaviour, ergodic probability distributions are of no avail. In a world full of genuine uncertainty – where real historical time rules the roost – the probabilities that ruled the past are not those that will rule the future.

Time is what prevents everything from happening at once. To simply assume that economic processes are ergodic – and a fortiori, in any relevant sense, timeless – and concentrate on ensemble averages is not a sensible way of dealing with the kind of genuine uncertainty that permeates open systems such as economies.

When you assume economic processes to be ergodic, ensemble and time averages are identical. Let me give an example: assume we have a market with an asset priced at 100 €. Then imagine the price first goes up by 50% and then later falls by 50%. The ensemble average for this asset would be 100 € – because we here envision two parallel universes (markets), where in one universe (market) the asset price falls by 50% to 50 €, and in the other universe (market) it goes up by 50% to 150 €, giving an average of 100 € ((150 + 50)/2). The time average for this asset would be 75 € – because we here envision one universe (market) where the asset price first rises by 50% to 150 € and then falls by 50% to 75 € (0.5 × 150).
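The example can be written out in a few lines, and extending it shows how the gap between the two averages compounds over time (a sketch of the arithmetic above, nothing more):

```python
p0 = 100.0

# Ensemble average after one step: two parallel markets,
# one where the price rises 50% and one where it falls 50%.
ensemble_average = (p0 * 1.5 + p0 * 0.5) / 2
print(ensemble_average)   # 100.0 -- "on average nothing happens"

# Time average: one market where the price first rises 50%,
# then falls 50%.
time_path = p0 * 1.5 * 0.5
print(time_path)          # 75.0 -- "on average lots happens"

# Repeat the up/down round trip and the gap compounds:
price = p0
for _ in range(10):
    price *= 1.5 * 0.5    # each round trip multiplies the price by 0.75
print(round(price, 2))    # 5.63 after ten round trips; the ensemble mean is still 100
```

This is exactly Ole Peters’s point: the ensemble mean is constant while almost every individual trajectory decays, so in a nonergodic multiplicative process the ensemble average tells you nothing about what happens to you through time.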

From the ensemble perspective nothing really, on average, happens. From the time perspective lots of things really, on average, happen.

Assuming ergodicity, there would have been no difference at all. What is important about the fact that real social and economic processes are nonergodic is that uncertainty – not risk – rules the roost. That was something both Keynes and Knight basically said in their 1921 books. Thinking about uncertainty in terms of “rational expectations” and “ensemble averages” has had seriously bad repercussions on the financial system.

Knight’s uncertainty concept has an epistemological foundation, whereas Keynes’s definitely has an ontological one. Of course this also has repercussions on the issue of ergodicity in a strict methodological and mathematical-statistical sense. I think Keynes’s view is the more warranted of the two.

The most interesting and far-reaching difference between the epistemological and the ontological view is that if one subscribes to the former, Knightian view – as Taleb, Haldane & Nelson and “Black Swan” theorists basically do – one opens the door to the mistaken belief that with better information and greater computing power we should somehow always be able to calculate probabilities and describe the world as an ergodic universe. As Keynes convincingly argued, that is ontologically just not possible.

To Keynes the source of uncertainty was in the nature of the real – nonergodic – world. It had to do, not only – or primarily – with the epistemological fact of us not knowing the things that today are unknown, but rather with the much deeper and far-reaching ontological fact that there often is no firm basis on which we can form quantifiable probabilities and expectations at all.

Keynes-Tobin Tax – a step in the right direction

5 February 2013, 15:15 | Published in Economics, Politics & Society | 3 comments

In the wake of the financial crisis, a growing number of leading politicians within the EU have again begun to demand the introduction of a Keynes-Tobin tax on financial transactions.

The core countries of Europe have now decided to introduce a tax on financial transactions, and the Commission and the European Parliament have approved the proposal. The last obstacle – a veto from one of the EU countries unwilling to take part (for example the United Kingdom and Sweden) – was averted at the European finance ministers’ meeting two weeks ago.

Many establishment economists have claimed that introducing such a tax would be an ”extremely misguided measure”. There is almost no end to all the negative consequences a Keynes-Tobin tax would supposedly bring: reduced investment, lower output, diminished trading volumes and falling wages.

Views of that kind illustrate, almost too clearly, perhaps the most paradoxical thing about financial crises – that many economists seem unwilling to learn anything. A look back at recent history otherwise offers instructive lessons.

At the beginning of the year 2000 it became increasingly clear that the extensive deregulation of financial markets that had taken place since the Thatcher-Reagan era had brought about an excessively rapid credit expansion. Lending by banks and finance companies grew explosively, and the drive to win ever larger market shares meant that credit-worthiness checks were neglected and bad customer relationships came to be accepted. Above all, the values of IT stocks were disproportionately high. This was bound to lead to a financial crisis. And so it did. The bubble burst and the financial-market crisis was a fact.

Since human memory is short, the same mechanisms could again create a financial meltdown in 2008. The crisis in whose aftermath the world economy still finds itself did not, this time, originate in IT stocks, but in the speculative bubble that developed in the American mortgage market in the years 1997-2006, and in a costly economic experiment in the form of the euro.

The underlying pattern has been the same in this as in other financial crises. For some reason a displacement (war, innovations, new rules of the game, etc.) occurs in the economic circuit, leading to changes in the profit opportunities of banks and firms. Demand and prices rise, pulling ever more parts of the economy into a kind of euphoria. More and more people are drawn in, and soon the speculative mania – whether it concerns tulip bulbs, real estate or mortgages – is a fact. Sooner or later someone sells to cash in their profits, and a rush for liquidity sets in. It is time to jump off the merry-go-round and convert securities and other assets into ready money. A financial emergency arises and spreads. Prices begin to fall, bankruptcies increase, and the crisis accelerates and turns into panic. To prevent the final crash, credit is tightened and calls go out for a lender of last resort who can guarantee the supply of the demanded cash and restore confidence. If this does not succeed, the crash is a fact.

Financial crises are an unavoidably recurring feature of an economy that gives free rein to essentially unregulated markets. Behavioural finance has shown that the picture of investors as rational is difficult to reconcile with facts drawn from real financial markets. Investors seem to trade on noise rather than information. They extrapolate from short runs of events, are sensitive to how problems are framed, are bad at revising their risk assessments, and are often over-sensitive to mood swings. Irrationality is not irrelevant in financial markets.

Financial markets fundamentally function as a kind of information clearing-house. Since the actors do not have complete information, they tend to acquire information about economic fundamentals simply by observing one another’s behaviour. The herd instinct makes market participants think in step, and contributes to developments often taking on the character of a by-product of casino activities. With the new technology and instruments comes a built-in compulsion to trade quickly, which feeds the herd behaviour further. There is no time to digest information; it must immediately be turned into action based on guesses about how ”the others” will react when the information becomes public. This in turn leads to increased risk exposure.

Since the deregulation of financial markets took off in the 1980s, banks’ capital adequacy – how much capital banks must hold in reserve relative to their lending volume – has fallen from 80-90% to around 50-60% today. This under-capitalisation obviously paved the way for the Ponzi-like speculation that led up to the latest financial crisis, in which banks sold assets they did not own, to speculators with money they did not have, at prices they ultimately could not realise.

Unfortunately, the expansion of shadow banks, the transfer of lending to banks’ regulation-exempt investment divisions, securitisation and other forms of financial innovation have rendered much of the regulation that survived deregulation ineffective. Although the stated purpose of the new instruments was to reduce and spread risk, the effect has rather been reckless credit expansion that has increased risk exposure and moral hazard.

What, then, can be done to minimise the risk of future crises? The actors on the financial market evidently generate costs that they do not themselves bear. When the foremost economist of our time – John Maynard Keynes – advocated the introduction of a general financial-market tax after the stock-market crash of 1929, it was because he believed that the market should then bear the costs that its instability, imbalances and disturbances give rise to. Those who, in their hunger for profit, are prepared to take unnecessary risks and do lasting damage to the economy must themselves help to foot the bill.

In the early 1970s James Tobin revived Keynes’s idea by proposing the introduction of a tax on currency transactions. The main aims of such a tax are to reduce small-scale speculation by automatically penalising short-term speculation, while having negligible effects on the incentives for long-term trade and capital investment.

Financial markets have important functions to fulfil in an economy. Market advocates have been good at demonstrating this. They have, however, been less good at pointing out the costs these markets also give rise to.

The beauty of a Keynes-Tobin tax is that it would cool the interest of financial-market operators – and share-trading robots – in short-term speculation, while having no more than negligible effects on long-term investment decisions.

These activities have, after all, also proved to be of dubious social value – they consume a large share of our common resources in the form of human thought, talent and computing power – and in the end give back little more than debts and crises. Which others have to pay for. Throwing sand in that machinery would probably contribute to better husbandry of society’s pool of resources.

Critics of the Keynes-Tobin tax base most of their reasoning on the risk that introducing it in the EU would simply make banking move elsewhere. But on closer reflection, one could say that about every proposal for financial-market regulation. So why precisely in the case of this tax? And why do we never hear the argument when we talk about banks’ brokerage fees (which in several countries are higher than the tax rate of 0.1 per cent now proposed by the EU Commission)? Or England’s stamp duty of 0.5%? And if one is so afraid that bank executives will try to evade the tax, one could, for instance, introduce an incentive-compatible reward of, say, 5-10% of what the state would collect if bank employees exposed their managers’ attempts at ”dodging”.

We have to realise that we cannot both have our cake and eat it. As long as we have an economy with essentially unregulated financial markets, we will also have to live with recurring crises. These days it is not only central-bank governors who recognise that interventions in financial markets are defensible when prices ”develop in a direction away from what is fundamentally warranted.”

I am convinced that a Keynes-Tobin tax, together with greater openness about the need for banking, capital and currency regulation, can help us in the long run to reduce the risk of systemic financial crises and costly financial instability. If, on the other hand, we reflexively refuse to see the scope of the problems, we will once again stand helpless when the next crisis looms.

A Keynes-Tobin tax is far from solving all the problems of the financial markets. But its introduction would send a strong signal about how we view the social value of the casino economy. For as Keynes writes:

Speculators may do no harm as bubbles on a steady stream of enterprise. But the position is serious when enterprise becomes the bubble on a whirlpool of speculation. When the capital development of a country becomes a by-product of the activities of a casino, the job is likely to be ill-done … The introduction of a substantial transfer tax on all transactions might prove the most serviceable reform available, with a view to … mitigating the predominance of speculation over enterprise.

Keynes on effective demand

4 February 2013, 19:22 | Published in Economics | Comments disabled for Keynes on effective demand

The idea that we can safely neglect the aggregate demand function is fundamental to [classical] economics … The completeness of the [classical] victory is something of a curiosity and a mystery. It must have been due to a complex of suitabilities in the doctrine to the environment into which it was projected. That it reached conclusions quite different from what the ordinary uninstructed person would expect, added, I suppose, to its intellectual prestige. That its teaching, translated into practice, was austere and often unpalatable, lent it virtue. That it was adapted to carry a vast and consistent logical superstructure, gave it beauty. That it could explain much social injustice and apparent cruelty as an inevitable incident in the scheme of progress, and the attempt to change such things as likely on the whole to do more harm than good, commended it to authority. That it afforded a measure of justification to the free activities of the individual capitalist, attracted to it the support of the dominant social force behind authority.

But although the doctrine itself has remained unquestioned by orthodox economists up to a late date, its signal failure for purposes of scientific prediction has greatly impaired, in the course of time, the prestige of its practitioners. For professional economists, after Malthus, were apparently unmoved by the lack of correspondence between the results of their theory and the facts of observation;—a discrepancy which the ordinary man has not failed to observe, with the result of his growing unwillingness to accord to economists that measure of respect which he gives to other groups of scientists whose theoretical results are confirmed by observation when they are applied to the facts.

The celebrated optimism of traditional economic theory, which has led to economists being looked upon as Candides, who, having left this world for the cultivation of their gardens, teach that all is for the best in the best of all possible worlds provided we will let well alone, is also to be traced, I think, to their having neglected to take account of the drag on prosperity which can be exercised by an insufficiency of effective demand. For there would obviously be a natural tendency towards the optimum employment of resources in a society which was functioning after the manner of the classical postulates. It may well be that the classical theory represents the way in which we should like our economy to behave. But to assume that it actually does so is to assume our difficulties away …

Thus the reduction in money-wages will have no lasting tendency to increase employment except by virtue of its repercussion either on the propensity to consume for the community as a whole, or on the schedule of marginal efficiencies of capital, or on the rate of interest. There is no method of analysing the effect of a reduction in money-wages, except by following up its possible effects on these three factors …

A reduction of money-wages will somewhat reduce prices. It will, therefore, involve some redistribution of real income (a) from wage-earners to other factors entering into marginal prime cost whose remuneration has not been reduced, and (b) from entrepreneurs to rentiers to whom a certain income fixed in terms of money has been guaranteed.

What will be the effect of this redistribution on the propensity to consume for the community as a whole? The transfer from wage-earners to other factors is likely to diminish the propensity to consume. The effect of the transfer from entrepreneurs to rentiers is more open to doubt. But if rentiers represent on the whole the richer section of the community and those whose standard of life is least flexible, then the effect of this also will be unfavourable. What the net result will be on a balance of considerations, we can only guess. Probably it is more likely to be adverse than favourable …

It follows, therefore, that if labour were to respond to conditions of gradually diminishing employment by offering its services at a gradually diminishing money-wage, this would not, as a rule, have the effect of reducing real wages and might even have the effect of increasing them, through its adverse influence on the volume of output. The chief result of this policy would be to cause a great instability of prices, so violent perhaps as to make business calculations futile in an economic society functioning after the manner of that in which we live. To suppose that a flexible wage policy is a right and proper adjunct of a system which on the whole is one of laissez-faire, is the opposite of the truth.

John Maynard Keynes: General Theory

Den ensamma människan (The solitary human being)

4 February 2013 at 19:07 | Published in Varia | Comments off on Den ensamma människan

I believe in the solitary human being,
in her who wanders alone,
who does not, dog-like, run to the scent,
who does not, wolf-like, flee from human scent:
At once human and anti-human.

How to reach community?
Shun the upper and outer way:
What is herd in others is herd in you as well.
Take the lower and inner way:
What is bottom in you is bottom in others as well.

Hard to get used to oneself.
Hard to break the habit of oneself.

Whoever does so shall still never be forsaken.
Whoever does so shall still always remain in solidarity.
The impractical is the only practical thing
in the long run.

Gunnar Ekelöf

Om vådan av att läsa nationalekonomi (On the perils of reading economics)

4 February 2013 at 10:27 | Published in Varia | 3 comments

For my own part … I must read as necessarily as I must breathe, and I can nowadays read virtually anything that is reasonably humanly written and has content …

But there are definite exceptions that I shall never be able to master; just as certain people with good appetites fall ill at the thought of eel, or are sickened by mussels. I cannot read law; nor economics …

If I attempt that sort of thing, the beginner's symptoms that can beset a poor unwilling schoolchild set in undiminished: the universe collapses into grey rags, and the soul fills with anguish; the mind writhes as if seated on a gridiron, and the brain feels watery; and a weariness, far more compact than any weariness produced by any form of work or amusement known to me, has within five minutes filled every fibre of my being like an eternity's river of lead.

Frans G Bengtsson

Ces lupanars de la pensée! (These brothels of thought!)

3 February 2013 at 21:00 | Published in Economics | 3 comments


Feign stupidity? Being what they are is good enough for these guys …

The Arrogance of Power

3 February 2013 at 17:46 | Published in Politics & Society | Comments off on The Arrogance of Power

Wolfgang Schäuble is currently serving as the Federal Minister of Finance in Angela Merkel’s Second Cabinet. Schäuble became chairman of the CDU in 1988, but had to give up his post in 2000, when it was disclosed that the party had accepted cash donations of over DM 100,000 from the arms dealer and lobbyist Karlheinz Schreiber. Schäuble was succeeded as party chairman by Angela Merkel …

Go Canada Go!

3 February 2013 at 00:02 | Published in Varia | 3 comments



2 February 2013 at 19:55 | Published in Varia | Comments off on Mästerverket



2 February 2013 at 19:03 | Published in Varia | Comments off on Hours


Annie Lööf ohjälpligt på väg utför? (Annie Lööf hopelessly on her way down?)

2 February 2013 at 17:48 | Published in Politics & Society | Comments off on Annie Lööf ohjälpligt på väg utför?


The Centre Party has held a meeting. For four hours. Afterwards it falls to the party leader to "speak to the people" and try to convince them that the toothpaste really can be put back in the tube. When I listened to Ekot at 16:45 yesterday (2013-02-01) she got her say:
"And here is a greeting to the uniting (sic!) left! Don't you dare call me hard, cold and neoliberal! Don't put opinions in my mouth! If you want a fight over justice and solidarity: welcome, the gauntlet is thrown."
If I am being a little kind, I assume she means the united left. But I honestly cannot define that left; perhaps Centre-Lööf can? As I see it, the left has not been united since 1914, at best, which is the great problem. Thrown gauntlets are a bit silly and old-fashioned, but I suppose it fits the return to traditionalism. Starting to prattle about solidarity in the Centre Party may ring a little hollow, but it is perhaps in line with the Alliance's overall strategy of appropriating old value-laden words from the enemy camp. Ah well, I am not going to pull her out of the ice hole this time either.

To Annie Lööf: You are hard, cold and neoliberal!

Look! I dared!


What does ”holding constant” mean in regression analysis?

1 February 2013 at 17:53 | Published in Statistics & Econometrics | Comments off on What does ”holding constant” mean in regression analysis?

As a descriptive exercise, all is well. One can compare the average salary of men and women, holding constant potential confounders. The result is a summary of how salaries differ on the average by gender, conditional on the values of one or more covariates. Why the salaries may on the average differ is not represented explicitly in the regression model …

Moving to causal inference is an enormous step that needs to be thoroughly considered. To begin, one must ponder … whether the causal variable of interest can be usefully conceptualized as an intervention within a response schedule framework [a formal structure in which to consider what the value of the response y would be if an input x were set to some value]. Once again consider gender. Imagine a particular faculty member. Now imagine intervening so that the faculty member’s gender could be set to ‘male.’ One would do this while altering nothing else about this person …

Clearly, the fit between the requisite response schedule and the academic world in which salaries are determined fails for at least two reasons: The idea of setting gender to male or female is an enormous stretch, and even if gender could be manipulated, it is hard to accept that only gender would be changed. In short, the causal story is in deep trouble even before the matter of holding constant surfaces …

This is not to imply that it never makes sense to apply regression-based adjustments in causal modeling. The critical issue is that the real world must cooperate by providing interventions that could be delivered separately …

As a technical move, it is easy to apply regression-based adjustments to confounders. Whether it is sensible to do so is an entirely different matter …

The most demanding material [is] the examination of what it means to ‘hold constant’ … The problem [is] the potential incongruence between the mechanics of regression-based adjustments and the natural or social world under study.
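The purely descriptive sense of "holding constant" that Berk allows can be sketched numerically. The following is a minimal, entirely hypothetical simulation (all numbers invented for illustration): salaries depend only on experience, but gender is correlated with experience, so the raw gender gap in means is large while the regression coefficient on gender, conditional on experience, is close to zero.

```python
import numpy as np

# Hypothetical toy data: no direct gender effect on salary,
# but gender is correlated with experience (the confounder).
rng = np.random.default_rng(0)
n = 1000
gender = rng.integers(0, 2, n)                               # 1 or 0 (toy coding)
experience = 5 + 3 * gender + rng.normal(0, 2, n)            # confounded with gender
salary = 30000 + 2000 * experience + rng.normal(0, 1000, n)  # no gender term

# Raw gap: difference in unconditional group means
raw_gap = salary[gender == 1].mean() - salary[gender == 0].mean()

# Adjusted gap: OLS coefficient on gender, "holding experience constant"
X = np.column_stack([np.ones(n), gender, experience])
beta, *_ = np.linalg.lstsq(X, salary, rcond=None)
adjusted_gap = beta[1]

print(f"raw gap: {raw_gap:.0f}, adjusted gap: {adjusted_gap:.0f}")
```

The adjusted coefficient is a conditional summary of how salaries differ on average; as the quoted passage stresses, nothing in this arithmetic licenses the causal reading that "setting" gender while changing nothing else would move salaries by that amount.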
