SNS Konjunkturråd 2015 — a dismal read

19 Jan, 2015 at 22:41 | Posted in Economics, Politics & Society | 6 Comments

On DN Debatt the other day, SNS Konjunkturråd 2015 wrote that “Large debts are in many ways a sign of a well-functioning financial system” and that we need not worry about Swedish households’ mortgage debt because “Swedish households have full personal liability for their loans.”

Good grief! And this is the kind of twaddle we are expected to read in the year 2015. It makes you clap a hand to your forehead!

One does not feel entirely reassured by SNS Konjunkturråd’s recommendations for creating a better financial system. Most of the recommendations rest on the philosophy that the market — with a little help from better statistics and consumer protection — can improve itself. The Council’s philosophy closely resembles the view economists took in the 1980s of the then emerging and newly deregulated financial market. As is well known, the deregulation of the financial market was a main cause of the 1991-93 crisis in Sweden, the worst since the 1930s.
In an editorial in Ekonomisk Debatt in the autumn of 1985 (no. 4) one can read how economists reasoned about the deregulations of the 1980s. To the question of whether banking is a crisis industry in the making, the editorial answers a flat no: “Banking is not a crisis industry in the making if we view the question in terms of the macroeconomic stability of the banking system.” The editorial also explicitly rejects regulation as protection for consumers, since it conflicts with the structural changes initiated by the market: “Therefore the deregulation that has begun must continue and be carried through with force and speed.” It further states that “Nor will Bankinspektionen’s ability to monitor developments be reduced by deregulation. On the contrary…”. The editorial concludes: “Knowledge is more important than regulation for influencing the financial markets.” The same tune then as now?

Stig Tegle

Household indebtedness stems mainly from the rise in asset values driven by increased lending to households and the housing bubble this has given rise to. In the long run this trend obviously cannot be sustained. Asset prices fundamentally reflect expectations about the future return on investment.

[Figure: Household debt 2014. Source: SCB and own calculations]

With the debt-to-income ratio households have now taken on, we risk a debt-deflation crisis that will hit Swedish households extremely hard.

In 1637 the price of a single tulip bulb in the Netherlands could be as high as two annual salaries. And households were convinced that prices would just keep rising and rising. Like every other bubble, this one too eventually burst, leaving masses of destitute and ruined people in its wake. Similar things played out in, for example, the Mississippi bubble of 1720 and the IT bubble of the early 2000s. How hard can it be to learn from history?

Real prices of tenant-owned apartments have risen by around 900 per cent over the past 30 years. If that does not constitute a threat to “financial stability”, I don’t know what would!

I will argue here that the financial catastrophe that we have just experienced powerfully illustrates a reason why extrapolating from natural experiments will inevitably be hazardous. The misinterpretation of historical data that led rating agencies, investors, and even myself to guess that home prices would decline very little and default rates would be tolerable even in a severe recession should serve as a caution for all applied econometrics.

Ed Leamer

Added 20/1: In a post on Ekonomistas, Roine Vestman notes, apropos the report’s conclusion that the probability of a negative macro-stability scenario is small, that here “there is room for other interpretations.” Yep!

Distributive effects of financial deregulation

19 Jan, 2015 at 19:57 | Posted in Economics | Comments Off on Distributive effects of financial deregulation

One of the greatest sources of wealth for the top 0.1% class of super-rich is Wall Street, and Wall Street is also the source of the financial crisis that inflicted much economic pain on the bottom 90% in recent years. This has understandably led to a widespread feeling in American society that the rules governing Wall Street are stacked in favor of a small elite, at the expense of Main Street.

However, much of the recent academic work in economics ignores the distributive effects of financial regulation and focuses solely on efficiency. By contrast, our recent paper on “The Redistributive Effects of Financial Deregulation” in the Journal of Monetary Economics puts the focus squarely on distributional considerations.

Our analysis is based on the observation that losses in the financial sector can impose massive costs on the real economy … During the 2008 financial crisis, banks took large losses which raised interest rate spreads and lowered access to credit for the real economy, which in turn reduced the earnings of workers. When financial institutions decide how much risk to take on, they do not take into account these losses. Instead, they take on more risk than is good for the rest of the economy …

We argue in the paper that the most insidious consequence of bailouts is not that they lead to an explicit transfer from Main Street to Wall Street but that they could lead to an even larger implicit transfer by encouraging greater risk-taking and thereby exposing the economy to more credit crunches. Moreover, it may be difficult to commit to not providing bailouts once a financial crisis has occurred because the real sector may prefer to provide a bailout rather than suffer a severe credit crunch. By contrast, regulating risk-taking directly does not suffer from this commitment problem …

How can we better protect Main Street from the externalities of Wall Street? The simplest way is to regulate risk-taking by banks, whether by increasing their capital adequacy requirements, separating risky investment activities like proprietary trading from systemically important traditional banking, limiting payouts that endanger the capitalization of the financial sector, or using structural policies including limits on asymmetric compensation schemes to reduce incentives for risk-taking. Unfortunately, the current movement towards rolling back financial regulation may serve the interests of Wall Street well but continues to expose Main Street to the risk of financial meltdowns.

Anton Korinek & Jonathan Kreamer

Causality and correlation — the case of Sweden’s independent schools

19 Jan, 2015 at 13:48 | Posted in Education & School | Comments Off on Causality and correlation — the case of Sweden’s independent schools

When Sweden carried out its independent-school reform in 1992, families were on the whole given greater freedom to choose where to send their children to school. In line with the school-voucher system advocated by Milton Friedman as early as the 1950s, establishing independent schools (friskolor) was made considerably easier.

As a result of this reform, independent schools have – not least in recent years – markedly increased their share of the school market. Today more than 10 per cent of Sweden’s compulsory-school pupils are educated at independent schools, and almost 25 per cent of upper-secondary students receive their education at independent schools.

Geographically, however, the expansion of independent schools has been very uneven. Today slightly more than a third of Sweden’s municipalities have no independent schools at the compulsory level, and two thirds have none at the upper-secondary level. And on average, pupils at independent schools have parents with higher levels of education and income than pupils at municipal schools.

Against this background, among other things, researchers, education providers, politicians and others have become interested in trying to assess the consequences of the independent-school reform.

Such an assessment is of course not entirely easy to make, given how multifaceted and wide-ranging the goals set for Swedish schooling are.

One goal commonly focused on is pupil performance, measured as the attainment of various knowledge levels. When the reform was introduced, one frequently advanced argument was that independent schools would raise pupils’ knowledge levels, both in the independent schools themselves (“the direct effect”) and – via competitive pressure – in the municipal schools (“the indirect effect”). The quantitative measures used in these evaluations have consistently been grades and/or results on national tests.

At first glance such studies may seem trivial to carry out. It might appear to be simply a matter of pulling out the data and running the requisite statistical tests and regressions.

It is not quite that simple. In fact – as Pontus Bäckström shows so instructively – it is very hard to obtain unambiguous causal answers to questions of this kind:

A week or so ago Mats Edman, editor-in-chief of the SKL magazine Dagens Samhälle, wrote a column in which he concluded that independent schools are much better than municipal schools … Among other things, he shows that pupils at independent schools have, on average, merit ratings 18 points higher than those of pupils who attended municipal schools. He also shows that municipal schools are strongly over-represented among the worst-performing schools, and independent schools among the best-performing ones.

The criticism was not long in coming … The analysis underlying Edman’s conclusions is far too simplistic, since he does not control these differences for the schools’ pupil composition …

The primary purpose of this post is to show how large a part of this “independent-school effect” can be explained by the pupil composition of the independent schools. To produce an analysis with pedagogical and fairly easy-to-grasp results, I have therefore run a regression analysis that first measures only the “pure” independent-school effect. This is done by using a dichotomous variable for school provider (i.e. a variable that can only take the value 1 or 0, where 1 = independent).

The uncontrolled mean difference between municipal and independent schools was just over 18 points, and that is the difference shown in model 1 …

In model 2 a number of background variables are then added … Here we are only interested in finding out how the “independent-school effect” changes when we control for the schools’ differing pupil composition.

We find this out by dividing the new effect size for school provider (from model 2) by the original one (from model 1); this tells us how large a share of the original effect has been “controlled away” by the pupil-composition variables. In this case, then, almost 80 per cent of the original effect has been controlled away.

As the B coefficient for school provider in model 2 shows, an unexplained difference of roughly 4 merit points remains, which could very well be a result of independent schools being “better” in the way Edman imagines. At the same time we should be humble about the fact that there is still a whole raft of aspects we have not controlled for even in these analyses, for example which teachers work at which schools.
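To make Bäckström’s two-step procedure concrete, here is a minimal sketch in Python on simulated data. His actual school-level data are not reproduced here; the variable names (merit, independent, parental_edu) are my own illustrative assumptions, and the data-generating process is tuned only to roughly mimic his figures (an 18-point raw gap, about 4 points remaining).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 1500

# Simulated school data: pupil composition differs by provider,
# and composition (not provider) drives most of the merit-score gap.
independent = rng.binomial(1, 0.2, n)                 # 1 = independent school
parental_edu = rng.normal(0.4 * independent, 1.0, n)  # composition differs by provider
merit = 210 + 4 * independent + 35 * parental_edu + rng.normal(0, 20, n)

df = pd.DataFrame({"merit": merit, "independent": independent,
                   "parental_edu": parental_edu})

# Model 1: the 'raw' provider effect (uncontrolled mean difference)
m1 = smf.ols("merit ~ independent", data=df).fit()
# Model 2: the same effect after controlling for pupil composition
m2 = smf.ols("merit ~ independent + parental_edu", data=df).fit()

b1 = m1.params["independent"]
b2 = m2.params["independent"]
share_controlled_away = 1 - b2 / b1

print(f"Model 1 provider effect: {b1:.1f}")
print(f"Model 2 provider effect: {b2:.1f}")
print(f"Share of raw effect 'controlled away': {share_controlled_away:.0%}")
```

The “share controlled away” is simply one minus the ratio of the two provider coefficients, which is the quantity Bäckström reports as close to 80 per cent.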

To show unambiguously that effects exist and that they are the result of precisely the introduction of independent schools – and nothing else – one must identify and then control for the influence of all “confounding background variables” such as parental education, socioeconomic status, ethnicity, place of residence, religion and so on, so that we can be sure that it is not differences in these variables that are, in a fundamental sense, the real underlying causal explanations of any average differences in effect.

Ideally, to really make sure such a causal analysis could be carried out, we would want to run an experiment in which we pick out a group of pupils, let them attend independent schools, and after a certain time evaluate the effects on their knowledge levels. We would then turn back the clock and let the same group of pupils attend municipal schools instead, and after the same period evaluate the effects on their knowledge levels. By isolating and manipulating the variables under investigation in this experimental way, so that we could really pin down the unique effect of independent schools – and nothing else – we would get an exact answer to our question.

Since time’s arrow runs in only one direction, everyone realizes that this experiment can never be carried out in reality.

The next-best alternative would instead be to randomly divide pupils into groups: one whose pupils attend independent schools (“treatment”) and one whose pupils attend municipal schools (“control”). Through randomization, the background variables are assumed to be, on average, identically distributed across the two groups (so that the pupils in the two groups do not, on average, differ in either observable or unobservable respects), thereby making possible a causal analysis in which any average differences between the groups can be attributed to (“explained by”) whether one attended an independent or a municipal school.
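A toy simulation of why randomization helps (my own illustration, with made-up numbers): under self-selection the raw difference in means mixes the assumed school effect with the composition effect, while under random assignment the same comparison recovers the assumed true effect on average.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
true_effect = 2.0                              # assumed 'true' school effect (made up)

# A background variable that affects results and, under self-selection,
# also affects which school type a pupil ends up in.
background = rng.normal(0, 1, n)

# Self-selection: pupils with stronger backgrounds more often choose independent schools.
p_choose = 1 / (1 + np.exp(-(background - 1)))
chose_independent = rng.binomial(1, p_choose)
outcome_obs = 10 * background + true_effect * chose_independent + rng.normal(0, 1, n)
naive_gap = (outcome_obs[chose_independent == 1].mean()
             - outcome_obs[chose_independent == 0].mean())

# Randomization: school type assigned by coin flip, independent of background.
assigned = rng.binomial(1, 0.5, n)
outcome_rct = 10 * background + true_effect * assigned + rng.normal(0, 1, n)
rct_gap = outcome_rct[assigned == 1].mean() - outcome_rct[assigned == 0].mean()

print(f"True effect:               {true_effect:.2f}")
print(f"Self-selection gap:        {naive_gap:.2f}  (confounded)")
print(f"Randomized-assignment gap: {rct_gap:.2f}  (close to the true effect)")
```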

The problem is that one can question whether such randomized controlled trials are evidentially relevant when we export the results from the “experimental situation” to a new target population. With other constellations of background and supporting factors, the average effect in a randomized controlled trial probably does not tell us much, and therefore cannot offer much guidance on whether or not to implement a given policy or programme.

By far the most common research approach is – as in Bäckström’s analysis – to run a traditional multiple regression analysis, based on ordinary least squares (OLS) or maximum likelihood (ML) estimation of observational data, in which one tries to “hold constant” a number of specified background variables so as, if possible, to interpret the regression coefficients in causal terms. Since we know there is a risk of a “selection problem” – pupils who attend independent schools often differ from those who attend municipal schools on several important background variables – we cannot simply compare the knowledge levels of the two school types head-on and draw any firm causal conclusions. The risk is considerable that any differences we find, and believe can be explained by school type, are in fact wholly or partly due to differences in the underlying variables (e.g. neighbourhood, ethnicity, parental education, and so on).

If one tries to sum up the regression analyses that have been carried out, the result – just as in Bäckström’s example – is that the causal effects of independent schools on pupil performance that researchers have thought they could identify are consistently small (and often not even statistically significant at conventional significance levels). In addition, there is uncertainty as to whether all relevant background variables have really been held constant – Bäckström mentions, for example, differences in teacher competence – and the estimates made are therefore in practice often burdened with untested assumptions and a non-negligible uncertainty and bias, which makes it difficult to give a reasonably unambiguous assessment of the weight and relevance of the research results.

Put simply, many – perhaps most – effect studies of this kind have failed to create sufficiently comparable groups, and since comparability is strictly speaking an absolute prerequisite for interpreting the statistical analyses actually carried out in the way they are interpreted, the value of those analyses is hard to establish. This also means – and here one must also weigh in the possibility of better alternative model specifications (especially as regards how the groups in the samples used are constructed) – that the “sensitivity analyses” researchers in the field routinely perform give no firm guidance on how “robust” the regression estimates really are. Moreover, there is a substantial risk that latent, underlying, unspecified variables representing unmeasured characteristics (intelligence, attitude, motivation, etc.) are correlated with the independent variables included in the regression equations, thereby giving rise to a problem of endogeneity.
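The endogeneity point in the last sentence has a standard textbook form. In a sketch where the true model contains an unmeasured characteristic m (say, motivation) that is correlated with school type D, omitting m biases the OLS coefficient on D by a term proportional to that correlation:

```latex
% D = school type (1 = independent), m = unobserved characteristic (e.g. motivation)
\begin{align*}
  \text{True model:}            \quad & y = \beta_0 + \beta_1 D + \gamma m + \varepsilon \\
  \text{Estimated model:}       \quad & y = b_0 + b_1 D + u \\
  \text{Omitted-variable bias:} \quad & \operatorname{plim} \hat{b}_1
      = \beta_1 + \gamma \, \frac{\operatorname{Cov}(D, m)}{\operatorname{Var}(D)}
\end{align*}
```

Unless Cov(D, m) = 0, that is, unless the unmeasured characteristic is unrelated to which school type a pupil attends, the estimated “school effect” mixes the true effect with the composition effect.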

Research has not, in general, been able to establish that the introduction of independent schools and increased school competition has led to any major efficiency gains or to noticeably higher knowledge levels among pupils at large. The measured effects are small and depend to a large extent on how the models used are specified, how the included variables are measured and which of them are “held constant”. Nor can it therefore be established that the effects some studies claim to detect in terms of improved results in independent schools are due to the independent schools as such. Methodologically it has proved difficult to construct robust and reliable quality measures and instruments that allow an adequate handling of all the different factors – observable and unobservable – that affect competition between the school types and give rise to any differences in pupil performance between them. The consequence is that the small effects that have (in some studies) been found rarely carry any high degree of evidential “warrant”. Much of the research rests on model assumptions that are both untested and fundamentally untestable (e.g. concerning linearity, homogeneity, additivity, absence of interaction effects, independence, background-contextual neutrality, etc.). The results are throughout of a tentative character, and the conclusions that researchers, politicians and opinion-makers draw from them should therefore be reflected in a “degree of belief” commensurate with this epistemological status.

We are the 100%

19 Jan, 2015 at 10:44 | Posted in Economics | 1 Comment


Debating “modern” macroeconomics, I often get the feeling that mainstream economists, when facing anomalies, think that there is always some further “technical fix” that will get them out of the quagmire. But are these elaborations and amendments of something basically wrong really going to solve the problem? I doubt it. Acting like the baker’s apprentice who, having forgotten to add yeast to the dough, throws it into the oven afterwards, simply isn’t enough.

When criticizing the basic workhorse DSGE model for its inability to explain involuntary unemployment, DSGE defenders maintain that later elaborations — especially newer search models — manage to do just that. I strongly disagree.

One of the more conspicuous problems with those “solutions” is that they — as e.g. Pissarides’ “Loss of Skill during Unemployment and the Persistence of Unemployment Shocks” QJE (1992) — are as a rule constructed without seriously trying to warrant that the model-immanent assumptions and results are applicable in the real world. External validity is more or less a non-existent problematique, sacrificed on the altar of model derivations. This is not by chance. For how could one even imagine empirically testing assumptions such as Pissarides’ “model 1” assumptions of reality being adequately represented by “two overlapping generations of fixed size”, “wages determined by Nash bargaining”, “actors maximizing expected utility”, “endogenous job openings”, “job matching describable by a probability distribution”, without coming to the conclusion that this is — in terms of realism and relevance — nothing but nonsense on stilts?

Brad DeLong and the true nature of neoclassical economics

18 Jan, 2015 at 15:57 | Posted in Economics | 5 Comments

I think that modern neoclassical economics is in fine shape as long as it is understood as the ideological and substantive legitimating doctrine of the political theory of possessive individualism. As long as we have relatively-self-interested liberal individuals who have relatively-strong beliefs that things are theirs, the competitive market in equilibrium is an absolutely wonderful mechanism for achieving truly extraordinary degree of societal coordination and productivity. We need to understand that. We need to value that. And that is what neoclassical economics does, and does well.

Of course, there are all the caveats to Arrow-Debreu-Mackenzie:

1   The market must be in equilibrium.
2   The market must be competitive.
3   The goods traded must be excludable.
4   The goods traded must be non-rival.
5   The quality of goods traded and of effort delivered must be known, or at least bonded, for adverse selection and moral hazard are poison.
6   Externalities must be corrected by successful Pigovian taxes or successful Coaseian carving of property rights at the joints.
7   People must be able to accurately calculate their own interests.
8   People must not be sadistic–the market does not work well if participating agents are either the envious or the spiteful.
9   The distribution of wealth must correspond to the societal consensus of need and desert.
10 The structure of debt and credit must be sound, or if it is not sound we need a central bank or a social-credit agency to make it sound and so make Say’s Law true in practice even though we have no reason to believe Say’s Law is true in theory.

Brad DeLong

An impressive list of caveats indeed. Not very much value left of “modern neoclassical economics” if you ask me …

Still — almost a century and a half after Léon Walras founded neoclassical general equilibrium theory — “modern neoclassical economics” hasn’t been able to show that markets move economies to equilibria.

We do know that — under very restrictive assumptions — equilibria do exist, are unique and are Pareto-efficient. One however has to ask oneself — what good does that do?

As long as we cannot show, except under exceedingly special assumptions, that there are convincing reasons to suppose there are forces which lead economies to equilibria — the value of general equilibrium theory is negligible. As long as we cannot really demonstrate that there are forces operating — under reasonable, relevant and at least mildly realistic conditions — at moving markets to equilibria, there cannot really be any sustainable reason for anyone to pay any interest or attention to this theory.

A stability that can only be proved by assuming “Santa Claus” conditions is of no avail. Most people do not believe in Santa Claus anymore. And for good reasons. Santa Claus is for kids, and general equilibrium economists ought to grow up.

Continuing to model a world full of agents behaving as economists — “often wrong, but never uncertain” — and still not being able to show that the system under reasonable assumptions converges to equilibrium (or simply assume the problem away) is a gross misallocation of intellectual resources and time.

And then, of course, there is Sonnenschein-Mantel-Debreu!

So what? Why should we care about Sonnenschein-Mantel-Debreu?

Because  Sonnenschein-Mantel-Debreu ultimately explains why “modern neoclassical economics” — New Classical, Real Business Cycles, Dynamic Stochastic General Equilibrium (DSGE) and “New Keynesian” — with its microfounded macromodels are such bad substitutes for real macroeconomic analysis!

These models try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality. And — worse still — something that is not even amenable to the kind of general equilibrium analysis it is thought to provide a foundation for, since Hugo Sonnenschein (1972), Rolf Mantel (1976) and Gerard Debreu (1974) unequivocally showed that there are no conditions by which assumptions on individuals would guarantee either stability or uniqueness of the equilibrium solution.

Opting for cloned representative agents that are all identical is of course not a real solution to the fallacy of composition that the Sonnenschein-Mantel-Debreu theorem points to. Representative agent models are — as I have argued at length here — rather an evasion whereby issues of distribution, coordination, heterogeneity — everything that really defines macroeconomics — are swept under the rug.

Instead of real maturity, we see that general equilibrium theory possesses only pseudo-maturity. For the description of the economic system, mathematical economics has succeeded in constructing a formalized theoretical structure, thus giving an impression of maturity, but one of the main criteria of maturity, namely, verification, has hardly been satisfied. In comparison to the amount of work devoted to the construction of the abstract theory, the amount of effort which has been applied, up to now, in checking the assumptions and statements seems inconsequential.

János Kornai

Milton Friedman on econometric ‘groping in the dark’

16 Jan, 2015 at 19:12 | Posted in Statistics & Econometrics | Comments Off on Milton Friedman on econometric ‘groping in the dark’

Being sorta-kinda Keynesian, yours truly doesn’t often find the occasion to approvingly quote Milton Friedman. But on this issue I have no problem:

Granted that the final result will be capable of being expressed in the form of a system of simultaneous equations applying to the economy as a whole, it does not follow that the best way to get to that final result is by seeking to set such a system down now. As I am sure those who have tried to do so will agree, we now know so little about the dynamic mechanisms at work that there is enormous arbitrariness in any system set down. Limitations of resources – mental, computational, and statistical – enforce a model that, although complicated enough for our capacities, is yet enormously simple relative to the present state of understanding of the world we seek to explain. Until we can develop a simpler picture of the world, by an understanding of interrelations within sections of the economy, the construction of a model for the economy as a whole is bound to be almost a complete groping in the dark. The probability that such a process will yield a meaningful result seems to me almost negligible.

Milton Friedman (1951)

Statistical modeling and reality

15 Jan, 2015 at 11:24 | Posted in Statistics & Econometrics | 4 Comments

My critique is that the currently accepted notion of a statistical model is not scientific; rather, it is a guess at what might constitute (scientific) reality without the vital element of feedback, that is, without checking the hypothesized, postulated, wished-for, natural-looking (but in fact only guessed) model against that reality. To be blunt, as far as is known today, there is no such thing as a concrete i.i.d. (independent, identically distributed) process, not because this is not desirable, nice, or even beautiful, but because Nature does not seem to be like that … As Bertrand Russell put it at the end of his long life devoted to philosophy, “Roughly speaking, what we know is science and what we don’t know is philosophy.” In the scientific context, but perhaps not in the applied area, I fear statistical modeling today belongs to the realm of philosophy.

To make this point seem less erudite, let me rephrase it in cruder terms. What would a scientist expect from statisticians, once he became interested in statistical problems? He would ask them to explain to him, in some clear-cut cases, the origin of randomness frequently observed in the real world, and furthermore, when this explanation depended on the device of a model, he would ask them to continue to confront that model with the part of reality that the model was supposed to explain. Something like this was going on three hundred years ago … But in our times the idea somehow got lost when i.i.d. became the pampered new baby.

Rudolf Kalman

What does ‘autonomy’ mean in econometrics?

15 Jan, 2015 at 11:07 | Posted in Economics | Comments Off on What does ‘autonomy’ mean in econometrics?

The point of the discussion, of course, has to do with where Koopmans thinks we should look for “autonomous behaviour relations”. He appeals to experience but in a somewhat oblique manner. He refers to the Harvard barometer “to show that relationships between economic variables … not traced to underlying behaviour equations are unreliable as instruments for prediction” … His argument would have been more effectively put had he been able to give instances of relationships that have been “traced to underlying behaviour equations” and that have been reliable instruments for prediction. He did not do this, and I know of no conclusive case that he could draw upon. There are of course cases of economic models that he could have mentioned as having been unreliable predictors. But these latter instances demonstrate no more than the failure of Harvard barometer: all were presumably built upon relations that were more or less unstable in time. The meaning conveyed, we may suppose, by the term “fundamental autonomous relation” is a relation stable in time and not drawn as an inference from combinations of other relations. The discovery of such relations suitable for the prediction procedure that Koopmans has in mind has yet to be publicly presented, and the phrase “underlying behaviour equation” is left utterly devoid of content.

Rutledge Vining

If only Robert Lucas had read Vining …

Why is capitalism failing?

14 Jan, 2015 at 18:46 | Posted in Economics | Comments Off on Why is capitalism failing?

 

‘New Keynesian’ haiku economics

13 Jan, 2015 at 15:51 | Posted in Economics | 7 Comments

The neoliberal hegemony of the Reagan-Thatcher era certainly dampened debate among mainstream economists:

It is clear that the Great Depression and the Keynesian Revolution seemed to increase debate within the mainstream, and that, as Joe says, the “decline in debate… appears to have been associated with the emergence of a ‘neoliberal’ hegemony from the 1970s onwards.” That’s essentially correct.

And the decline in debate explains why Lucas could say in the early 1980s that: “at research seminars, people don’t take Keynesian theorizing seriously anymore; the audience starts to whisper and giggle to one another.” And also why if you wanted to publish you basically had to accept the crazy New Classical models. Krugman admitted to that before, as I’ve already noticed. He argued that: “the only way to get non-crazy macroeconomics published was to wrap sensible assumptions about output and employment in something else, something that involved rational expectations and intertemporal stuff and made the paper respectable.” You must remember, you don’t publish, you don’t get tenure. So crazy models became the norm.

Not only were heterodox economists kicked out of mainstream departments and forced to create their own journals in the 1970s, but the pressure within the mainstream to conform and silence dissent was strong indeed. Note that many in the mainstream, like Blanchard and Woodford for example, continue to suggest that there is a lot of consensus between New Keynesians and Real Business Cycles types. In fact, they say there is more agreement now than in the 1970s. How is the consensus methodology in macroeconomics, you ask. From Blanchard’s paper above:

“To caricature, but only slightly: A macroeconomic article today often follows strict, haiku-like, rules: It starts from a general equilibrium structure, in which individuals maximize the expected present value of utility, firms maximize their value, and markets clear. Then, it introduces a twist, be it an imperfection or the closing of a particular set of markets, and works out the general equilibrium implications. It then performs a numerical simulation, based on calibration, showing that the model performs well. It ends with a welfare assessment.”

And yes that is also the basis of New Keynesian models. The haiku basically describes the crazy models in which reasonable results must be disguised if you’re to be taken seriously in academia. When everybody agrees, there is little need for debate. And you get stuck with crazy models. The lack of debate within the mainstream to this day is also, in part, what provides support for austerity policies around the globe, even when it is clear that they have failed.

Matias Vernengo/Naked Keynesianism

“Stuck with crazy models”? Absolutely! Let me just give one example.

A lot of mainstream economists out there still think that price and wage rigidities are the prime movers behind unemployment. What is even worse — I’m totally gobsmacked every time I come across this utterly ridiculous misapprehension — is that some of them even think that these rigidities are the reason John Maynard Keynes gave for the high unemployment of the Great Depression. This is of course pure nonsense. For although Keynes in General Theory devoted substantial attention to the subject of wage and price rigidities, he certainly did not hold this view.

Since unions/workers, contrary to classical assumptions, make wage-bargains in nominal terms, they will – according to Keynes – accept lower real wages caused by higher prices, but resist lower real wages caused by lower nominal wages. However, Keynes held it incorrect to attribute “cyclical” unemployment to this diversified agent behaviour. During the depression money wages fell significantly and – as Keynes noted – unemployment still grew. Thus, even when nominal wages are lowered, they do not generally lower unemployment.

In any specific labour market, lower wages could, of course, raise the demand for labour. But a general reduction in money wages would leave real wages more or less unchanged. The reasoning of the classical economists was, according to Keynes, a flagrant example of the “fallacy of composition.” Assuming that since unions/workers in a specific labour market could negotiate real wage reductions via lowering nominal wages, unions/workers in general could do the same, the classics confused micro with macro.

Lowering nominal wages could not – according to Keynes – clear the labour market. Lowering wages – and possibly prices – could, perhaps, lower interest rates and increase investment. But to Keynes it would be much easier to achieve that effect by increasing the money supply. In any case, wage reductions were not seen by Keynes as a general substitute for an expansionary monetary or fiscal policy.

Even if lowering wages can have some positive effects, there are also negative effects that weigh more heavily – deteriorating management-union relations, expectations of continued wage cuts causing investment to be postponed, debt deflation, et cetera.

So what Keynes actually argued in the General Theory was that the classical proposition that lowering wages would lower unemployment, and ultimately take economies out of depressions, was ill-founded and basically wrong.

To Keynes, flexible wages would only make things worse by leading to erratic price-fluctuations. The basic explanation for unemployment is insufficient aggregate demand, and that is mostly determined outside the labor market.

The classical school [maintains that] while the demand for labour at the existing money-wage may be satisfied before everyone willing to work at this wage is employed, this situation is due to an open or tacit agreement amongst workers not to work for less, and that if labour as a whole would agree to a reduction of money-wages more employment would be forthcoming. If this is the case, such unemployment, though apparently involuntary, is not strictly so, and ought to be included under the above category of ‘voluntary’ unemployment due to the effects of collective bargaining, etc …
The classical theory … is best regarded as a theory of distribution in conditions of full employment. So long as the classical postulates hold good, unemployment, which is in the above sense involuntary, cannot occur. Apparent unemployment must, therefore, be the result either of temporary loss of work of the ‘between jobs’ type or of intermittent demand for highly specialised resources or of the effect of a trade union ‘closed shop’ on the employment of free labour. Thus writers in the classical tradition, overlooking the special assumption underlying their theory, have been driven inevitably to the conclusion, perfectly logical on their assumption, that apparent unemployment (apart from the admitted exceptions) must be due at bottom to a refusal by the unemployed factors to accept a reward which corresponds to their marginal productivity …

Obviously, however, if the classical theory is only applicable to the case of full employment, it is fallacious to apply it to the problems of involuntary unemployment – if there be such a thing (and who will deny it?). The classical theorists resemble Euclidean geometers in a non-Euclidean world who, discovering that in experience straight lines apparently parallel often meet, rebuke the lines for not keeping straight – as the only remedy for the unfortunate collisions which are occurring. Yet, in truth, there is no remedy except to throw over the axiom of parallels and to work out a non-Euclidean geometry. Something similar is required to-day in economics. We need to throw over the second postulate of the classical doctrine and to work out the behaviour of a system in which involuntary unemployment in the strict sense is possible.

J M Keynes General Theory

People calling themselves ‘New Keynesians’ ought to be rather embarrassed by the fact that the kind of microfounded dynamic stochastic general equilibrium models they use, cannot incorporate such a basic fact of reality as involuntary unemployment!

Of course, working with microfounded representative-agent models, this should come as no surprise. If one representative agent is employed, all representative agents are. The kind of unemployment that occurs is voluntary, since it is only adjustments of the hours of work that these optimizing agents make to maximize their utility. Maybe that’s also the reason the prominent ‘New Keynesian’ macroeconomist Simon Wren-Lewis can write

I think the labour market is not central, which was what I was trying to say in my post. It matters in a [New Keynesian] model only in so far as it adds to any change to inflation, which matters only in so far as it influences central bank’s decisions on interest rates.

In the basic DSGE models used by most ‘New Keynesians’, the labour market always clears – responding to a changing interest rate, expected lifetime incomes, or real wages, the representative agent maximizes the utility function by varying her labour supply, money holdings and consumption over time. Most importantly – if the real wage somehow deviates from its “equilibrium value,” the representative agent adjusts her labour supply, so that when the real wage is higher than its “equilibrium value,” labour supply is increased, and when the real wage is below its “equilibrium value,” labour supply is decreased.
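A stylized illustration of this mechanism (my own toy example, not taken from any particular DSGE paper): with quasi-linear preferences over consumption and hours, the first-order condition makes hours worked an increasing function of the real wage, so any fall in employment is simply the agent optimally choosing fewer hours at a lower real wage.

```latex
% Representative agent chooses hours n, consuming the real wage bill c = w n:
\begin{align*}
  \max_{n}\;\; & u(c, n) = c - \frac{n^{1+\eta}}{1+\eta}, \qquad c = w\,n, \quad \eta > 0 \\
  \text{FOC:}\quad & w = n^{\eta} \;\;\Longrightarrow\;\; n^{*} = w^{1/\eta}
\end{align*}
```

Hours fall when the real wage falls, by the agent’s own optimal choice; nothing in this setup can generate a worker who wants a job at the going wage but cannot get one.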

In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence, unemployment is totally voluntary. To be unemployed is something one optimally chooses to be.

The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging is de facto made, macroeconomic model-building is little more than “hand waving” that gives us little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

To Keynes this was self-evident. But obviously not so to haiku-rule-following ‘New Keynesians’.

Lördagsmorgon i P2 — balm for the soul

13 Jan, 2015 at 11:30 | Posted in Varia | Comments Off on Lördagsmorgon i P2 — balm for the soul

In these times — when the airwaves are drowned in the verbal effluent of commercial radio and utterly vacuous Melodifestivalen drivel — one has almost given up.

But there is light in the darkness! Every Saturday morning, Swedish Radio’s P2 broadcasts Lördagsmorgon i P2, a programme of refreshment and serious music.

So seize the chance to start the day with a musical ear-rinse and clear the ear canals of any leftover musical slag. Here you can listen to music by Vassilis Tsabropoulos, John Tavener, Gustav Mahler, Arvo Pärt and – not least – Stefan Nilsson. Three hours of such music calms the mind and lets hope return. Thank you, public-service radio!

And thank you, Erik Schüldt. Three hours of wonderful music, and a host who actually has something to say instead of just letting his jaw flap — what balm for the soul!

University of Greenwich shows the way!

13 Jan, 2015 at 11:20 | Posted in Economics | Comments Off on University of Greenwich shows the way!

The last seven years have not been easy for the global economy, nor for the teaching of economics. The recent financial crisis and the Great Recession have led many economists, non-economists and students in economics to question the state of the discipline, wondering to what extent it provides the necessary tools to interpret the complex world we live in, signalling a deep dissatisfaction with economists’ ability to provide solutions to real world problems. Employers have recognised that the economics graduates that the standard curriculum generates are not equipped with the skills that the real world requires. Likewise, students themselves have recognised that the tools and theories they learn don’t enable them to make sense of the world they live in, let alone to address and solve real world problems …

The reason the revalidation of the economics programmes at the University of Greenwich is special is that it constitutes one of the first institutional responses to current pressures from students, faculty, employers and policy makers to produce more ‘world-ready’ graduates. In redesigning our economics programmes we – the economics programmes team – have decided to:

– Address socially relevant economic questions in all core economic courses by adopting a historical and pluralistic perspective right from the start and throughout the programme.

– Add two new compulsory courses – Economic History in the first year and History of Economic Thought in the second year, and an optional course Political Economy of International Development and Finance in the third year.

– Integrate the concept of environmental and social sustainability – in the teaching of economics in all courses, as well as provide specific courses such as Environmental Economics and Environmental Regulation and Business Ethics and Corporate Social Responsibility.

– Eliminate from the curriculum those topics that tend to be taught by default just because they appear on standard economics textbooks rather than because they are recognised as truly useful in understanding how economies really work.

However, we do not isolate the development of a pluralistic perspective to only a few courses, but rather integrate it in all our courses by approaching real world problems from the perspective of different theories, both old and contemporary, comparing, contrasting, or at times synthesising them. This should help the students to develop a critical perspective towards current economic theories and evolving economic events, and develop an understanding about the limitations of theories and models (for example, what happens out of equilibrium), and think more widely about the historical, institutional and political context of economic behaviour and policies …

Sara Gorgoni

‘New Keynesianism’ — neat, plausible and wrong

11 Jan, 2015 at 17:10 | Posted in Economics | 8 Comments

Maintaining that economics is a science in the “true knowledge” business, yours truly remains a skeptic of the “New Keynesian” pretences and aspirations of people like Paul Krugman, Simon Wren-Lewis and Greg Mankiw.

Keynes basically argued that it was inadmissible to project history onto the future. Consequently an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different “variables” was not enough. If they could not get at the causal structure that generated the data, they were not really “identified.” Dynamic stochastic general equilibrium (DSGE) macroeconomists — including “New Keynesians” — have drawn the conclusion that the solution to the problem of unstable relations is to construct models with clear microfoundations, in which forward-looking optimizing individuals and robust, deep, behavioural parameters are taken to remain stable even when economic policies change. As yours truly has argued in a couple of posts (e. g. here and here), this, however, is a dead end.

Where “New Keynesian” economists think that they can rigorously deduce the aggregate effects of (representative) actors with their reductionist microfoundational methodology, they have to turn a blind eye to the emergent properties that characterize all open social systems – including the economic system. The interaction between animal spirits, trust, confidence, institutions etc., cannot be deduced or reduced to a question answerable on the individual level. Macroeconomic structures and phenomena have to be analyzed also on their own terms.

In microeconomics we know that aggregation really presupposes homothetic and identical preferences, something that almost never exists in real economies. The results built on these assumptions are therefore not robust and do not capture the underlying mechanisms at work in any real economy. And models that are critically based on particular and odd assumptions – and are neither robust nor congruent with real-world economies – are of questionable value.

Even if economies naturally presuppose individuals, it does not follow that we can infer or explain macroeconomic phenomena solely from knowledge of these individuals. Macroeconomics is to a large extent emergent and cannot be reduced to a simple summation of micro-phenomena. Moreover, even these microfoundations aren’t immutable. The “deep parameters” of “New Keynesian” DSGE models – “tastes” and “technology” – are not really the bedrock of constancy that they believe (pretend) them to be.

So I cannot concur with Paul Krugman, Simon Wren-Lewis, Greg Mankiw and other sorta-kinda “New Keynesians” when they more or less try to reduce Keynesian economics to “intertemporal maximization modified with sticky prices and a few other deviations.” As John Quiggin so aptly writes:

If there is one thing that distinguished Keynes’ economic analysis from that of his predecessors, it was his rejection of the idea of a unique full employment equilibrium to which a market economy will automatically return when it experiences a shock. Keynes argued that an economy could shift from a full-employment equilibrium to a persistent slump as the result of the interaction between objective macroeconomic variables and the subjective ‘animal spirits’ of investors and other decision-makers. It is this perspective that has been lost in the absorption of New Keynesian macro into the DSGE framework.

Krugman & Wren-Lewis flim-flamming on heterodox assaults on mainstream economics

10 Jan, 2015 at 18:10 | Posted in Economics | 3 Comments

Simon Wren-Lewis is not satisfied with heterodox economists’ attacks on the mainstream. He’s even annoyed:

The implication is that modern intertemporal New Keynesian theory is somehow behind the view that austerity will not harm a recovery.

This is absolute and dangerous nonsense. Having spent the last decade or two looking at fiscal policy in intertemporal New Keynesian models, I know that exactly the opposite is true. In these models temporary decreases in government spending have significant negative effects on output for given real interest rates … Anyhow anyone who says that mainstream New Keynesian theory supports austerity does not know what they are talking about.

And Paul Krugman seems to share his annoyance:

The point is that standard macroeconomics does NOT justify the attacks on fiscal stimulus and the embrace of austerity. On these issues, people like Simon and myself have been following well-established models and analyses, while the austerians have been making up new stuff and/or rediscovering old fallacies to justify the policies they want. Formal modeling and quantitative analysis doesn’t justify the austerian position; on the contrary, austerians had to throw out the models and abandon statistical principles to justify their claims.

But even if Simon and Paul do not generally defend (“expansionary” or otherwise) austerity measures, there certainly are other mainstream “New Keynesian” economists who do. Greg Mankiw, e.g., has more than once defended austerity policies (here). There has to be some reason for this. If three self-proclaimed sorta-kinda “New Keynesians” come up with different views on such a central macroeconomic issue, one may legitimately ask what kind of theories and models this brand of “Keynesianism” stands for.

Reading Wren-Lewis and Krugman ultimately reaffirms the impression of a macroeconomic framework that doesn’t succeed in giving a convincing analysis of what a modern capitalist economy is. Keynes’s macroeconomics was a theory for all seasons. It is not enough to put on some “Keynesian” glasses at the zero lower bound and then take them off and put on New Classical glasses once we’re out of that predicament.

Back in 1994 Laurence Ball and Greg Mankiw argued that

although traditionalists are often called ‘New Keynesians,’ this label is a misnomer. They could just as easily be called ‘New Monetarists.’

That is still true today. “New Keynesianism” is a gross misnomer. The macroeconomics of people like Greg Mankiw, Paul Krugman and Simon Wren-Lewis has theoretically and methodologically a lot to do with Milton Friedman, Robert Lucas and Thomas Sargent — and very little to do with the founder of macroeconomics, John Maynard Keynes.

Read my lips — validity is NOT enough!

10 Jan, 2015 at 13:49 | Posted in Economics | 1 Comment

Neoclassical economic theory today is in the story-telling business whereby economic theorists create make-believe analogue models of the target system – usually conceived as the real economic system. This modeling activity is considered useful and essential. Since fully-fledged experiments on a societal scale as a rule are prohibitively expensive, ethically indefensible or unmanageable, economic theorists have to substitute experimenting with something else. To understand and explain relations between different entities in the real economy the predominant strategy is to build models and make things happen in these “analogue-economy models” rather than engineering things happening in real economies.

Formalistic deductive “Glasperlenspiel” can be very impressive and seductive. But in the realm of science it ought to be considered of little or no value to simply make claims about the model and lose sight of reality. As Julian Reiss writes:

There is a difference between having evidence for some hypothesis and having evidence for the hypothesis relevant for a given purpose. The difference is important because scientific methods tend to be good at addressing hypotheses of a certain kind and not others: scientific methods come with particular applications built into them … The advantage of mathematical modelling is that its method of deriving a result is that of mathematical proof: the conclusion is guaranteed to hold given the assumptions. However, the evidence generated in this way is valid only in abstract model worlds while we would like to evaluate hypotheses about what happens in economies in the real world … The upshot is that valid evidence does not seem to be enough. What we also need is to evaluate the relevance of the evidence in the context of a given purpose.

Neoclassical economics has long since given up on the real world and contents itself with proving things about thought-up worlds. Empirical evidence plays only a minor role in economic theory, where models largely function as a substitute for empirical evidence. Hopefully humbled by the manifest failure of its theoretical pretences, the one-sided, almost religious, insistence on axiomatic-deductivist modeling as the only scientific activity worthy of pursuing in economics will give way to methodological pluralism based on ontological considerations rather than formalistic tractability. To have valid evidence is not enough. What economics needs is sound evidence.

Extraordinarily absurd things called ‘Keynesian’

8 Jan, 2015 at 20:25 | Posted in Economics | 2 Comments

Today, it seems, just about anyone can get away with calling themselves a Keynesian, and they do, no matter what salmagundi of doctrinal positions they may hold dear, without fear of ridicule or reproach. Consequently, some of the most extraordinarily absurd things are now being attributed to Keynes and called “Keynesian theories”. For instance, J. Bradford DeLong, a popular blogger and faculty member at Berkeley, has in a (2009) paper divided up the history of macroeconomics into what he identifies as a “Peel–Keynes–Friedman axis” and a “Marx–Hoover–Hayek” axis: clearly he has learned a trick or two from the neoliberals, who sow mass confusion by mixing together oil and water in their salad dressing versions of history. The self-appointed “New Keynesians” of the 1990s (including Gregory Mankiw, David Romer and Michael Woodford) took the name of Keynes in vain by unashamedly asserting a proposition that Keynes himself had repeatedly and expressly rejected, namely that market-clearing models cannot explain short-run economic fluctuations, and so proceeded to advocate models with “sticky” wages and prices (Mankiw, 2006). George Akerlof and Robert Shiller (2009) have taken three sentences from the General Theory out of context and spun it into some banal misrepresentation concerning what Keynes actually wrote about the notion of “animal spirits,” not to mention his actual conception of macroeconomics. And we observe contemporary journalists going gaga over Keynes, with almost no underlying substantive justification from the track record of the economics profession …

It is undeniably a Sisyphusian task to lean against this blustering tide of misrepresentation in the current Humpty Dumpty climate, with its gales of misinformation and gusts whipping about the turncoats, where economists harbor such easy contempt for history that words can be purported to mean anything that is convenient or politic for the selfish purposes of the writer.

Philip Mirowski

Phil has always been one of my favourite critics of neoclassical economics. I first met him twenty years ago, when he was invited to give a speech on themes from his book More Heat than Light at my economics department in Lund, Sweden. All the neoclassical professors were there. Their theories were totally mangled and no one — absolutely no one — had anything to say even remotely resembling a defense. Nonplussed, one of them finally asked in utter desperation: “But what shall we do then?”

Yes indeed, what shall they do? Moments like that you never forget. It has stayed with me for all these years. The emperor turned out to be naked. Thanks Phil!

Let’s empty the econometric garbage can!

8 Jan, 2015 at 17:06 | Posted in Statistics & Econometrics | 3 Comments

This is where statistical analysis enters. Validation comes in many different forms, of course, and much good theory testing is qualitative in character. Yet when applicable, statistical theory is our most powerful inductive tool, and in the end, successful theories have to survive quantitative evaluation if they are to be taken seriously. Moreover, statistical analysis is not confined to theory evaluation. Quantitative analysis also discovers empirical generalizations that theory must account for. Scientific invention emerges from data and experiment as often as data and experiment are used to confirm prior theory …

How is all this empirical creativity and validation to be achieved? Most empirical researchers … believe that they know the answer. First, they say, decide which explanations of a given phenomenon are to be tested. One or more such hypotheses are set out. Then “control variables” are chosen— factors which also affect the phenomenon under study, but not in a way relevant to the hypotheses under discussion. Then measures of all these explanatory factors are entered into a regression equation (linearly), and each variable is assigned a coefficient with a standard error. Hypotheses whose factors acquire a substantively and statistically significant coefficient are taken to be influential, and those that do not are treated as rejected. Extraneous influences are assumed to be removed by the “controls” …

In the great majority of applied work with all these methods, a particular statistical distribution is specified for the dependent variable, conditional on the independent variables. The explanatory factors are postulated to exert their influence through one or more parameters, usually just the mean of the statistical distribution for the dependent variable. The function that connects the independent variables to the mean is known as the “link function” …

In practice, researchers nearly always postulate a linear specification as the argument of the link function … Computer packages often make this easy: One just enters the variables into the specification, and linearity is automatically applied. In effect, we treat the independent variable list as a garbage can: Any variable with some claim to relevance can be tossed in. Then we carry out least squares or maximum likelihood estimation (MLE) or Bayesian estimation or generalized method of moments, perhaps with the latest robust standard errors. It all sounds very impressive. It is certainly easy: We just drop variables into our mindless linear functions, start up our computing routines, and let ’er rip …

Linear link functions are not self-justifying. Garbage-can lists of variables entered linearly into regression, probit, logit, and other statistical models have no explanatory power without further argument. In the absence of careful supporting argument, the results belong in the statistical rubbish bin …

In sum, we need to abandon mechanical rules and procedures. “Throw in every possible variable” won’t work; neither will “rigidly adhere to just three explanatory variables and don’t worry about anything else.” Instead, the research habits of the profession need greater emphasis on classic skills that generated so much of what we know in quantitative social science: plots, crosstabs, and just plain looking at data. Those methods are simple, but sophisticatedly simple. They often expose failures in the assumptions of the elaborate statistical tools we are using, and thus save us from inferential errors.

Christopher H. Achen

This paper is one of my absolute favourites. Why? I guess it’s because Achen reaffirms my firm conviction that since there is no absolutely certain knowledge at hand in social sciences — including economics — explicit argumentation and justification ought to play an extremely strong role if the purported knowledge claims are to be sustainably warranted. Or as Achen puts it, without careful supporting arguments, “just dropping variables into SPSS, STATA, S or R programs accomplishes nothing.”
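To see Achen’s point in practice, here is a minimal sketch in Python (not from Achen’s paper; the data-generating process, the variable names and the eight ‘controls’ are all invented for illustration). A linearly specified regression with a garbage-can list of regressors delivers tidy coefficients and standard errors even though the true relation is nonlinear, while ‘just plain looking at data’ with a scatter plot exposes the misspecification at once.

```python
# A hypothetical illustration of the "garbage can" critique, assuming nothing
# beyond numpy, statsmodels and matplotlib. All data are simulated.
import numpy as np
import statsmodels.api as sm
import matplotlib.pyplot as plt

rng = np.random.default_rng(1985)
n = 500

x = rng.uniform(-3, 3, n)                      # the variable we actually care about
controls = rng.normal(size=(n, 8))             # eight "controls" tossed into the can
y = np.sin(2 * x) + 0.3 * rng.normal(size=n)   # true relation is nonlinear in x

# The garbage can: every variable entered linearly, then "let 'er rip".
X = sm.add_constant(np.column_stack([x, controls]))
garbage_can = sm.OLS(y, X).fit()
print(garbage_can.summary())                   # tidy t-statistics, misspecified model

# The "sophisticatedly simple" alternative: just plain looking at the data.
plt.scatter(x, y, s=8, alpha=0.5)
plt.xlabel("x")
plt.ylabel("y")
plt.title("A scatter plot shows what the linear garbage can hides")
plt.show()
```

The point is not that regression is useless, but that the plot does the diagnostic work the mechanical procedure quietly skips.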

Stiglitz on the breakdown of marginal productivity theory

7 Jan, 2015 at 17:51 | Posted in Economics | Comments Off on Stiglitz on the breakdown of marginal productivity theory

Lynn Parramore: Many neoclassical economists have argued that when people contribute to the economy, they get rewarded proportionally. Is this model breaking down?

Joseph Stiglitz: Yes. I think that the thrust of my book, The Price of Inequality, and a lot of other work has been to question the margin of productivity theory, which is a theory that has been prevalent for 200 years. A lot of people have questioned it, but my work is a renewal of questioning. And I think that some of the very interesting work that Piketty and his associates have done is providing some empirical basis for doing it. Not only the example that I just gave that if you look at the people at the top, monopolists actually constrain output.

It’s also true that people who make the most productive contributions, the ones who make lasers or transistors, or the inventor of the computer, DNA researchers — none of these are the top wealthiest people in the country. So if you look at the people who contributed the most, and the people who are there at the top, they’re not the same. That’s the second piece.

A very interesting study that Piketty and his associates did was on the effect of an increase in taxes on the top 1 percent. If you had the hypothesis that these were people who were working hard and contributing more, you might say, OK, that’s going to significantly slow down the economy. But if you say it’s rent-seeking, then you’re just capturing for the government some of the rents.

SALON
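For readers who want the proposition spelled out: in its textbook form, marginal productivity theory says that under perfect competition each factor of production is paid its marginal product. A standard statement (my gloss, not part of the interview) is:

```latex
% Textbook marginal productivity theory (a standard gloss, not from the interview):
% with an aggregate production function F(K, L) and competitive factor markets,
% profit maximization implies that factor prices equal marginal products.
\[
  w \;=\; \frac{\partial F(K,L)}{\partial L},
  \qquad
  r \;=\; \frac{\partial F(K,L)}{\partial K}
\]
```

Stiglitz’s point in the interview is that rents and monopoly power drive a wedge between observed top incomes and anything resembling these marginal products.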

The credit creation theory of banking — the only theory consistent with empirical evidence

7 Jan, 2015 at 14:08 | Posted in Economics | 10 Comments

In the process of making loaned money available in the borrower’s bank account, it was found that the bank did not transfer the money away from other internal or external accounts, resulting in a rejection of both the fractional reserve theory and the financial intermediation theory. Instead, it was found that the bank newly ‘invented’ the funds by crediting the borrower’s account with a deposit, although no such deposit had taken place. This is in line with the claims of the credit creation theory.

Thus it can now be said with confidence for the first time – possibly in the 5000 years’ history of banking – that it has been empirically demonstrated that each individual bank creates credit and money out of nothing, when it extends what is called a ‘bank loan’. The bank does not loan any existing money, but instead creates new money. The money supply is created as ‘fairy dust’ produced by the banks out of thin air. The implications are far-reaching.

Henceforth, economists need not rely on assertions concerning banks. We now know, based on empirical evidence, why banks are different, indeed unique — solving the longstanding puzzle posed by Fama (1985) and others — and different from both non-bank financial institutions and corporations: it is because they can individually create money out of nothing.

The empirical evidence shows that of the three theories of banking, it is the one that today has the least influence and that is being belittled in the literature that is supported by the empirical evidence. Furthermore, it is the theory which was widely held at the end of the 19th century and in the first three decades of the twentieth. It is sobering to realise that since the 1930s, economists have moved further and further away from the truth, instead of coming closer to it. This happened first via the half-truth of the fractional reserve theory and then reached the completely false and misleading financial intermediation theory that today is so dominant. Thus this paper has found evidence that there has been no progress in scientific knowledge in economics, finance and banking in the 20th century concerning one of the most important and fundamental facts for these disciplines. Instead, there has been a regressive development. The known facts were unlearned and have become unknown. This phenomenon deserves further research. For now it can be mentioned that this process of unlearning the facts of banking could not possibly have taken place without the leading economists of the day having played a significant role in it.

Richard A. Werner
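To make the balance-sheet mechanics in the quote concrete, here is a toy sketch in Python (entirely made-up figures, and of course not Werner’s actual empirical test). Under the credit creation view, ‘extending a loan’ simply adds a loan asset and a matching deposit liability to the bank’s own balance sheet; nothing is transferred away from reserves or from other accounts. Under the financial intermediation view the same operation would instead have to draw down existing funds by the same amount.

```python
# A toy, hypothetical bank balance sheet; the class name and all figures are invented.
class ToyBank:
    def __init__(self, reserves=100.0, deposits=100.0):
        # assets
        self.reserves = reserves
        self.loans = 0.0
        # liabilities
        self.deposits = deposits

    def extend_loan(self, amount):
        """Credit creation view: the loan (asset) and the borrower's new
        deposit (liability) are created together, not moved from elsewhere."""
        self.loans += amount       # new asset: a claim on the borrower
        self.deposits += amount    # new liability: the borrower's deposit
        # note: reserves are untouched; nothing is 'transferred away'

    def balance_sheet(self):
        return self.reserves + self.loans, self.deposits


bank = ToyBank()
print("before:", bank.balance_sheet())       # (100.0, 100.0)
bank.extend_loan(50.0)
print("after: ", bank.balance_sheet())       # (150.0, 150.0): both sides grew
print("reserves unchanged:", bank.reserves)  # 100.0
```

The sketch is only an accounting identity, but it is exactly the identity that the fractional reserve and intermediation stories, in Werner’s account, get wrong.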

Added: Indeed, there certainly has been a “regressive development.” Things that were known facts back in 1948 have somehow been unlearned and become unknown …

[h/t lasse]

The Mankiw-Piketty showdown at The ASSA Annual Meeting January 2015

6 Jan, 2015 at 14:40 | Posted in Economics | 3 Comments

Link here: http://t.co/6q9FlLJH2X

Photo credit: Kyle Depew
