Neoliberalism — a self-serving con

31 Aug, 2014 at 22:46 | Posted in Politics & Society | 8 Comments

If neoliberalism were anything other than a self-serving con, whose gurus and think tanks were financed from the beginning by some of the richest people on earth … its apostles would have demanded, as a precondition for a society based on merit, that no one should start life with the unfair advantage of inherited wealth or economically-determined education. But they never believed in their own doctrine. Enterprise, as a result, quickly gave way to rent.


All this is ignored, and success or failure in the market economy is ascribed solely to the efforts of the individual. The rich are the new righteous, the poor are the new deviants, who have failed both economically and morally, and are now classified as social parasites.

The market was meant to emancipate us, offering autonomy and freedom. Instead it has delivered atomisation and loneliness. The workplace has been overwhelmed by a mad, Kafka-esque infrastructure of assessments, monitoring, measuring, surveillance and audits, centrally directed and rigidly planned, whose purpose is to reward the winners and punish the losers. It destroys autonomy, enterprise, innovation and loyalty and breeds frustration, envy and fear.

George Monbiot

Sampling error (student stuff)

31 Aug, 2014 at 16:06 | Posted in Statistics & Econometrics | Comments Off on Sampling error (student stuff)
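The embedded video in this post no longer renders. As a stand-in, here is a minimal Python sketch of the idea (the population, sample size and number of draws are arbitrary illustrative choices, not from any real dataset): drawing repeated samples of the same size from one population and watching how the sample means scatter around the population mean is exactly what "sampling error" refers to.

```python
import random
import statistics

def sampling_error_demo(population, n, draws, seed=1):
    """Draw `draws` samples of size `n` (without replacement) and
    return the standard deviation of the sample means -- an estimate
    of the standard error of the mean."""
    rng = random.Random(seed)
    means = [statistics.mean(rng.sample(population, n)) for _ in range(draws)]
    return statistics.stdev(means)
```

For the integers 0-999 the population standard deviation is about 288.7, so with n = 25 the simulated standard error should land near 288.7/√25 ≈ 58: the sample mean is an unbiased but noisy estimate, and the noise shrinks only with the square root of the sample size.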


Original sin in economics

30 Aug, 2014 at 13:32 | Posted in Theory of Science & Methodology | 1 Comment

Ever since the Enlightenment various economists had been seeking to mathematise the study of the economy. In this, at least prior to the early years of the twentieth century, economists keen to mathematise their discipline felt constrained in numerous ways, and not least by pressures by (non-social) natural scientists and influential peers to conform to the ‘standards’ and procedures of (non-social) natural science, and thereby abandon any idea of constructing an autonomous tradition of mathematical economics. Especially influential, in due course, was the classical reductionist programme, the idea that all mathematical disciplines should be reduced to or based on the model of physics, in particular on the strictly deterministic approach of mechanics, with its emphasis on methods of infinitesimal calculus …

However, in the early part of the twentieth century changes occurred in the interpretation of the very nature of mathematics, changes that caused the classical reductionist programme itself to fall into disarray. With the development of relativity theory and especially quantum theory, the image of nature as continuous came to be re-examined in particular, and the role of infinitesimal calculus, which had previously been regarded as having almost ubiquitous relevance within physics, came to be re-examined even within that domain.

The outcome, in effect, was a switch away from the long-standing emphasis on mathematics as an attempt to apply the physics model, and specifically the mechanics metaphor, to an emphasis on mathematics for its own sake.

Mathematics, especially through the work of David Hilbert, became increasingly viewed as a discipline properly concerned with providing a pool of frameworks for possible realities. No longer was mathematics seen as the language of (non-social) nature, abstracted from the study of the latter. Rather, it was conceived as a practice concerned with formulating systems comprising sets of axioms and their deductive consequences, with these systems in effect taking on a life of their own. The task of finding applications was henceforth regarded as being of secondary importance at best, and not of immediate concern.

This emergence of the axiomatic method removed at a stroke various hitherto insurmountable constraints facing those who would mathematise the discipline of economics. Researchers involved with mathematical projects in economics could, for the time being at least, postpone the day of interpreting their preferred axioms and assumptions. There was no longer any need to seek the blessing of mathematicians and physicists or of other economists who might insist that the relevance of metaphors and analogies be established at the outset. In particular it was no longer regarded as necessary, or even relevant, to economic model construction to consider the nature of social reality, at least for the time being. Nor, it seemed, was it possible for anyone to insist with any legitimacy that the formulations of economists conform to any specific model already found to be successful elsewhere (such as the mechanics model in physics). Indeed, the very idea of fixed metaphors or even interpretations, came to be rejected by some economic ‘modellers’ (albeit never in any really plausible manner).

The result was that in due course deductivism in economics, through morphing into mathematical deductivism on the back of developments within the discipline of mathematics, came to acquire a new lease of life, with practitioners (once more) potentially oblivious to any inconsistency between the ontological presuppositions of adopting a mathematical modelling emphasis and the nature of social reality. The consequent rise of mathematical deductivism has culminated in the situation we find today.

Tony Lawson

The Arrow-Debreu obsession

29 Aug, 2014 at 17:14 | Posted in Economics | 6 Comments

I’ve never yet been able to understand why the economics profession was/is so impressed by the Arrow-Debreu results. They establish that in an extremely abstract model of an economy, there exists a unique equilibrium with certain properties. The assumptions required to obtain the result make this economy utterly unlike anything in the real world. In effect, it tells us nothing at all. So why pay any attention to it? The attention, I suspect, must come from some prior fascination with the idea of competitive equilibrium, and a desire to see the world through that lens, a desire that is more powerful than the desire to understand the real world itself. This fascination really does hold a kind of deranging power over economic theorists, so powerful that they lose the ability to think in even minimally logical terms; they fail to distinguish necessary from sufficient conditions, and manage to overlook the issue of the stability of equilibria.

Mark Buchanan

Almost a century and a half after Léon Walras founded neoclassical general equilibrium theory, economists still have not been able to show that markets move economies to equilibria.

We do know that — under very restrictive assumptions — equilibria do exist, are unique and are Pareto-efficient. But after reading Buchanan’s article one has to ask oneself: what good does that do?

As long as we cannot show, except under exceedingly special assumptions, that there are convincing reasons to suppose that forces exist which lead economies to equilibria, the value of general equilibrium theory is negligible. And as long as we cannot demonstrate that such forces operate under reasonable, relevant and at least mildly realistic conditions, there is no sustainable reason for anyone to pay attention to the theory.

A stability that can only be proved by assuming “Santa Claus” conditions is of no avail. Most people do not believe in Santa Claus anymore. And for good reasons. Santa Claus is for kids, and general equilibrium economists ought to grow up.

Continuing to model a world full of agents behaving as economists — “often wrong, but never uncertain” — and still not being able to show that the system under reasonable assumptions converges to equilibrium (or simply assume the problem away) is a gross misallocation of intellectual resources and time.
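The convergence worry can be made concrete with a deliberately toy sketch: a one-commodity market with an assumed linear excess-demand function, nothing taken from the general equilibrium literature itself. Even when a unique equilibrium exists, a naive Walrasian price-adjustment ("tâtonnement") process may converge to it or explode away from it, depending on nothing deeper than the adjustment speed.

```python
def tatonnement(p0, alpha, steps, excess_demand):
    """Naive price adjustment: p_{t+1} = p_t + alpha * z(p_t),
    where z is excess demand. Returns the full price path."""
    path = [p0]
    for _ in range(steps):
        path.append(path[-1] + alpha * excess_demand(path[-1]))
    return path

# Assumed toy market: excess demand z(p) = 1 - p, unique equilibrium at p* = 1.
z = lambda p: 1.0 - p

slow = tatonnement(p0=0.5, alpha=0.5, steps=30, excess_demand=z)  # converges to p* = 1
fast = tatonnement(p0=0.5, alpha=2.5, steps=30, excess_demand=z)  # oscillates and diverges
```

Existence of the equilibrium says nothing about whether the adjustment process finds it; here the very same market converges or explodes purely as a function of the adjustment parameter.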

And then, of course, there is Sonnenschein-Mantel-Debreu!

So what? Why should we care about Sonnenschein-Mantel-Debreu?

Because Sonnenschein-Mantel-Debreu ultimately explains why New Classical, Real Business Cycle, Dynamic Stochastic General Equilibrium (DSGE) and “New Keynesian” microfounded macromodels are such bad substitutes for real macroeconomic analysis!

These models try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality. And — worse still — something that is not even amenable to the kind of general equilibrium analysis it is thought to give a foundation for, since Hugo Sonnenschein (1972), Rolf Mantel (1976) and Gerard Debreu (1974) unequivocally showed that there are no assumptions on individuals that would guarantee either stability or uniqueness of the equilibrium solution.

Opting for cloned representative agents that are all identical is of course not a real solution to the fallacy of composition that the Sonnenschein-Mantel-Debreu theorem points to. Representative agent models are — as I have argued at length here — rather an evasion whereby issues of distribution, coordination, heterogeneity — everything that really defines macroeconomics — are swept under the rug.

Instead of real maturity, we see that general equilibrium theory possesses only pseudo-maturity. For the description of the economic system, mathematical economics has succeeded in constructing a formalized theoretical structure, thus giving an impression of maturity, but one of the main criteria of maturity, namely, verification, has hardly been satisfied. In comparison to the amount of work devoted to the construction of the abstract theory, the amount of effort which has been applied, up to now, in checking the assumptions and statements seems inconsequential.

The core of pedagogy finally found

29 Aug, 2014 at 10:32 | Posted in Theory of Science & Methodology | 1 Comment

In the latest issue of Pedagogisk Forskning i Sverige (2-3 2014), the author of the article “En pedagogisk relation mellan människa och häst. På väg mot en pedagogisk filosofisk utforskning av mellanrummet” (“A pedagogical relation between human and horse: towards a pedagogical-philosophical exploration of the space in between”) offers the following interesting “programmatic declaration”:

With a posthumanist approach, I illuminate and reflect on how both human and horse transcend their beings and how this opens up an in-between space with dimensions of subjectivity, corporeality and mutuality.

And yet people say that pedagogy as a discipline is in crisis. One wonders why …

Ricardo’s theory of comparative advantage in 60 seconds

29 Aug, 2014 at 08:19 | Posted in Economics | 3 Comments
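The embedded clip is no longer available, but the 60-second argument fits in a few lines of Python, using Ricardo's own numbers from the Principles (labour required per unit of output). Portugal is absolutely more productive in both goods, yet both countries gain from trade because comparative advantage is about opportunity costs, not absolute ones.

```python
# Ricardo's original numbers: labour required per unit of output.
labour = {
    "England":  {"cloth": 100, "wine": 120},
    "Portugal": {"cloth": 90,  "wine": 80},
}

def opportunity_cost(country, good, other):
    """Units of `other` forgone per unit of `good` produced."""
    return labour[country][good] / labour[country][other]

# Portugal needs less labour than England for BOTH goods (absolute advantage) ...
assert labour["Portugal"]["cloth"] < labour["England"]["cloth"]
assert labour["Portugal"]["wine"] < labour["England"]["wine"]

# ... but England gives up less wine per unit of cloth (100/120 < 90/80),
# so England should export cloth and Portugal should export wine.
```

Each country specializes in the good for which its opportunity cost is lower, and total output of both goods rises relative to autarky.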


The government’s work-first policy — let the workers pay for the crisis

28 Aug, 2014 at 20:22 | Posted in Economics, Politics & Society | Comments Off on The government’s work-first policy — let the workers pay for the crisis

The point, then, is to put downward pressure on wages. Something Anders Borg — despite repeated denials during his time as finance minister — has in fact admitted on more than one occasion. “It will of course be tough for the unemployed. The purpose is to increase the pressure to seek and accept jobs,” Anders Borg told LO-tidningen in the autumn of 2004. And at an SNS seminar on the Social Democratic autumn budget he explained what this increased search pressure, this increased “labour supply”, would lead to in the long run: “Eventually, lower benefit levels work their way through the system and new jobs appear because wage formation is affected, which leads to lower wages.”

According to an IFAU study written by three economists led by Lars Calmfors, the effect has indeed been the intended one: “Wages have ended up at a lower level than they otherwise would have as a result of the government’s earned-income tax credit and the less generous unemployment insurance. It has been a very controversial issue, where the government has been reluctant to admit that the likely effects run through wage formation,” he told Ekot in the summer of 2013, adding that the study could be read as saying that “wages are on the order of 3-4 per cent lower today than they otherwise would have been” — not insignificant, in other words. Yet still not enough for Calmfors, who concluded that “wage differentials are too small” in an article in DN in the spring of 2014 …

That the answer to the unemployment problem is spelled lower wages and more low-wage jobs is rather taken for granted among the economists who have dominated the debate since the 1990s. If an “excess supply” arises — that is, unemployment — then according to neoclassical theory it is quite simply because the price, that is the wage, has been set too high.

According to economists with Keynesian leanings — who rarely get the chance to influence jobs policy these days — this theory rests, however, on a limited understanding of unemployment. Lars Pålsson Syll, professor of economic history, doctor of economics and professor of social studies at Malmö högskola, argues for instance that across-the-board wage cuts increase the risk that more jobs are lost.

“If a firm or a sub-industry manages to get cheaper labour by cutting wages, that is no problem for society,” he tells me. “The problem arises if the wage-cutting strategy becomes widespread. Then total demand in the economy falls, and in the end unemployment becomes even higher. On top of that comes increased inequality in income and welfare, which in itself has very negative effects on employment.”

How so? I ask.

Well, just look at the United States, Lars Pålsson Syll suggests.

It was precisely towards the US that Swedish economists turned their gaze in the mid-1990s, in an effort to anchor their theories in reality. During Bill Clinton’s presidency 1992-2000, American unemployment fell towards four per cent, which was said to be because benefit levels were lower, wage differentials larger and low-wage jobs more plentiful. In Sweden and the rest of Europe, by contrast, mass unemployment had become permanent. With the US as the ideal, the doyen of Swedish economics, Assar Lindbeck, savaged the Swedish model in Ekonomisk Debatt in 1996: “generous benefits” had created “special cultures of unemployment” and weakened “unemployment’s dampening effect on the rate of wage increases, which limits the demand for labour”, he wrote.

Lindbeck proposed a package of measures to Americanize Sweden: more jobs in the private service sector, driven by lower payroll taxes for “low-productivity employees” and subsidies for household services; but also more “heavy-handed” methods such as “more flexible relative wages, less generous unemployment benefits” and “a watered-down employment protection legislation”. That was how the country would get back on its feet.

The economists’ enthusiasm for the US has, however, faded in recent years. Which is perhaps not so strange. The economic miracle that seemed to confirm that the road to full employment ran through tax cuts, slimmed-down social insurance and more low-wage jobs was in large part a castle in the air. American growth, it turned out, rested mostly on a swelling credit bubble that burst in the mid-2000s, and the “wage dispersion” that was said to be good for the economy instead paved the way for a financial crisis, as workers took on ever more debt in an attempt to compensate for falling wages.

Kent Werne

Keynes and the Stockholm School

27 Aug, 2014 at 18:40 | Posted in Economics | 1 Comment

The Stockholm method seems to me exactly the right way to explain business-cycle downturns. In normal times, there is a rough – certainly not perfect, but good enough — correspondence of expectations among agents. That correspondence of expectations implies that the individual plans contingent on those expectations will be more or less compatible with one another. Surprises happen; here and there people are disappointed and regret past decisions, but, on the whole, they are able to adjust as needed to muddle through. There is usually enough flexibility in a system to allow most people to adjust their plans in response to unforeseen circumstances, so that the disappointment of some expectations doesn’t become contagious, causing a systemic crisis.
But when there is some sort of major shock – and it can only be a shock if it is unforeseen – the system may not be able to adjust. Instead, the disappointment of expectations becomes contagious. If my customers aren’t able to sell their products, I may not be able to sell mine. Expectations are like networks. If there is a breakdown at some point in the network, the whole network may collapse or malfunction. Because expectations and plans fit together in interlocking networks, it is possible that even a disturbance at one point in the network can cascade over an increasingly wide group of agents, leading to something like a system-wide breakdown, a financial crisis or a depression.

But the “problem” with the Stockholm method was that it was open-ended. It could offer only “a wide variety” of “model sequences,” without specifying a determinate solution. It was just this gap in the Stockholm approach that Keynes was able to fill. He provided a determinate equilibrium, “the limit to which the Stockholm model sequences would move, rather than the time path they follow to get there.” A messy, but insightful, approach to explaining the phenomenon of downward spirals in economic activity coupled with rising unemployment was cast aside in favor of the neater, simpler approach of Keynes …

Unfortunately, that is still the case today. Open-ended models of the sort that the Stockholm School tried to develop still cannot compete with the RBC and DSGE models that have displaced IS-LM and now dominate modern macroeconomics. The basic idea that modern economies form networks, and that networks have properties that are not reducible to just the nodes forming them has yet to penetrate the trained intuition of modern macroeconomists. Otherwise, how would it have been possible to imagine that a macroeconomic model could consist of a single representative agent? And just because modern macroeconomists have expanded their models to include more than a single representative agent doesn’t mean that the intellectual gap evidenced by the introduction of representative-agent models into macroeconomic discourse has been closed.

Uneasy Money

How to prove labour market discrimination

27 Aug, 2014 at 11:40 | Posted in Theory of Science & Methodology | 1 Comment

A 2005 governmental inquiry led to a trial period involving anonymous job applications in seven public sector workplaces during 2007. In doing so, the public sector aims to improve the recruitment process and to increase the ethnic diversity of its workforce. There is evidence that gender and ethnicity influence the hiring process, even though such influence is considered discrimination under current legislation …

The process of ‘depersonalising’ job applications is to make these applications anonymous. In the case of the Gothenburg trial, certain information about the applicant – such as name, sex, country of origin or other identifiable traits of ethnicity and gender – is hidden during the first phase of the job application procedure. The recruiting managers therefore do not see the full content of applications when deciding on whom to invite for interview. Once a candidate has been selected for interview, this information can then be seen.

The trial involving job applications of this nature in the city of Gothenburg is so far the most extensive in Sweden. For this reason, the Institute for Labour Market Policy Evaluation (IFAU) has carried out an evaluation of the impact of anonymous job applications in Gothenburg …

The data used in the IFAU study derive from three districts in Gothenburg … Information on the 3,529 job applicants and a total of 109 positions was collected from all three districts …

A difference-in-difference model was used to test the findings and to estimate the effects in the outcome variables: whether a difference emerges regarding an invitation to interview and job offers in relation to gender and ethnicity in the case of anonymous job applications compared with traditional application procedures.

For job openings where anonymous job applications were applied, the IFAU study reveals that gender and the ethnic origin of the applicant do not affect the probability of being invited for interview. As would be expected from previous research, these factors do have an impact when compared with recruitment processes using traditional application procedures where all the information on the applicant, such as name, sex, country of origin or other identifiable traits of ethnicity and gender, is visible during the first phase of the hiring process. As a result, anonymous applications are estimated to increase the probability of being interviewed regardless of gender and ethnic origin, showing an increase of about 8% for both non-western migrant workers and women.

Paul Andersson/EWCO
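The difference-in-differences design mentioned in the quote compares the change in an outcome for applicants covered by anonymous applications with the change for a comparison group over the same period. A minimal sketch on group means follows; the interview rates below are invented for illustration and are not the IFAU estimates.

```python
def diff_in_diff(treated_before, treated_after, control_before, control_after):
    """Difference-in-differences estimate on group means:
    the treated group's change minus the control group's change.
    The control group's change stands in for what would have happened
    to the treated group anyway (the 'parallel trends' assumption)."""
    return (treated_after - treated_before) - (control_after - control_before)

# Hypothetical interview rates (share of applicants invited to interview),
# where the treated districts switch to anonymous applications between periods:
effect = diff_in_diff(treated_before=0.20, treated_after=0.28,
                      control_before=0.21, control_after=0.22)  # about +0.07
```

Subtracting the control group's change nets out any common trend in hiring, so the remaining difference is attributed to the anonymisation itself, provided the two groups really would have moved in parallel.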


Mainstream economists — gung-ho supporters of neoliberal globalization

26 Aug, 2014 at 22:06 | Posted in Economics | Comments Off on Mainstream economists — gung-ho supporters of neoliberal globalization

Another obstruction comes from the mainstream economics profession that strongly influences public understanding of and discourse about globalization. The economics profession has been a gung-ho supporter of neoliberal globalization, using the rhetoric of free trade. It advocated the policies of the Washington Consensus that were implemented by the IMF and World Bank in the 1980s and 1990s, and it remains one-hundred percent intellectually committed to neoliberal globalization. However, because globalization inevitably creates global imbalances which are potentially politically challenging, it is necessary to sanitize them by arguing they do no harm and do not undermine the benefits of neoliberal globalization … The profession promotes hypotheses that sanitize the imbalances, while ignoring those that paint the imbalances as the product of a toxic form of globalization.

Moving a globalization reform agenda requires getting the narrative and understanding right. That is the practical political economy significance of the arguments presented in this paper.

Thomas Palley

All that glitters is not gold

26 Aug, 2014 at 17:44 | Posted in Economics | Comments Off on All that glitters is not gold

Eighty years ago Keynes could congratulate Great Britain on finally having got rid of the biggest ”barbarous relic” of his time – the gold standard. He lamented that

advocates of the ancient standard do not observe how remote it now is from the spirit and the requirement of the age … [T]he long age of Commodity Money has at last passed away before the age of Representative Money. Gold has ceased to be a coin, a hoard, a tangible claim to wealth … It has become a much more abstract thing – just a standard of value; and it only keeps this nominal status by being handed round from time to time in quite small quantities amongst a group of Central Banks.

Abandoning fiat money, guaranteed only by promises, for currencies once more backed by gold is not the way out of the present economic crisis. Far from being the sole prophylactic against the alleged problems of fiat money, as the “gold bugs” maintain, a return to gold would only make things far worse. So yours truly – just as Keynes did – most certainly rejects any proposal to restore the gold standard.

The “gold bugs” seem to forget that we actually have tried the gold standard before – in the era more or less between 1870 and 1930 – and with disastrous results!

Implementing a new gold standard today would only lead to a generally falling price level. Sounds great? If you think so, read what Keynes wrote eighty years ago in Essays in Persuasion:

Of course, a fall in prices, which is the same thing as a rise in the value of claims on money, means that real wealth is transferred from the debtor in favour of the creditor, so that a larger proportion of the real assets is represented by the claims of the depositor, and a smaller proportion belongs to the nominal owner of the asset who has borrowed in order to buy.

Allowing this debt deflation process – the analysis of which was later developed by Irving Fisher and Hyman Minsky – would land us in a situation where output and wages would fall and unemployment and the real burden of debt would increase. The only winners would probably be banks and financial institutes.

So why would anyone want to reinstate a gold standard? The best surmise is probably that it is a question of ideology and politics. Libertarians and market fundamentalists who advocate a return to gold want to restrict governments’ ability to intervene in the economy and – even more strictly than with “independent” central banks – force countries to pursue restrictive economic policies that keep inflation down at all costs.

Still not convinced of why a return to gold is a bad idea? Then, at least, remember what Keynes wrote in The Economic Consequences of Mr Churchill (1925):

We stand midway between two theories of economic society. The one theory maintains that wages should be fixed by reference to what is ’fair’ and ’reasonable’ as between classes. The other theory–the theory of the economic juggernaut–is that wages should be settled by economic pressure, otherwise called ’hard facts’, and that our vast machine should crash along, with regard only to its equilibrium as a whole, and without attention to the chance consequences of the journey to individual groups. The gold standard, with its dependence on pure chance, its faith in the ’automatic adjustments’, and its general regardlessness of social detail, is an essential emblem and idol of those who sit in the top tier of the machine. I think that they are immensely rash… in their comfortable belief that nothing really serious ever happens. Nine times out of ten, nothing really does happen–merely a little distress to individuals or to groups. But we run a risk of the tenth time (and stupid into the bargain), if we continue to apply the principles of an economics, which was worked out on the hypothesis of laissez-faire and free competition, to a society which is rapidly abandoning these hypotheses.

So, next time you want to come up with some new idea on how to solve our economic problems with a magic gold bullet, remember: new economic thinking starts with reading old books! Why not start with the best there are – those written by John Maynard Keynes.

All cried out

26 Aug, 2014 at 08:15 | Posted in Varia | Comments Off on All cried out


That voice still moves me every time I hear it

Great Expectations

25 Aug, 2014 at 17:32 | Posted in Varia | Comments Off on Great Expectations


Piketty — a red rag to DN’s editorial page

25 Aug, 2014 at 12:42 | Posted in Economics | 1 Comment

A few weeks ago we could read a Sunday column in Sydsvenskan — written by the former editorial writer Per T Ohlsson — about Thomas Piketty’s Capital in the Twenty-First Century. The following little passage was illuminating:

One weakness of Capital in the Twenty-First Century, however, is that Piketty’s alarmist conclusions are not built on historical data but on theoretical models, “laws”, which he has constructed himself from highly debatable assumptions about saving and growth. Moreover, Piketty disregards factors that curb the spread of inequality he finds preordained, for example education and new technology. Several heavyweight economists have pointed this out, among them the Swede Per Krusell together with his colleague Tony Smith of Yale.

And today we can read on DN’s editorial page that the conclusions about the future of the economy that Thomas Piketty presents in his book Capital in the Twenty-First Century do not, on closer examination, accord with reality.

What is this conclusion based on? The same source as Per T Ohlsson’s — Per Krusell.

As readers of this blog will have noticed during the summer’s lively academic debate on Piketty’s book — see e.g. here, here and here — Krusell’s poorly founded, and in some respects plainly erroneous, model assumptions are far from consistent with reality. As the American economist Brad DeLong has convincingly shown, the numerical values Krusell works with in his model-based attempts to refute Piketty have, to put it mildly, little anchoring in reality:

Reality-Check-2As time passes, it seems to me that a larger and larger fraction of Piketty’s critics are making arguments that really make no sense at all–that I really do not understand how people can believe them, or why anybody would think that anybody else would believe them. Today we have Per Krusell and Tony Smith assuming that the economy-wide capital depreciation rate δ is not 0.03 or 0.05 but 0.1–and it does make a huge difference…

Per Krusell and Tony Smith: Piketty’s ‘Second Law of Capitalism’ vs. standard macro theory:

“Piketty’s forecast does not rest primarily on an extrapolation of recent trends … [which] is what one might have expected, given that so much of the book is devoted to digging up and displaying reliable time series…. Piketty’s forecast rests primarily on economic theory. We take issue…. Two ‘fundamental laws’, as Piketty dubs them… asserts that K/Y will, in the long run, equal s[net]/g…. Piketty… argues… s[net]/g… will rise rapidly in the future…. Neither the textbook Solow model nor a ‘microfounded’ model of growth predicts anything like the drama implied by Piketty’s theory…. Theory suggests that the wealth–income ratio would increase only modestly as growth falls …”

And if we go looking for why they believe that “theory suggests that the wealth–income ratio would increase only modestly as growth falls”, we find:

Per Krusell and Tony Smith: Is Piketty’s “Second Law of Capitalism” Fundamental? :

“In the textbook model … the capital-to-income ratio is not s[net]/g but rather s[gross]/(g+δ), where δ is the rate at which capital depreciates. With the textbook formula, growth approaching zero would increase the capital-output ratio but only very marginally; when growth falls all the way to zero, the denominator would not go to zero but instead would go from, say 0.12–with g around 0.02 and δ=0.1 as reasonable estimates–to 0.1.”

But with an economy-wide capital-output ratio of 4-6 and a depreciation rate of 0.1, total depreciation–the gap between NDP and GDP–is not its actual 15% of GDP, but rather 40%-60% of GDP. If the actual depreciation rate were what Krusell and Smith say it is, fully half of our economy would be focused on replacing worn-out capital.

It isn’t … That makes no sense at all.

For the entire economy, one picks a depreciation rate of 0.02 or 0.03 or 0.05, rather than 0.10.

I cannot understand how anybody who has ever looked at the NIPA, or thought about what our capital stock is made of, would ever get the idea that the economy-wide depreciation rate δ=0.1.

And if you did think that for an instant, you would then recognize that you have just committed yourself to the belief that NDP is only half of GDP, and nobody thinks that–except Krusell and Smith. Why do they think that? Where did their δ=0.1 estimate come from? Why didn’t they immediately recognize that that depreciation estimate was in error, and correct it?

Why would anyone imagine that any growth model should ever be calibrated to such an extraordinarily high depreciation rate? …

I really do not understand what is going on here at all…

Brad DeLong
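DeLong's back-of-the-envelope check is easy to reproduce. Depreciation as a share of GDP is simply δ times the capital-output ratio, and the textbook (gross) Solow steady state puts K/Y at s/(g + δ); the sketch below just plugs in the parameter values quoted above.

```python
def depreciation_share_of_gdp(delta, capital_output_ratio):
    # depreciation / GDP = delta * (K / Y)
    return delta * capital_output_ratio

def textbook_capital_output_ratio(s_gross, g, delta):
    # standard Solow steady state with gross saving: K/Y = s / (g + delta)
    return s_gross / (g + delta)

# Krusell-Smith's delta = 0.1 with K/Y around 5 would make depreciation
# half of GDP -- i.e. NDP only half of GDP:
ks = depreciation_share_of_gdp(0.1, 5)       # 0.5
# DeLong's delta = 0.03 reproduces the actual roughly 15% of GDP:
actual = depreciation_share_of_gdp(0.03, 5)  # 0.15
```

And with gross saving of, say, 0.3 and δ = 0.1, letting g fall from 0.02 to zero moves K/Y only from 2.5 to 3.0, which is exactly the "only very marginally" response Krusell and Smith describe; the whole disagreement hinges on that δ.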

När det kommer till kritan visar det sig att de “tunga” invändningarna mot Piketty i själva verket väger mycket lätt …

Thomas Piketty har väckt viktiga frågor om hur vår ekonomi bäst tjänar mänskligheten, och han ger — till skillnad från DN:s ledare, som mot bättre vetande väljer att luta sig mot synpunkter framförda av en ekonom som sedan länge bemötts i forskarvärlden och visats vara dåligt underbyggda — meningsfulla svar på dessa frågor.

And even if one is not convinced that Piketty's answers are right, it is worth remembering what Jesper Roine writes apropos his recently published Swedish summary of Piketty's book:

My hope in writing this summary has been that those who have not had time to read the entire English (or French) original can quickly orient themselves in the arguments, but above all that it will entice more people to read Piketty's book and think about these questions. As Larry Summers very aptly put it in a comment on Piketty's book: "Books that provide the definitive answer to a question are important. Books that raise new questions are more important."

[For economists: Krusell and other critics of Piketty seem to think that the issue is one of theory and that Piketty has somehow mis-specified the standard growth model. Piketty in fact does not say much about the (Solow) standard growth model in the book, but just for fun, let us see what such a neoclassical growth model says. Assume a production function that is homogeneous of degree one and allows unlimited substitutability, for example a standard Cobb-Douglas production function (with A a given productivity parameter and k the capital-labour ratio K/L), y = Ak^α, with a constant investment share λ of y and a constant depreciation rate δ of "capital per worker" k. The accumulation of k is then Δk = λy − δk = λAk^α − δk. In steady state (denoted *) we have λAk*^α = δk*, which gives λ/δ = k*/y* and k* = (λA/δ)^(1/(1−α)). Substituting this value of k* into the production function yields the steady-state output per worker y* = Ak*^α = A^(1/(1−α))(λ/δ)^(α/(1−α)). Assuming exogenous "Harrod-neutral" technological progress that raises y at the growth rate g (with zero growth in labour, and y and k a fortiori redefined as y/A and k/A respectively, which gives the production function y = k^α), we get dk/dt = λy − (g + δ)k, which in the Cobb-Douglas case is dk/dt = λk^α − (g + δ)k, with steady-state value k* = (λ/(g + δ))^(1/(1−α)) and capital-output ratio k*/y* = k*/k*^α = λ/(g + δ). If we use Piketty's preferred model, in which capital and output are measured in net terms (after depreciation), the latter expression must be changed to k*/y* = λ/(g + λδ). Now, Piketty predicts that g will fall and that this will raise the capital-output ratio. Let us say that δ = 0.03, λ = 0.1 and g = 0.03 initially. This gives a capital-output ratio of about 3. If g falls to 0.01, the ratio rises to about 7.7. We get analogous results with a so-called CES production function with an elasticity of substitution σ > 1. With σ = 1.5, the capital share of output rises from 0.2 to 0.36 if the "wealth-income ratio" goes from 2.5 to 5, which according to Piketty is what actually happened in the rich countries over the past forty years.]
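The steady-state arithmetic in the bracketed note can be replicated numerically. A minimal sketch, using the parameter values given there (λ = 0.1, δ = 0.03):

```python
# Steady-state capital-output ratios in the Solow model:
#   gross (textbook) version:  k*/y* = lam / (g + delta)
#   net (Piketty) version:     k*/y* = lam / (g + lam * delta)

def capital_output_gross(lam, g, delta):
    return lam / (g + delta)

def capital_output_net(lam, g, delta):
    return lam / (g + lam * delta)

lam, delta = 0.10, 0.03
for g in (0.03, 0.01):
    print(f"g = {g}: gross = {capital_output_gross(lam, g, delta):.2f}, "
          f"net = {capital_output_net(lam, g, delta):.2f}")
# In the net version the ratio rises from about 3 (g = 0.03) to about
# 7.7 (g = 0.01), exactly the movement discussed in the text.
```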

Ricardian equivalence and DSGE models

24 Aug, 2014 at 18:16 | Posted in Economics | 2 Comments


Benchmark DSGE models have paid little attention to the role of fiscal policy, therefore minimising any possible interaction of fiscal policies with monetary policy. This has been partly because of the assumption of Ricardian equivalence. As a result, the distribution of taxes across time becomes irrelevant, and aggregate financial wealth does not matter for the behavior of agents or for the dynamics of the economy, because bonds do not represent net real wealth for households.

Incorporating the role of fiscal policies more meaningfully requires abandoning frameworks with Ricardian equivalence. The question is how to break the Ricardian equivalence. Two possibilities are available. The first is to move to an overlapping-generations framework, and the second (which has been the most common way of handling the problem) is to rely on an infinite-horizon model with a type of liquidity-constrained agents (e.g. "rule-of-thumb" agents).

Camillo Tovar

Ricardian equivalence basically means that financing government expenditure through taxes or through debt is equivalent: since debt financing must be repaid with interest, agents equipped with rational expectations would simply increase their savings in order to be able to pay the higher taxes in the future, thus leaving total expenditure unchanged.

There is, of course, no reason for us to believe in that fairy-tale. Ricardo himself — mirabile dictu — didn’t believe in Ricardian equivalence. In “Essay on the Funding System” (1820) he wrote:

But the people who paid the taxes never so estimate them, and therefore do not manage their private affairs accordingly. We are too apt to think that the war is burdensome only in proportion to what we are at the moment called to pay for it in taxes, without reflecting on the probable duration of such taxes. It would be difficult to convince a man possessed of £20,000, or any other sum, that a perpetual payment of £50 per annum was equally burdensome with a single tax of £1000.
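Ricardo's numbers are just the present value of a perpetuity: his figures imply a 5% interest rate, at which £50 a year for ever is worth exactly the £1,000 lump-sum tax. A minimal sketch of that calculation:

```python
# Present value of a perpetuity: PV = payment / r.
# Ricardian equivalence assumes taxpayers make exactly this calculation;
# Ricardo's point is that in practice they do not.

def perpetuity_pv(annual_payment, r):
    """Present value of a perpetual annual payment at interest rate r."""
    return annual_payment / r

print(f"PV of 50 pounds a year at 5%: {perpetuity_pv(50, 0.05):.0f} pounds")
# -> PV of 50 pounds a year at 5%: 1000 pounds
```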

And as one Nobel Prize laureate had it:

Ricardian equivalence is taught in every graduate school in the country. It is also sheer nonsense.

Joseph E. Stiglitz, Twitter

So, I totally agree that macroeconomic models have to abandon Ricardian equivalence nonsense. But replacing it with “overlapping generations” and “infinite-horizon” models — isn’t that — in terms of realism and relevance — just getting out of the frying pan into the fire?
