Neoliberalism — a self-serving con

31 August, 2014 at 22:46 | Posted in Politics & Society | 8 Comments

If neoliberalism were anything other than a self-serving con, whose gurus and think tanks were financed from the beginning by some of the richest people on earth … its apostles would have demanded, as a precondition for a society based on merit, that no one should start life with the unfair advantage of inherited wealth or economically-determined education. But they never believed in their own doctrine. Enterprise, as a result, quickly gave way to rent.


All this is ignored, and success or failure in the market economy are ascribed solely to the efforts of the individual. The rich are the new righteous, the poor are the new deviants, who have failed both economically and morally, and are now classified as social parasites.

The market was meant to emancipate us, offering autonomy and freedom. Instead it has delivered atomisation and loneliness. The workplace has been overwhelmed by a mad, Kafka-esque infrastructure of assessments, monitoring, measuring, surveillance and audits, centrally directed and rigidly planned, whose purpose is to reward the winners and punish the losers. It destroys autonomy, enterprise, innovation and loyalty and breeds frustration, envy and fear.

George Monbiot


Sampling error (student stuff)

31 August, 2014 at 16:06 | Posted in Statistics & Econometrics | Comments Off on Sampling error (student stuff)

 

Original sin in economics

30 August, 2014 at 13:32 | Posted in Theory of Science & Methodology | 1 Comment

Ever since the Enlightenment various economists had been seeking to mathematise the study of the economy. In this, at least prior to the early years of the twentieth century, economists keen to mathematise their discipline felt constrained in numerous ways, and not least by pressures by (non-social) natural scientists and influential peers to conform to the ‘standards’ and procedures of (non-social) natural science, and thereby abandon any idea of constructing an autonomous tradition of mathematical economics. Especially influential, in due course, was the classical reductionist programme, the idea that all mathematical disciplines should be reduced to or based on the model of physics, in particular on the strictly deterministic approach of mechanics, with its emphasis on methods of infinitesimal calculus …

However, in the early part of the twentieth century changes occurred in the interpretation of the very nature of mathematics, changes that caused the classical reductionist programme itself to fall into disarray. With the development of relativity theory and especially quantum theory, the image of nature as continuous came to be re-examined in particular, and the role of infinitesimal calculus, which had previously been regarded as having almost ubiquitous relevance within physics, came to be re-examined even within that domain.

The outcome, in effect, was a switch away from the long-standing emphasis on mathematics as an attempt to apply the physics model, and specifically the mechanics metaphor, to an emphasis on mathematics for its own sake.

Mathematics, especially through the work of David Hilbert, became increasingly viewed as a discipline properly concerned with providing a pool of frameworks for possible realities. No longer was mathematics seen as the language of (non-social) nature, abstracted from the study of the latter. Rather, it was conceived as a practice concerned with formulating systems comprising sets of axioms and their deductive consequences, with these systems in effect taking on a life of their own. The task of finding applications was henceforth regarded as being of secondary importance at best, and not of immediate concern.

This emergence of the axiomatic method removed at a stroke various hitherto insurmountable constraints facing those who would mathematise the discipline of economics. Researchers involved with mathematical projects in economics could, for the time being at least, postpone the day of interpreting their preferred axioms and assumptions. There was no longer any need to seek the blessing of mathematicians and physicists or of other economists who might insist that the relevance of metaphors and analogies be established at the outset. In particular it was no longer regarded as necessary, or even relevant, to economic model construction to consider the nature of social reality, at least for the time being. Nor, it seemed, was it possible for anyone to insist with any legitimacy that the formulations of economists conform to any specific model already found to be successful elsewhere (such as the mechanics model in physics). Indeed, the very idea of fixed metaphors or even interpretations, came to be rejected by some economic ‘modellers’ (albeit never in any really plausible manner).

The result was that in due course deductivism in economics, through morphing into mathematical deductivism on the back of developments within the discipline of mathematics, came to acquire a new lease of life, with practitioners (once more) potentially oblivious to any inconsistency between the ontological presuppositions of adopting a mathematical modelling emphasis and the nature of social reality. The consequent rise of mathematical deductivism has culminated in the situation we find today.

Tony Lawson

The Arrow-Debreu obsession

29 August, 2014 at 17:14 | Posted in Economics | 6 Comments

I’ve never yet been able to understand why the economics profession was/is so impressed by the Arrow-Debreu results. They establish that in an extremely abstract model of an economy, there exists a unique equilibrium with certain properties. The assumptions required to obtain the result make this economy utterly unlike anything in the real world. In effect, it tells us nothing at all. So why pay any attention to it? The attention, I suspect, must come from some prior fascination with the idea of competitive equilibrium, and a desire to see the world through that lens, a desire that is more powerful than the desire to understand the real world itself. This fascination really does hold a kind of deranging power over economic theorists, so powerful that they lose the ability to think in even minimally logical terms; they fail to distinguish necessary from sufficient conditions, and manage to overlook the issue of the stability of equilibria.

Mark Buchanan

Almost a century and a half after Léon Walras founded neoclassical general equilibrium theory, economists still have not been able to show that markets move economies to equilibria.

We do know that — under very restrictive assumptions — equilibria do exist, are unique and are Pareto-efficient. After reading Buchanan’s article, however, one has to ask: what good does that do?

As long as we cannot show, except under exceedingly special assumptions, that there are forces which lead economies to equilibria, the value of general equilibrium theory is negligible. As long as we cannot demonstrate that such forces operate — under reasonable, relevant and at least mildly realistic conditions — to move markets towards equilibrium, there can be no sustainable reason for anyone to pay any attention to the theory.

A stability that can only be proved by assuming “Santa Claus” conditions is of no avail. Most people do not believe in Santa Claus anymore. And for good reasons. Santa Claus is for kids, and general equilibrium economists ought to grow up.

Continuing to model a world full of agents behaving as economists — “often wrong, but never uncertain” — while still not being able to show that the system under reasonable assumptions converges to equilibrium (or simply assuming the problem away) is a gross misallocation of intellectual resources and time.

And then, of course, there is Sonnenschein-Mantel-Debreu!

So what? Why should we care about Sonnenschein-Mantel-Debreu?

Because Sonnenschein-Mantel-Debreu ultimately explains why New Classical, Real Business Cycle, Dynamic Stochastic General Equilibrium (DSGE) and “New Keynesian” microfounded macromodels are such bad substitutes for real macroeconomic analysis!

These models try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality. And — worse still — something that is not even amenable to the kind of general equilibrium analysis that they are thought to give a foundation for, since Hugo Sonnenschein (1972), Rolf Mantel (1976) and Gerard Debreu (1974) unequivocally showed that there are no conditions under which assumptions on individuals guarantee either stability or uniqueness of the equilibrium solution.

Opting for cloned representative agents that are all identical is of course not a real solution to the fallacy of composition that the Sonnenschein-Mantel-Debreu theorem points to. Representative agent models are — as I have argued at length here — rather an evasion whereby issues of distribution, coordination, heterogeneity — everything that really defines macroeconomics — are swept under the rug.
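The Sonnenschein-Mantel-Debreu result says that Walras’ law and homogeneity of degree zero are essentially the only restrictions individual rationality places on aggregate excess demand. A minimal sketch of why that wrecks stability: the toy excess-demand function below is my own construction (not taken from any of the cited papers), but it satisfies both properties, and Walrasian tâtonnement price adjustment just orbits the equilibrium forever instead of converging to it.

```python
import numpy as np

# An antisymmetric A makes p . z(p) = 0 identically (Walras' law), and dividing
# by p.sum() makes z homogeneous of degree zero - the only two properties the
# SMD theorem leaves us. The unique equilibrium ray is p proportional to (1, 1, 1).
A = np.array([[ 0.0,  1.0, -1.0],
              [-1.0,  0.0,  1.0],
              [ 1.0, -1.0,  0.0]])

def excess_demand(p):
    return A @ (p / p.sum())

def tatonnement(p0, eta=0.01, steps=20000):
    """Walrasian price adjustment: raise the price of any good in excess demand."""
    p = np.array(p0, dtype=float)
    for _ in range(steps):
        p = p + eta * excess_demand(p)
    return p

p_final = tatonnement([1.0, 2.0, 3.0])
z = excess_demand(p_final)
print(abs(np.dot(p_final, z)))   # Walras' law holds throughout: ~0
print(np.linalg.norm(z))         # yet excess demand never dies out: prices just orbit
```

Unless the auctioneer happens to start exactly on the equilibrium ray, excess demand never vanishes: nothing in the two SMD-sanctioned properties forces the adjustment process to settle down.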

Instead of real maturity, we see that general equilibrium theory possesses only pseudo-maturity. For the description of the economic system, mathematical economics has succeeded in constructing a formalized theoretical structure, thus giving an impression of maturity, but one of the main criteria of maturity, namely, verification, has hardly been satisfied. In comparison to the amount of work devoted to the construction of the abstract theory, the amount of effort which has been applied, up to now, in checking the assumptions and statements seems inconsequential.

János Kornai

The core of educational science finally discovered

29 August, 2014 at 10:32 | Posted in Theory of Science & Methodology | 1 Comment

In the latest issue of Pedagogisk Forskning i Sverige (2-3 2014), the author of the article “En pedagogisk relation mellan människa och häst. På väg mot en pedagogisk filosofisk utforskning av mellanrummet” (“A pedagogical relation between human and horse. Towards a pedagogical-philosophical exploration of the in-between”) offers the following interesting “programme declaration”:

With a posthumanist approach, I illuminate and reflect on how both human and horse transcend their beings and how this opens up an in-between space with dimensions of subjectivity, corporeality and mutuality.

And yet we are told that the discipline of education is in crisis. One wonders why …

Ricardo’s theory of comparative advantage in 60 seconds

29 August, 2014 at 08:19 | Posted in Economics | 3 Comments

 

The government’s work-first policy — let the workers pay for the crisis

28 August, 2014 at 20:22 | Posted in Economics, Politics & Society | Comments Off on The government’s work-first policy — let the workers pay for the crisis

The point, then, is to put downward pressure on wages. Something Anders Borg — despite repeated denials during his time as finance minister — has in fact admitted more than once. “It will of course be tough for the unemployed. The purpose is to increase the pressure to search for and accept jobs,” Anders Borg told LO-tidningen in the autumn of 2004. And at an SNS seminar on the Social Democratic autumn budget he explained what this increased search pressure, this increased “labour supply”, would lead to in the long run: “Eventually lower benefit levels work their way through the system and new jobs appear, because wage formation is affected, which leads to lower wages.”


According to an IFAU study written by three economists led by Lars Calmfors, the effect has indeed been the intended one: “Wages have ended up at a lower level than they otherwise would have as a result of the government’s earned-income tax credit and the less generous unemployment insurance. This has been a highly controversial issue, where the government has been reluctant to say that the likely effects operate via wage formation,” he told Ekot in the summer of 2013, adding that the study could be read as saying that “wages have ended up on the order of 3-4 per cent lower today than they otherwise would have been” — not a trivial amount, in other words. Yet still not enough for Calmfors, who concluded that “wage differentials are too small” in an article in DN in the spring of 2014 …

That the answer to the unemployment problem is lower wages and more low-wage jobs is treated as more or less self-evident among the economists who have dominated the debate since the 1990s. If an “excess supply” arises — that is, unemployment — then according to neoclassical theory this is quite simply because the price, that is the wage, has been set too high.

According to economists of a Keynesian bent — who rarely get the chance to influence jobs policy these days — this theory rests on a limited understanding of unemployment. Lars Pålsson Syll, professor of economic history, doctor of economics and professor of social science at Malmö University, argues for example that across-the-board wage cuts increase the risk that even more jobs are lost.

“If a single firm or sub-industry manages to get cheaper labour by cutting wages, that is no problem for society,” he tells me. “The problem arises if the wage-cutting strategy becomes widespread. Then total demand in the economy falls, and unemployment ends up even higher. On top of that comes increased inequality in incomes and welfare, which in itself has very negative effects on employment.”

How so? I ask.

Well, just look at the USA, Lars Pålsson Syll suggests.

It was precisely towards the US that Swedish economists turned their gaze in the mid-1990s, in an effort to anchor their theories in reality. During Bill Clinton’s presidency 1992–2000, American unemployment fell towards four per cent, which was attributed to lower benefit levels, larger wage differentials and more low-wage jobs. In Sweden and the rest of Europe, by contrast, mass unemployment had become permanent. With the US as the ideal, the doyen of Swedish economics, Assar Lindbeck, savaged the Swedish model in Ekonomisk Debatt in 1996: “generous benefits” had created “special unemployment cultures” and weakened “the dampening effect of unemployment on the rate of wage increases, which limits the demand for labour”, he wrote.

Lindbeck proposed a package of measures to Americanise Sweden: more jobs in the private service sector, driven by lower payroll taxes for “low-productivity employees” and subsidies for household services; but also more “heavy-handed” methods such as “more flexible relative wages, less generous unemployment benefits” and “a watered-down employment protection legislation”. That was how the country would get back on its feet.

The economists’ infatuation with the US has faded in recent years, however. Which is perhaps not so strange. The economic miracle that seemed to confirm that the road to full employment ran through tax cuts, slimmed-down social insurance and more low-wage jobs was in large part a castle in the air. American growth, it turned out, rested mostly on a swelling credit bubble that burst in the mid-2000s, and the “wage dispersion” that was said to be good for the economy instead paved the way for a financial crisis, as workers took on ever more debt in an attempt to compensate for sagging wages.

Kent Werne

Keynes and the Stockholm School

27 August, 2014 at 18:40 | Posted in Economics | 1 Comment

The Stockholm method seems to me exactly the right way to explain business-cycle downturns. In normal times, there is a rough – certainly not perfect, but good enough — correspondence of expectations among agents. That correspondence of expectations implies that the individual plans contingent on those expectations will be more or less compatible with one another. Surprises happen; here and there people are disappointed and regret past decisions, but, on the whole, they are able to adjust as needed to muddle through. There is usually enough flexibility in a system to allow most people to adjust their plans in response to unforeseen circumstances, so that the disappointment of some expectations doesn’t become contagious, causing a systemic crisis.
But when there is some sort of major shock – and it can only be a shock if it is unforeseen – the system may not be able to adjust. Instead, the disappointment of expectations becomes contagious. If my customers aren’t able to sell their products, I may not be able to sell mine. Expectations are like networks. If there is a breakdown at some point in the network, the whole network may collapse or malfunction. Because expectations and plans fit together in interlocking networks, it is possible that even a disturbance at one point in the network can cascade over an increasingly wide group of agents, leading to something like a system-wide breakdown, a financial crisis or a depression.

But the “problem” with the Stockholm method was that it was open-ended. It could offer only “a wide variety” of “model sequences,” without specifying a determinate solution. It was just this gap in the Stockholm approach that Keynes was able to fill. He provided a determinate equilibrium, “the limit to which the Stockholm model sequences would move, rather than the time path they follow to get there.” A messy, but insightful, approach to explaining the phenomenon of downward spirals in economic activity coupled with rising unemployment was cast aside in favor of the neater, simpler approach of Keynes …

Unfortunately, that is still the case today. Open-ended models of the sort that the Stockholm School tried to develop still cannot compete with the RBC and DSGE models that have displaced IS-LM and now dominate modern macroeconomics. The basic idea that modern economies form networks, and that networks have properties that are not reducible to just the nodes forming them has yet to penetrate the trained intuition of modern macroeconomists. Otherwise, how would it have been possible to imagine that a macroeconomic model could consist of a single representative agent? And just because modern macroeconomists have expanded their models to include more than a single representative agent doesn’t mean that the intellectual gap evidenced by the introduction of representative-agent models into macroeconomic discourse has been closed.

Uneasy Money
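The network metaphor in the passage above can be given a crude quantitative form. The sketch below is a standard threshold-contagion toy model — the parameters and mechanism are my own illustrative assumptions, not anything from the quoted post — showing how one agent’s disappointed plans can either stay contained or sweep the system, depending on how tolerant other agents’ plans are to a trading partner’s failure.

```python
import random

def cascade(n=200, k=4, threshold=0.5, seed=1):
    """Toy expectations network: each agent trades with k random partners and
    abandons its own plan once more than `threshold` of those partners have
    abandoned theirs. Returns how many agents end up failing after one shock."""
    rng = random.Random(seed)
    partners = [rng.sample(range(n), k) for _ in range(n)]
    failed = {0}                      # a single unforeseen disappointment
    changed = True
    while changed:
        changed = False
        for i in range(n):
            if i not in failed and sum(p in failed for p in partners[i]) / k > threshold:
                failed.add(i)
                changed = True
    return len(failed)

# Robust expectations: one failure stays contained (nobody loses 3 of 4 partners).
print(cascade(threshold=0.5))
# Fragile expectations: a single failed partner is enough, and the shock can spread.
print(cascade(threshold=0.0))
```

The system-level outcome depends on the pattern of interlocking plans, not on any property of an individual node — which is exactly the feature a single representative agent cannot capture.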

How to prove labour market discrimination

27 August, 2014 at 11:40 | Posted in Theory of Science & Methodology | 1 Comment

A 2005 governmental inquiry led to a trial period involving anonymous job applications in seven public sector workplaces during 2007. In doing so, the public sector aims to improve the recruitment process and to increase the ethnic diversity among its workforce. There is evidence to show that gender and ethnicity have an influence in the hiring process although this is considered as discrimination by current legislation …

The process of ‘depersonalising’ job applications is to make these applications anonymous. In the case of the Gothenburg trial, certain information about the applicant – such as name, sex, country of origin or other identifiable traits of ethnicity and gender – is hidden during the first phase of the job application procedure. The recruiting managers therefore do not see the full content of applications when deciding on whom to invite for interview. Once a candidate has been selected for interview, this information can then be seen.

The trial involving job applications of this nature in the city of Gothenburg is so far the most extensive in Sweden. For this reason, the Institute for Labour Market Policy Evaluation (IFAU) has carried out an evaluation of the impact of anonymous job applications in Gothenburg …

The data used in the IFAU study derive from three districts in Gothenburg … Information on the 3,529 job applicants and a total of 109 positions were collected from all three districts …

A difference-in-difference model was used to test the findings and to estimate the effects in the outcome variables: whether a difference emerges regarding an invitation to interview and job offers in relation to gender and ethnicity in the case of anonymous job applications compared with traditional application procedures.

For job openings where anonymous job applications were applied, the IFAU study reveals that gender and the ethnic origin of the applicant do not affect the probability of being invited for interview. As would be expected from previous research, these factors do have an impact when compared with recruitment processes using traditional application procedures where all the information on the applicant, such as name, sex, country of origin or other identifiable traits of ethnicity and gender, is visible during the first phase of the hiring process. As a result, anonymous applications are estimated to increase the probability of being interviewed regardless of gender and ethnic origin, showing an increase of about 8% for both non-western migrant workers and women.

Paul Andersson/EWCO
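The difference-in-differences design used in the IFAU evaluation can be sketched on synthetic data. The callback rates below are hypothetical illustration, not the study’s numbers; only the roughly 8-point effect size echoes the result quoted above.

```python
import numpy as np

rng = np.random.default_rng(0)

def callbacks(p, n=5000):
    """Bernoulli interview outcomes for n applicants with callback probability p."""
    return rng.binomial(1, p, size=n)

# Hypothetical callback rates (not the IFAU numbers): anonymisation closes
# most of the minority applicants' gap while leaving the majority unchanged.
outcome = {
    ("majority", "traditional"): callbacks(0.30),
    ("majority", "anonymous"):   callbacks(0.30),
    ("minority", "traditional"): callbacks(0.20),
    ("minority", "anonymous"):   callbacks(0.28),
}

def diff_in_diff(outcome):
    """(change for minority applicants) minus (change for majority applicants)."""
    d_min = (outcome[("minority", "anonymous")].mean()
             - outcome[("minority", "traditional")].mean())
    d_maj = (outcome[("majority", "anonymous")].mean()
             - outcome[("majority", "traditional")].mean())
    return d_min - d_maj

print(f"DiD estimate: {diff_in_diff(outcome):.3f}")  # close to the built-in 0.08
```

Differencing twice removes both the fixed group gap and any regime-wide shift, which is why the design can attribute the remaining difference to the anonymisation itself (given the usual parallel-trends assumption).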

 

Mainstream economists — gung-ho supporters of neoliberal globalization

26 August, 2014 at 22:06 | Posted in Economics | Comments Off on Mainstream economists — gung-ho supporters of neoliberal globalization

Another obstruction comes from the mainstream economics profession that strongly influences public understanding of and discourse about globalization. The economics profession has been a gung-ho supporter of neoliberal globalization, using the rhetoric of free trade. It advocated the policies of the Washington Consensus that were implemented by the IMF and World Bank in the 1980s and 1990s, and it remains one-hundred percent intellectually committed to neoliberal globalization. However, because globalization inevitably creates global imbalances which are potentially politically challenging, it is necessary to sanitize them by arguing they do no harm and do not undermine the benefits of neoliberal globalization … The profession promotes hypotheses that sanitize the imbalances, while ignoring those that paint the imbalances as the product of a toxic form of globalization.

Moving a globalization reform agenda requires getting the narrative and understanding right. That is the practical political economy significance of the arguments presented in this paper.

Thomas Palley

All that glitters is not gold

26 August, 2014 at 17:44 | Posted in Economics | Comments Off on All that glitters is not gold

Eighty years ago Keynes could congratulate Great Britain on finally having got rid of the biggest ”barbarous relic” of his time – the gold standard. He lamented that

advocates of the ancient standard do not observe how remote it now is from the spirit and the requirement of the age … [T]he long age of Commodity Money has at last passed away before the age of Representative Money. Gold has ceased to be a coin, a hoard, a tangible claim to wealth … It has become a much more abstract thing – just a standard of value; and it only keeps this nominal status by being handed round from time to time in quite small quantities amongst a group of Central Banks.

Exchanging fiat money — money guaranteed by government promises — for currencies once more backed by gold is not the way out of the present economic crisis. Far from being the sole prophylactic against the alleged problems of fiat money, as the “gold bugs” maintain, a return to gold would only make things far worse. So yours truly — just as Keynes did — most certainly rejects any proposals for restoring the gold standard.

The “gold bugs” seem to forget that we actually have tried the gold standard before – in the era more or less between 1870 and 1930 – and with disastrous results!


Implementing a new gold standard today would only lead to a generally falling price level. Sounds great? If you think so, read what Keynes wrote eighty years ago in Essays in Persuasion:

Of course, a fall in prices, which is the same thing as a rise in the value of claims on money, means that real wealth is transferred from the debtor in favour of the creditor, so that a larger proportion of the real assets is represented by the claims of the depositor, and a smaller proportion belongs to the nominal owner of the asset who has borrowed in order to buy.

Allowing this debt deflation process – the analysis of which was later developed by Irving Fisher and Hyman Minsky – would land us in a situation where output and wages would fall and unemployment and the real burden of debt would increase. The only winners would probably be banks and financial institutions.
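The mechanics Keynes describes are at bottom just a ratio: a fixed nominal debt divided by a falling price level. A two-line illustration with hypothetical numbers:

```python
def real_debt_burden(nominal_debt, price_index):
    """A krona of debt stays a krona; the real goods it commands rise as prices fall."""
    return nominal_debt / price_index

# Price index falls from 100 to 80 (a 20% deflation): the real burden rises by 25%.
before = real_debt_burden(100.0, 100.0)
after = real_debt_burden(100.0, 80.0)
print(after / before)  # 1.25
```

Since wages and incomes fall with the price level while debts do not, deflation redistributes from debtors to creditors — Fisher’s debt-deflation spiral in miniature.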

So why would anyone want to reinstate a gold standard? The best surmise is probably that it’s a question of ideology and politics. Libertarians and market fundamentalists who advocate a return to gold want to restrict governments’ possibilities of intervening in the economy and – even more rigidly than “independent” central banks do – force countries to pursue restrictive economic policies that keep inflation down at all costs.

Still not convinced of why a return to gold is a bad idea? Then, at least, remember what Keynes wrote in The Economic Consequences of Mr Churchill (1925):

We stand midway between two theories of economic society. The one theory maintains that wages should be fixed by reference to what is ’fair’ and ’reasonable’ as between classes. The other theory–the theory of the economic juggernaut–is that wages should be settled by economic pressure, otherwise called ’hard facts’, and that our vast machine should crash along, with regard only to its equilibrium as a whole, and without attention to the chance consequences of the journey to individual groups. The gold standard, with its dependence on pure chance, its faith in the ’automatic adjustments’, and its general regardlessness of social detail, is an essential emblem and idol of those who sit in the top tier of the machine. I think that they are immensely rash… in their comfortable belief that nothing really serious ever happens. Nine times out of ten, nothing really does happen–merely a little distress to individuals or to groups. But we run a risk of the tenth time (and stupid into the bargain), if we continue to apply the principles of an economics, which was worked out on the hypothesis of laissez-faire and free competition, to a society which is rapidly abandoning these hypotheses.

So, next time you want to come up with some new idea on how to solve our economic problems with a magic gold bullet, remember: new economic thinking starts with reading old books! Why not start with the best there are – those written by John Maynard Keynes.

All cried out

26 August, 2014 at 08:15 | Posted in Varia | Comments Off on All cried out

 

That voice still moves me every time I hear it

Great Expectations

25 August, 2014 at 17:32 | Posted in Varia | Comments Off on Great Expectations

 

Piketty — a red rag to DN’s editorial page

25 August, 2014 at 12:42 | Posted in Economics | 1 Comment

A few weeks ago we could read a Sunday column in Sydsvenskan — written by former editorial writer Per T Ohlsson — about Thomas Piketty’s Capital in the Twenty-First Century. The following little passage was illuminating:

One weakness of Capital in the Twenty-First Century, however, is that Piketty’s alarmist conclusions do not rest on historical data but on theoretical models, “laws”, that he himself has constructed on the basis of highly debatable assumptions about saving and growth. Moreover, Piketty disregards factors that slow down the spread of inequality he finds preordained, such as education and new technology. Several heavyweight economists have pointed this out, among them the Swede Per Krusell together with his Yale colleague Tony Smith.

And today we can read on DN’s editorial page that the conclusions about the economy’s future that Thomas Piketty presents in his book Capital in the Twenty-First Century do not, on closer examination, square with reality.

And what is this conclusion based on? The same source as Per T Ohlsson — Per Krusell.

As readers of this blog will have noticed during the summer’s lively scholarly debate over Piketty’s book — see e.g. here, here and here — Krusell’s poorly founded, and in some respects plainly erroneous, model assumptions are far from consistent with reality. As the American economist Brad DeLong has convincingly shown, the numerical values Krusell works with in his model-based attempts to refute Piketty have, to put it mildly, little grounding in reality:

As time passes, it seems to me that a larger and larger fraction of Piketty’s critics are making arguments that really make no sense at all–that I really do not understand how people can believe them, or why anybody would think that anybody else would believe them. Today we have Per Krusell and Tony Smith assuming that the economy-wide capital depreciation rate δ is not 0.03 or 0.05 but 0.1–and it does make a huge difference…

Per Krusell and Tony Smith: Piketty’s ‘Second Law of Capitalism’ vs. standard macro theory:

“Piketty’s forecast does not rest primarily on an extrapolation of recent trends … [which] is what one might have expected, given that so much of the book is devoted to digging up and displaying reliable time series…. Piketty’s forecast rests primarily on economic theory. We take issue…. Two ‘fundamental laws’, as Piketty dubs them… asserts that K/Y will, in the long run, equal s[net]/g…. Piketty… argues… s[net]/g… will rise rapidly in the future…. Neither the textbook Solow model nor a ‘microfounded’ model of growth predicts anything like the drama implied by Piketty’s theory…. Theory suggests that the wealth–income ratio would increase only modestly as growth falls …”

And if we go looking for why they believe that “theory suggests that the wealth–income ratio would increase only modestly as growth falls”, we find:

Per Krusell and Tony Smith: Is Piketty’s “Second Law of Capitalism” Fundamental? :

“In the textbook model … the capital-to-income ratio is not s[net]/g but rather s[gross]/(g+δ), where δ is the rate at which capital depreciates. With the textbook formula, growth approaching zero would increase the capital-output ratio but only very marginally; when growth falls all the way to zero, the denominator would not go to zero but instead would go from, say 0.12–with g around 0.02 and δ=0.1 as reasonable estimates–to 0.1.”

But with an economy-wide capital output ratio of 4-6 and a depreciation rate of 0.1, total depreciation–the gap between NDP and GDP–is not its actual 15% of GDP, but rather 40%-60% of GDP. If the actual depreciation rate were what Krusell and Smith say it is, fully half of our economy would be focused on replacing worn-out capital.

It isn’t … That makes no sense at all.

For the entire economy, one picks a depreciation rate of 0.02 or 0.03 or 0.05, rather than 0.10.

I cannot understand how anybody who has ever looked at the NIPA, or thought about what our capital stock is made of, would ever get the idea that the economy-wide depreciation rate δ=0.1.

And if you did think that for an instant, you would then recognize that you have just committed yourself to the belief that NDP is only half of GDP, and nobody thinks that–except Krusell and Smith. Why do they think that? Where did their δ=0.1 estimate come from? Why didn’t they immediately recognize that that deprecation estimate was in error, and correct it?

Why would anyone imagine that any growth model should ever be calibrated to such an extraordinarily high depreciation rate? …

I really do not understand what is going on here at all…

Brad DeLong
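DeLong’s arithmetic above is easy to reproduce: depreciation as a share of GDP is simply δ times the capital-output ratio.

```python
def depreciation_share_of_gdp(capital_output_ratio, delta):
    """Depreciation D = delta * K, so as a share of GDP: D/Y = delta * (K/Y)."""
    return delta * capital_output_ratio

# Krusell and Smith's delta = 0.1 with a realistic K/Y of 4-6:
print(f"{depreciation_share_of_gdp(4, 0.10):.0%}")  # 40%
print(f"{depreciation_share_of_gdp(6, 0.10):.0%}")  # 60%

# A conventional delta of about 0.03 with K/Y = 5 matches the observed ~15%:
print(f"{depreciation_share_of_gdp(5, 0.03):.0%}")  # 15%
```

The δ = 0.1 calibration thus implies a gap between GDP and NDP three to four times larger than the one we actually observe in the national accounts.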

When it comes down to it, the “heavyweight” objections to Piketty turn out in fact to carry very little weight …

Thomas Piketty has raised important questions about how our economy best serves humanity, and — unlike DN’s editorial, which against better judgment chooses to lean on views put forward by an economist whose arguments have long since been rebutted in the research community and shown to be poorly founded — he offers meaningful answers to those questions.

And even if one is not convinced that Piketty’s answers are right, it is worth remembering what Jesper Roine writes apropos his recently published Swedish summary of Piketty’s book:

The hope in writing this summary has been that those who have not had time to read the whole English (or French) original can quickly get their bearings in the arguments, but above all that it will entice more people to read Piketty’s book and think about these questions. As Larry Summers very aptly put it in a comment on Piketty’s book: “Books that give the definitive answer to a question are important. Books that pose new questions are more important.”

[För ekonomer: Krusell och andra kritiker av Piketty verkar tro att frågan handlar om teori och att Piketty på något sätt skulle ha felspecificerat standardtillväxtmodellen. Piketty talar i själva verket inte speciellt mycket om (Solows) standardtillväxtmodell i boken, men låt oss bara för skojs skull se vad en sådan neoklassisk tillväxtmodell säger. Anta att vi har en produktionsfunktion med homogenitetsgrad ett och obegränsad substituerbarhet — exempelvis en standard Cobb-Douglas produktionsfunktion (med A som en given produktivitetsparameter, och k som kvoten mellan kapital och arbete, K/L) y = Akα , med konstant investering λ av y och en konstant deprecieringskvot δ av “kapital per arbete” k, där ackumulationskvoten för k, Δk = λy– δk, är lika med Δk = λAkα– δk. I “steady state” (*) har vi λAk*α = δk*, vilket ger λ/δ = k*/y* och k* = (λA/δ)1/(1-α). Sätter vi in detta värdet av k* i produktions funktionen, får vi en “steady state” output per arbete y* = Ak*α= A1/(1-α)(λ/δ))α/(1-α). Under antagande av att vi har en exogen “Harrod-neutral” teknologiutveckling som ökar y med tillväxttakten g (under antagande av nolltillväxt i arbete och y och k a fortiori omdefinierat som respektive y/A och k/A, vilket ger produktionsfunktionen y = kα) får vi dk/dt = λy – (g + δ)k, vilket i Cobb-Douglas fallet dk/dt = λkα– (g + δ)k, med “steady state” värdet k* = (λ/(g + δ))1/(1-α) och kapital-outputkvoten k*/y* = k*/k*α = λ/(g + δ). Om vi använder oss av den av Piketty föredragna modellen där kapital och output ges i nettotermer (efter depreciering) måste vi ändra det senare uttrycket till k*/y* = k*/k*α = λ/(g + λδ). Nu gör Piketty förutsägelsen att g faller och att detta ökar kapital-outputkvoten. Låt oss säga att δ = 0.03, λ = 0.1 och g = 0.03 initialt. Detta ger en kapital-outputkvot på ca 3. Om g faller till 0.01 ökar kvoten till ca 7.7. Vi får analoga resultat om vi använder en s.k. “CES produktionsfunktion” med en substitutionselasticitet σ > 1. 
Med σ = 1.5, ökar kapitalandelen av outputen från 0.2 till 0.36 om “förmögenhets-inkomstkvoten” går från 2.5 to 5, vilket enligt Piketty är vad som faktiskt hände i de rika länderna under den senaste fyrtioårsperioden.]
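The arithmetic in the note above can be checked in a few lines. A minimal sketch using the parameter values given there (function names are my own):

```python
# Steady-state capital-output ratio in a Solow model with Cobb-Douglas
# production: k*/y* = s/(g + d) in the gross formulation, and
# k*/y* = s/(g + s*d) when capital and output are measured net of
# depreciation (the formulation used in the note above).
def gross_ratio(s, g, d):
    return s / (g + d)

def net_ratio(s, g, d):
    return s / (g + s * d)

s, d = 0.10, 0.03   # saving rate and depreciation rate from the note
for g in (0.03, 0.01):
    print(g, round(gross_ratio(s, g, d), 2), round(net_ratio(s, g, d), 2))
```

With these numbers the net formulation takes the ratio from about 3 to about 7.7 as g falls from 0.03 to 0.01, matching the figures in the note.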

Ricardian equivalence and DSGE models

24 August, 2014 at 18:16 | Posted in Economics | 2 Comments


Benchmark DSGE models have paid little attention to the role of fiscal policy, thereby minimising any possible interaction of fiscal policies with monetary policy. This has been partly because of the assumption of Ricardian equivalence. As a result, the distribution of taxes across time becomes irrelevant, and aggregate financial wealth does not matter for the behavior of agents or for the dynamics of the economy, because bonds do not represent net real wealth for households.

Incorporating the role of fiscal policies more meaningfully requires abandoning frameworks with Ricardian equivalence. The question is how to break Ricardian equivalence. Two possibilities are available. The first is to move to an overlapping generations framework and the second (which has been the most common way of handling the problem) is to rely on an infinite-horizon model with a type of liquidity-constrained agents (e.g. “rule-of-thumb agents”).

Camillo Tovar

Ricardian equivalence basically means that financing government expenditures through taxes or debt is equivalent, since debt financing must be repaid with interest, and agents — equipped with rational expectations — would only increase savings in order to be able to pay the higher taxes in the future, thus leaving total expenditures unchanged.

There is, of course, no reason for us to believe in that fairy-tale. Ricardo himself — mirabile dictu — didn’t believe in Ricardian equivalence. In “Essay on the Funding System” (1820) he wrote:

But the people who paid the taxes never so estimate them, and therefore do not manage their private affairs accordingly. We are too apt to think that the war is burdensome only in proportion to what we are at the moment called to pay for it in taxes, without reflecting on the probable duration of such taxes. It would be difficult to convince a man possessed of £20,000, or any other sum, that a perpetual payment of £50 per annum was equally burdensome with a single tax of £1000.
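Ricardo’s figures can be checked directly: at an interest rate of 5 per cent (an assumption chosen to match his numbers), the present value of a perpetual £50 a year is exactly the £1,000 lump sum, which is what makes the two formally “equivalent”:

```python
# Present value of a perpetuity: PV = payment / r.
def perpetuity_pv(payment, r):
    return payment / r

# Ricardo's example: a perpetual 50 pounds a year vs a one-off 1000 pound tax.
# The present values coincide, yet taxpayers do not treat them as equivalent.
print(perpetuity_pv(50, 0.05))
```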

And as one Nobel Prize laureate had it:

Ricardian equivalence is taught in every graduate school in the country. It is also sheer nonsense.

Joseph E. Stiglitz, Twitter

So, I totally agree that macroeconomic models have to abandon Ricardian equivalence nonsense. But replacing it with “overlapping generations” and “infinite-horizon” models — isn’t that — in terms of realism and relevance — just getting out of the frying pan into the fire?

Sydsvenskan’s Per T Ohlsson — yet another nonsense-prattling buffoon

24 August, 2014 at 11:18 | Posted in Politics & Society | 2 Comments

One of the country’s best-paid journalists — Per T Ohlsson — serves up, in today’s instalment of his recurring Sunday column in Sydsvenskan, an even more poorly substantiated piece than usual.

This time it is about the Left Party’s no to profits in the tax-funded welfare sector.

The following little passage is illustrative:

If the Left Party got its way, an entire industry with 11,000 companies and 160,000 employees would die. The state and the municipalities would then have to take over the operations at a cost of several tens of billions of kronor …

Why a major sector like welfare would fare better if exempted from every trace of profit motive is incomprehensible. But that is what happens when blind ideology is put before reality.

Sweden is admittedly a small country, but our market-fundamentalist auxiliaries still manage to dredge up dunce-capped national buffoons of the calibre of Per T Ohlsson and Carl B Hamilton. Impressive.

As for the substance of the matter, it turns out, when it comes down to it, that Per T is — as usual — completely at sea.

Profit-making companies in the health-care and school sectors have been much debated over the past year. Many people are rightly upset.

Many of those working in schools or in the health-care sector have found it hard to understand the position of the Alliance and the Social Democrats on privatisations and profit extraction in the soft welfare sector. For some inscrutable reason they have for many years argued that profits should be allowed in schools and care companies. The argument has often been that the form of operation does not matter. That is not the case. The form of operation, and allowing profit in welfare, certainly does matter. And its effect is negative.

The Alliance and the Social Democrats are admittedly far from alone in their vacillation. From the Confederation of Swedish Enterprise and the country’s editorial writers comes a steady stream of demands for more control, tougher scrutiny and inspections.

But wait a minute! Was it not the case that when the system shift in the welfare sector was launched in the 1990s, a common argument for the privatisations was precisely that we would escape the costs of bureaucratic logic — rules, controls and follow-ups? Competition — that panacea of market fundamentalism — was supposed to make operations more efficient and raise the quality of the services. Market logic would drive out the “bureaucratic” and unwieldy public operations, leaving only the good companies that “freedom of choice” had made possible.

And now that the Panglossian privatisation dream has turned out to be a nightmare, the very things we wanted to get rid of — rules and “bureaucratic” supervision and control — are supposed to be the solution?

One can only shake one’s head — and for many reasons!

For if the proposed packages of measures are implemented, one has to wonder what becomes of that efficiency gain. Controls, contract specifications, inspections and so on cost money, and how much of a surplus will the privatisations yield once these costs are also entered into the cost-benefit analysis? And how much is that “freedom of choice” worth when we see, time and again, that it merely results in operations where profit is generated through cost cutting and lowered quality?

All economic activity is based on, or involves, some form of delegation. One party (the principal, the commissioner) wants another party (the agent, the contractor) to carry out a certain task. The fundamental problem is how the principal is to get the agent to carry out the task in the way the principal wishes …

There is an obvious danger in basing remuneration systems on simple objective measures when what we actually want to reward has several complex dimensions — payment per discharged patient, teacher salaries tied to grades, and the like. Municipal services often have this “multi-task” character, and incentive contracts or commissions then often do not work. In such cases “bureaucracies” can be more fit for purpose than markets …

Efficient resource use can never be an end in itself. It can, however, be a necessary means of reaching set goals. Whether or not to have a welfare state is therefore at bottom not only a question of economic efficiency, but also of our conceptions of a dignified life, justice and equal treatment.

Lars Pålsson Syll et al., Vad bör kommunerna göra? (Jönköping University Press, 2002)

So the fundamental question is not whether tax-funded private companies should be allowed to extract profits, or whether tougher control and inspection is needed. The fundamental question is whether the logic of the market and of privatisation should govern our welfare institutions, or whether that governance should come through the logic of democracy and politics. The fundamental question is whether the common welfare sector is to be governed by democracy and politics or by the market.

No one should waver on this question, especially not after reading the following passage by perhaps the foremost living economist, the 1972 Nobel laureate Kenneth Arrow, who in a classic work on the economics of health care wrote these wise words as early as 1963:

Under ideal insurance the patient would actually have no concern with the informational inequality between himself and the physician, since he would only be paying by results anyway, and his utility position would in fact be thoroughly guaranteed. In its absence he wants to have some guarantee that at least the physician is using his knowledge to the best advantage. This leads to the setting up of a relationship of trust and confidence, one which the physician has a social obligation to live up to … The social obligation for best practice is part of the commodity the physician sells, even though it is a part that is not subject to thorough inspection by the buyer.

One consequence of such trust relations is that the physician cannot act, or at least appear to act, as if he is maximizing his income at every moment of time. As a signal to the buyer of his intentions to act as thoroughly in the buyer’s behalf as possible, the physician avoids the obvious stigmata of profit-maximizing … The very word, ‘profit’ is a signal that denies the trust relation.

Kenneth Arrow, “Uncertainty and the Welfare Economics of Medical Care”, American Economic Review, 53 (5).

So let us quietly pray that Mr Ohlsson, the next time he sits down to write a column, first checks what the research says. That yields far more than self-satisfied nonsense-prattle!

RCTs — false validity claims

23 August, 2014 at 19:33 | Posted in Theory of Science & Methodology | 4 Comments

As yours truly has repeatedly argued (here, here and here) on this blog, RCTs usually do not provide evidence that their results are exportable to other target systems. The almost religious zeal with which its proponents promote it cannot hide the fact that RCTs cannot be taken for granted to give generalizable results. That something works somewhere is no guarantee that it will work for us, or that it works generally.

In an extremely interesting article on the grand claims to external validity often raised by advocates of RCTs, Lant Pritchett and Justin Sandefur now confirm this view and show that using an RCT is not at all the “gold standard” it is portrayed as:


Our point here is not to argue against any well-founded generalization of research findings, nor against the use of experimental methods. Both are central pillars of scientific research. As a means of quantifying the impact of a given development project, or measuring the underlying causal parameter of a clearly-specified economic model, field experiments provide unquestioned advantages over observational studies.

But the popularity of RCTs in development economics stems largely from the claim that they provide a guide to making “evidence-based” policy decisions. In the vast majority of cases, policy recommendations based on experimental results hinge not only on the internal validity of the treatment effect estimates, but also on their external validity across contexts.

Inasmuch as development economics is a worthwhile, independent field of study – rather than a purely parasitic form of regional studies, applying the lessons of rich-country economies to poorer settings – its central conceit is that development is different. The economic, social, and institutional systems of poor countries operate differently than in rich countries in ways that are sufficiently fundamental to require different models and different data.

It is difficult if not impossible to adjudicate the external validity of an individual experimental result in isolation. But experimental results do not exist in a vacuum. On many development policy questions, the literature as a whole — i.e., the combination of experimental and non-experimental results across multiple contexts — collectively invalidates any claim of external validity for any individual experimental result.

Lant Pritchett & Justin Sandefur
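Pritchett and Sandefur’s point can be illustrated with a toy simulation (all numbers hypothetical): a well-run RCT recovers the true treatment effect in its own context, but that internally valid estimate says nothing about a second context where the true effect differs.

```python
import random

random.seed(1)

def rct_estimate(true_effect, n=10_000):
    """Difference in means between a treated and a control group."""
    treated = [true_effect + random.gauss(0, 1) for _ in range(n)]
    control = [random.gauss(0, 1) for _ in range(n)]
    return sum(treated) / n - sum(control) / n

# Context A: the trial is internally valid and recovers an effect near 2.0 ...
estimate_a = rct_estimate(true_effect=2.0)

# ... but in context B the true effect is 0, so exporting estimate_a misleads.
true_effect_b = 0.0
print(round(estimate_a, 2), true_effect_b)
```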

Five reasons to question randomised controlled trials

23 August, 2014 at 10:17 | Posted in Theory of Science & Methodology | Comments Off on Five reasons to question randomised controlled trials

Yours truly has questioned the value of randomised controlled trials (RCTs) on philosophy-of-science and methodological grounds in a number of posts on this blog — see e.g. here, here and here. In a guest post at Ekonomistas well worth reading, Björn Ekman strongly questions the value of RCTs as guidance for development-aid work:

Randomisation is difficult. It is in fact so tricky to achieve a “clean” randomisation that it can be a challenge even in the laboratory — that is, where the method first arose and where one can, in principle, control for all the factors one wants to take into account …

Randomisation is not needed …

Randomisation answers the relevant questions only in part. One problem with RCT interventions that even many “randomistas” at least partly concede is the relatively narrow set of questions such a study answers. Roughly put, a randomised study tells us whether this particular intervention worked in this particular context at this particular point in time — that is, it has high internal validity. Those are not unimportant answers, but they do not tell us whether the intervention should be extended to other areas and conditions. Answering such questions requires studies with high external validity.

Randomisation is not applicable in large parts of development aid. It is quite possible, perhaps even likely, that there are aid-financed projects and interventions that ought to be designed and carried out in a randomised way in order to evaluate their effects. But those interventions probably make up a minority of everything that aid finances. Most of it cannot be carried out in a randomised and controlled manner.

Moreover, there are obvious ethical problems with an intervention in which one group is given a demonstrably effective treatment while another group is denied it …

Randomisation is not cost-effective. Carrying out a randomised controlled trial is expensive — very expensive.

Annie Lööf is — as usual — unlucky when she tries to think

23 August, 2014 at 09:49 | Posted in Politics & Society | Comments Off on Annie Lööf is — as usual — unlucky when she tries to think

In her summer speech and in the speech she recently gave at the Centre Party’s election convention, Annie Lööf told of a young woman she had met. The woman had gone from the Phase 3 labour-market programme to running her own business and was now worried that her homework-help company would be forced to close if the Alliance loses the election. The woman is thus held up as one of all the small-business owners the Centre Party is campaigning to support and whom the red-greens supposedly want to force off the labour market. Lööf relates that the woman now feels she measures up, that she thinks she “earns so much money” — because last year she brought in SEK 50,000, that is, SEK 4,167 a month. That is less than the lowest activity benefit …

Lucia Ramirez Tuesta, a youth recreation leader in Vårberg … agrees that it must feel good to get out of Phase 3.

– Being able to support yourself is of course a relief after Phase 3, but I think it is insane that it is so little money. I have earned that much a few times when I was paid by the hour and was not given many shifts — after buying an SL travel card, some food and paying one bill, there was no money left. Often it did not even cover the bills.

She wonders how Lööf, who earns SEK 161,000 a month, would manage on such a wage.

– I would love to see Annie Lööf live on SEK 4,000 a month. Would she manage to eat, get to work and have a dignified life outside it? It is utterly idiotic to hold this up as something positive. In the end it comes down to the money — we live in a society where money determines whether you have a dignified life or not, and I wonder how this person gets by.

ETC

Then the chilly winds blew down (private)

23 August, 2014 at 09:15 | Posted in Varia | Comments Off on Then the chilly winds blew down (private)


In loving memory of my brother

Econometric business cycle research

21 August, 2014 at 19:33 | Posted in Statistics & Econometrics | 1 Comment

The wide conviction of the superiority of the methods of science has converted the econometric community largely into a group of fundamentalist guards of mathematical rigour. It is often the case that mathematical rigour is held as the dominant goal and the criterion for research topic choice as well as research evaluation, so much so that the relevance of the research to business cycles is reduced to empirical illustrations. To that extent, probabilistic formalization has trapped econometric business cycle research in the pursuit of means at the expense of ends.


Once the formalization attempts have gone significantly astray from what is needed for analysing and forecasting the multi-faceted characteristics of business cycles, the research community should hopefully make appropriate ‘error corrections’ of its overestimation of the power of a priori postulated models as well as its underestimation of the importance of the historical approach, or the ‘art’ dimension of business cycle research.

Duo Qin, A History of Econometrics (OUP 2013)

Econometric forecasting — a retrospective assessment

20 August, 2014 at 12:11 | Posted in Statistics & Econometrics | Comments Off on Econometric forecasting — a retrospective assessment

There have been over four decades of econometric research on business cycles … The formalization has undeniably improved the scientific strength of business cycle measures …

But the significance of the formalization becomes more difficult to identify when it is assessed from the applied perspective, especially when the success rate in ex-ante forecasts of recessions is used as a key criterion. The fact that the onset of the 2008 financial-crisis-triggered recession was predicted by only a few ‘Wise Owls’ … while missed by regular forecasters armed with various models serves as the latest warning that the efficiency of the formalization might be far from optimal. Remarkably, not only has the performance of time-series data-driven econometric models been off the track this time, so has that of the whole bunch of theory-rich macro dynamic models developed in the wake of the rational expectations movement, which derived its fame mainly from exploiting the forecast failures of the macro-econometric models of the mid-1970s recession.

The limits of econometric forecasting have, as noted by Qin, been critically pointed out many times before.

Trygve Haavelmo — with the completion (in 1958) of the twenty-fifth volume of Econometrica — assessed the role of econometrics in the advancement of economics, and although mainly positive about the “repair work” and “clearing-up work” done, Haavelmo also found some grounds for despair:

We have found certain general principles which would seem to make good sense. Essentially, these principles are based on the reasonable idea that, if an economic model is in fact “correct” or “true,” we can say something a priori about the way in which the data emerging from it must behave. We can say something, a priori, about whether it is theoretically possible to estimate the parameters involved. And we can decide, a priori, what the proper estimation procedure should be … But the concrete results of these efforts have often been a seemingly lower degree of accuracy of the would-be economic laws (i.e., larger residuals), or coefficients that seem a priori less reasonable than those obtained by using cruder or clearly inconsistent methods.

There is the possibility that the more stringent methods we have been striving to develop have actually opened our eyes to recognize a plain fact: viz., that the “laws” of economics are not very accurate in the sense of a close fit, and that we have been living in a dream-world of large but somewhat superficial or spurious correlations.

And as the quote below shows, even Ragnar Frisch shared some of Haavelmo’s — and Keynes’s — doubts on the applicability of econometrics:

I have personally always been skeptical of the possibility of making macroeconomic predictions about the development that will follow on the basis of given initial conditions … I have believed that the analytical work will give higher yields – now and in the near future – if they become applied in macroeconomic decision models where the line of thought is the following: “If this or that policy is made, and these conditions are met in the period under consideration, probably a tendency to go in this or that direction is created”.

Ragnar Frisch

Econometrics may be an informative tool for research. But if its practitioners do not investigate and provide a justification for the credibility of the assumptions on which they erect their building, it will not fulfill its tasks. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics will continue to consider its ultimate argument a mixture of rather unhelpful metaphors and metaphysics. Maintaining that economics is a science in the “true knowledge” business, I remain a skeptic of the pretences and aspirations of econometrics. So far, I cannot really see that it has yielded very much in terms of relevant, interesting economic knowledge. And, more specifically, when it comes to forecasting activities, the results have been bleak indeed.

The marginal return on its ever higher technical sophistication in no way makes up for the lack of serious under-labouring of its deeper philosophical and methodological foundations that already Keynes complained about. The rather one-sided emphasis on usefulness and its concomitant instrumentalist justification cannot hide the fact that the legions of probabilistic econometricians who give supportive evidence for considering it “fruitful to believe” in the possibility of treating unique economic data as the observable results of random drawings from an imaginary sampling of an imaginary population are skating on thin ice. After having analyzed some of its ontological and epistemological foundations, I cannot but conclude that econometrics on the whole has not delivered “truth,” nor robust forecasts. And I doubt if that has ever been the intention of its main protagonists.

Our admiration for technical virtuosity should not blind us to the fact that we have to have a more cautious attitude towards probabilistic inference of causality in economic contexts. Science should help us penetrate to — as Keynes put it — “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts.” We should look out for causal relations, but econometrics can never be more than a starting point in that endeavour, since econometric (statistical) explanations are not explanations in terms of mechanisms, powers, capacities or causes. Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that there are other variables — of vital importance and, although perhaps unobservable and non-additive, not necessarily epistemologically inaccessible — that were not considered for the model. The variables that were considered can hence never be guaranteed to be more than potential, rather than real, causes.

A rigorous application of econometric methods in economics really presupposes that the phenomena of our real world economies are ruled by stable causal relations between variables. A perusal of the leading econom(etr)ic journals shows that most econometricians still concentrate on fixed parameter models and that parameter-values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption one, however, has to convincingly establish that the targeted acting causes are stable and invariant so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.

This is a more fundamental and radical problem than the celebrated “Lucas critique” has suggested. It is not a question of whether deep parameters, absent at the macro level, exist in “tastes” and “technology” at the micro level. It goes deeper. Real-world social systems are not governed by stable causal mechanisms or capacities. It is the criticism that Keynes — in Essays in Biography — first launched against econometrics and inferential statistics as early as the 1920s:

The atomic hypothesis which has worked so splendidly in Physics breaks down in Psychics. We are faced at every turn with the problems of Organic Unity, of Discreteness, of Discontinuity – the whole is not equal to the sum of the parts, comparisons of quantity fail us, small changes produce large effects, the assumptions of a uniform and homogeneous continuum are not satisfied. Thus the results of Mathematical Psychics turn out to be derivative, not fundamental, indexes, not measurements, first approximations at the best; and fallible indexes, dubious approximations at that, with much doubt added as to what, if anything, they are indexes or approximations of.

The kinds of laws and relations that econom(etr)ics has established are laws and relations about entities in models that presuppose causal mechanisms that are atomistic and additive. When causal mechanisms operate in real-world social target systems they do so only in ever-changing and unstable combinations where the whole is more than a mechanical sum of its parts. If economic regularities obtain, they do so (as a rule) only because we engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately, that also makes most of the achievements of econometrics — like most of contemporary economic theoretical modeling — rather useless.
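The parameter-stability point can be made concrete with a toy regression (hypothetical data): when the true coefficient drifts over time, a fixed-parameter model estimates an average that is systematically wrong at the end of the sample, which is exactly where forecasts are made.

```python
import random

random.seed(42)

# Data-generating process with a drifting coefficient: y_t = b_t * x_t + noise.
# There is no stable parameter for the econometrician to estimate.
n = 200
xs = [random.uniform(0, 10) for _ in range(n)]
bs = [1.0 + 0.01 * t for t in range(n)]        # b drifts from 1.0 to about 3.0
ys = [b * x + random.gauss(0, 1) for b, x in zip(bs, xs)]

# OLS through the origin presupposes one fixed coefficient for all periods.
b_hat = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# b_hat averages over the drift, so forecasts for the latest period,
# where the true coefficient is about 3, are systematically biased.
print(round(b_hat, 2), round(bs[-1], 2))
```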

Leontief’s uneasy feeling about the state of economics

19 August, 2014 at 18:25 | Posted in Economics | 1 Comment

Much of current academic teaching and research has been criticized for its lack of relevance, that is, of immediate practical impact … I submit that the consistently indifferent performance in practical applications is in fact a symptom of a fundamental imbalance in the present state of our discipline. The weak and all too slowly growing empirical foundation clearly cannot support the proliferating superstructure of pure, or should I say, speculative economic theory …

Uncritical enthusiasm for mathematical formulation tends often to conceal the ephemeral substantive content of the argument behind the formidable front of algebraic signs … In the presentation of a new model, attention nowadays is usually centered on a step-by-step derivation of its formal properties. But if the author — or at least the referee who recommended the manuscript for publication — is technically competent, such mathematical manipulations, however long and intricate, can even without further checking be accepted as correct. Nevertheless, they are usually spelled out at great length. By the time it comes to interpretation of the substantive conclusions, the assumptions on which the model has been based are easily forgotten. But it is precisely the empirical validity of these assumptions on which the usefulness of the entire exercise depends.

What is really needed, in most cases, is a very difficult and seldom very neat assessment and verification of these assumptions in terms of observed facts. Here mathematics cannot help and because of this, the interest and enthusiasm of the model builder suddenly begins to flag: “If you do not like my set of assumptions, give me another and I will gladly make you another model; have your pick.” …

But shouldn’t this harsh judgment be suspended in the face of the impressive volume of econometric work? The answer is decidedly no. This work can be in general characterized as an attempt to compensate for the glaring weakness of the data base available to us by the widest possible use of more and more sophisticated statistical techniques. Alongside the mounting pile of elaborate theoretical models we see a fast-growing stock of equally intricate statistical tools. These are intended to stretch to the limit the meager supply of facts … Like the economic models they are supposed to implement, the validity of these statistical tools depends itself on the acceptance of certain convenient assumptions pertaining to stochastic properties of the phenomena which the particular models are intended to explain; assumptions that can be seldom verified.

Wassily Leontief

The buffoons’ buffoon — Carl B Hamilton

17 August, 2014 at 23:58 | Posted in Politics & Society | 1 Comment

dumstrut-2Folkpartiets ekonomiske talesman, Carl Bastiat Hamilton, är den grön-röda oppositionens starkaste kort i valrörelsen.

Inte nog med att herr Hamilton med en dåres envishet år efter återkommer med sina eurostollerier om att Sverige av “solidaritetsskäl” bör gå med i EMU. I gårdagens tv-sända ekonomidebatt fjantade han till en politisk motståndares argumentation kring det stötande  i att låta privatägda riskbolag sko sig på skattepengar och göra vinster inom skola och omvård — och hävdade att personen ifråga då istället alltså skulle tala sig varm för att vård och skola ska gå med förlust. Oh Herre du min milde! Och detta sanslösa fjant ska man behöva sitta och lyssna på i något som väl ändå försökte föreställa en seriös debatt.

Mr Hamilton is — to travesty Torgny Segerstedt — an insult. I do not know whether he is genuinely stupid or just has bad luck when he tries to think. But by all means — keep sending him out during the election campaign. That is probably the surest guarantee that he and his Alliance colleagues will not have any influence whatsoever over our economy for the next four years …

Wren-Lewis once more on why we pursue worthless forecasting

17 August, 2014 at 15:37 | Posted in Economics | 5 Comments

Simon Wren-Lewis today has yet another post up on forecasting activities. In an earlier post yours truly criticized his views, and now he seems to admit that there might be a point in questioning an activity like forecasting, which admittedly has little value and comes up with results no better than "intelligent guessing." Forecasting obviously isn't — "trivial" or not — a costless activity, so why pay for it?

This time focusing on model-based forecasting by central banks, Wren-Lewis writes:

You can see the problem. By using an intelligent guess to forecast, the bank appears to be ignoring information, and it seems to be telling inconsistent stories. Central banks that are accountable do not want to get put in this position. From their point of view, it would be much easier if they used their main policy analysis model, plus judgement, to also make unconditional forecasts. They can always let the intelligent guesswork inform their judgement. If these forecasts are not worse than intelligent guesswork, then the cost to them of using the model to produce forecasts – a few extra economists – are trivial.

To me this whole discussion really underlines how important it is in social sciences — and economics in particular — to incorporate Keynes’s far-reaching and incisive analysis of induction and evidential weight in his seminal A Treatise on Probability (1921).

According to Keynes we live in a world permeated by unmeasurable uncertainty — not quantifiable stochastic risk — which often forces us to make decisions based on anything but "rational expectations." Keynes rather thinks that we base our expectations on the confidence or "weight" we put on different events and alternatives. To Keynes expectations are a question of weighing probabilities by "degrees of belief," beliefs that often have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents as modeled by "modern" social sciences. And often we "simply do not know."

How strange that social scientists and mainstream economists as a rule do not even touch upon these aspects of scientific methodology, which seem so fundamental and important for anyone trying to understand how we learn and orient ourselves in an uncertain world. An educated guess as to why this is so would be that Keynes's concepts cannot be squeezed into a single calculable numerical "probability." In the quest for measurable quantities, one turns a blind eye to qualities and looks the other way.

So why do companies, governments, and central banks, continue with this more or less expensive, but obviously worthless, activity?

A couple of months ago yours truly was interviewed by a public radio journalist working on a series on Great Economic Thinkers. We were discussing the monumental failures of the predictions-and-forecasts business. But — the journalist asked — if these cocksure economists with their "rigorous" and "precise" mathematical-statistical-econometric models are so wrong again and again, why do they persist wasting time on it?

In a discussion on uncertainty and the hopelessness of accurately modeling what will happen in the real world — in M. Szenberg’s Eminent Economists: Their Life Philosophies — Nobel laureate Kenneth Arrow comes up with what is probably the most plausible reason:

It is my view that most individuals underestimate the uncertainty of the world. This is almost as true of economists and other specialists as it is of the lay public. To me our knowledge of the way things work, in society or in nature, comes trailing clouds of vagueness … Experience during World War II as a weather forecaster added the news that the natural world was also unpredictable. An incident illustrates both uncertainty and the unwillingness to entertain it. Some of my colleagues had the responsibility of preparing long-range weather forecasts, i.e., for the following month. The statisticians among us subjected these forecasts to verification and found they differed in no way from chance. The forecasters themselves were convinced and requested that the forecasts be discontinued. The reply read approximately like this: 'The Commanding General is well aware that the forecasts are no good. However, he needs them for planning purposes.'

To this one might also add some concerns about ideology and apologetics. Forecasting is a non-negligible part of the labour market for (mainstream) economists, and so, of course, those in the business do not want to admit that they are occupied with worthless things (not to mention how hard it would be to sell the product with that kind of frank truthfulness …). Governments, the finance sector and (central) banks also want to give the impression to customers and voters that they, so to speak, have the situation under control (telling people that next year's X will be 3.048 % works wonders in that respect). Why else would anyone want to pay them or vote for them? These are surely not glamorous aspects of economics as a science, but as a scientist it would be unforgivably dishonest to pretend that economics doesn't also perform an ideological function in society.

On the connection between neoclassical economics and neoliberalism

17 August, 2014 at 11:24 | Posted in Economics | Comments Off on On the connection between neoclassical economics and neoliberalism

Oxford professor Simon Wren-Lewis had a post up some time ago commenting on the traction-gaining "attacks on mainstream economics":

One frequent accusation … often repeated by heterodox economists, is that mainstream economics and neoliberal ideas are inextricably linked. Of course economics is used to support neoliberalism. Yet I find mainstream economics full of ideas and analysis that permits a wide ranging and deep critique of these same positions. The idea that the two live and die together is just silly.

Hmmm …

Silly? Maybe. But maybe Wren-Lewis and other economists who want to enlighten themselves on the subject also should take a look at this video:

Or maybe read this essay, where I try to analyze further — much inspired by the works of Amartya Sen — what kind of philosophical-ideological-political-economic doctrine neoliberalism is, and why it so often comes naturally to mainstream neoclassical economists to embrace neoliberal ideals.

Or maybe — if you know some Swedish — you could take a look at this book on the connection between the dismal science and neoliberalism (sorry for the shameless self-promotion).

Arvo Pärt & Stars of the Lid

17 August, 2014 at 11:07 | Posted in Varia | Comments Off on Arvo Pärt & Stars of the Lid

 

The love we put into giving

16 August, 2014 at 21:00 | Posted in Varia | Comments Off on The love we put into giving

 

Lovely video — and don’t even for a second think it’s only because I’m a Swede and was a happy owner of a similar Saab back in the 80s …

Confirmation (private)

16 August, 2014 at 19:55 | Posted in Varia | 1 Comment

My youngest daughter was confirmed today in All Saints Church in Lund, Sweden.
Congratulations Linnea!

The day I passed maths

16 August, 2014 at 19:37 | Posted in Varia | Comments Off on The day I passed maths

 
