Dann sind wir Helden

30 June, 2016 at 11:26 | Posted in Varia | Comments Off on Dann sind wir Helden



Mainstream economics — a pointless waste of time

29 June, 2016 at 12:04 | Posted in Economics | 4 Comments

Paul Krugman has a piece up on his blog arguing that the ‘discipline of modeling’ is a sine qua non for tackling politically and emotionally charged economic issues:

You might say that the way to go about research is to approach issues with a pure heart and mind: seek the truth, and derive any policy conclusions afterwards. But that, I suspect, is rarely how things work. After all, the reason you study an issue at all is usually that you care about it, that there’s something you want to achieve or see happen. Motivation is always there; the trick is to do all you can to avoid motivated reasoning that validates what you want to hear.

In my experience, modeling is a helpful tool (among others) in avoiding that trap, in being self-aware when you’re starting to let your desired conclusions dictate your analysis. Why? Because when you try to write down a model, it often seems to lead some place you weren’t expecting or wanting to go. And if you catch yourself fiddling with the model to get something else out of it, that should set off a little alarm in your brain.

Hmm …

So when Krugman and other ‘modern’ mainstream economists use their models — standardly assuming rational expectations, Walrasian market clearing, unique equilibria, time invariance, linear separability and homogeneity of both inputs/outputs and technology, infinitely lived intertemporally optimizing representative agents with homothetic and identical preferences, etc. — and standardly ignoring complexity, diversity, uncertainty, coordination problems, non-market clearing prices, real aggregation problems, emergence, expectations formation, etc. — we are supposed to believe that this somehow helps them ‘to avoid motivated reasoning that validates what you want to hear.’

Yours truly is, to say the least, far from convinced. The alarm that sets off in my brain is that this, rather than being helpful for understanding real world economic issues, sounds more like an ill-advised plaidoyer for voluntarily taking on a methodological straitjacket of unsubstantiated and known-to-be-false assumptions.

Let me just give two examples to illustrate my point.

In 1817 David Ricardo presented — in Principles — a theory meant to explain why countries trade and, based on the concept of opportunity cost, how the pattern of exports and imports is governed by countries exporting goods in which they have a comparative advantage and importing goods in which they have a comparative disadvantage.
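Ricardo’s opportunity-cost logic can be put in a few lines of code. The sketch below is purely illustrative (not from the post itself), using the labour-hour figures from Ricardo’s own England/Portugal example in Principles:

```python
# Labour hours needed to produce one unit of each good
# (Ricardo's canonical figures from Principles, ch. 7).
hours = {
    "England":  {"cloth": 100, "wine": 120},
    "Portugal": {"cloth": 90,  "wine": 80},
}

def opportunity_cost(country, good, other_good):
    """Units of other_good forgone to produce one unit of good."""
    return hours[country][good] / hours[country][other_good]

for country in hours:
    oc = opportunity_cost(country, "cloth", "wine")
    print(f"{country}: producing 1 unit of cloth costs {oc:.2f} units of wine")

# England's opportunity cost of cloth (100/120) is lower than Portugal's
# (90/80), so England exports cloth and imports wine -- even though
# Portugal needs fewer hours than England for both goods.
```

The point of the example is exactly the one Ricardo made: trade patterns follow *comparative*, not absolute, advantage.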

Ricardo’s theory of comparative advantage, however, didn’t explain why the comparative advantage was the way it was. At the beginning of the 20th century, two Swedish economists — Eli Heckscher and Bertil Ohlin — presented a theory/model/theorem according to which comparative advantages arise from differences in factor endowments between countries. Countries have a comparative advantage in producing goods that use the production factors that are most abundant in them. Countries would a fortiori mostly export goods that used the abundant factors of production and import goods that mostly used the factors of production that were scarce.

The Heckscher-Ohlin theorem — as do the elaborations on it by e.g. Vanek, Stolper and Samuelson — builds on a series of restrictive and unrealistic assumptions. The most critically important — besides the standard market-clearing equilibrium assumptions — are:

(1) Countries use identical production technologies.

(2) Production takes place with a constant returns to scale technology.

(3) Within countries the factor substitutability is more or less infinite.

(4) Factor-prices are equalised (the Stolper-Samuelson extension of the theorem).

These assumptions are, as almost all empirical testing of the theorem has shown, totally unrealistic. That is, they are empirically false. 

That said, one could indeed wonder why on earth anyone should be interested in applying this theorem to real world situations. Like so many other mainstream mathematical models taught to economics students today, this theorem has very little to do with the real world.

From a methodological point of view one can, of course, also wonder how we are supposed to evaluate tests of a theorem built on known-to-be-false assumptions. What is the point of such tests? What can those tests possibly teach us? From falsehoods anything logically follows.

Modern (expected) utility theory is a good example of this. Leaving the specification of preferences almost without any restrictions whatsoever, every imaginable piece of evidence is safely made compatible with the all-embracing ‘theory’ — and a theory without informational content never risks being empirically tested and found falsified. Used in mainstream economics’ ‘thought experimental’ activities, it may of course be very ‘handy’, but it is totally void of any empirical value.

Utility theory, like so many other economic theories, has morphed into an empty theory of everything. And a theory of everything explains nothing — just like Gary Becker’s ‘economics of everything’, it only makes nonsense of economic science.

Some people have trouble with the fact that, by allowing false assumptions, mainstream economists can generate whatever conclusions they want in their models.

But that’s really nothing very deep or controversial. What I’m referring to is the well-known ‘principle of explosion,’ according to which if both a statement and its negation are considered true, any statement whatsoever can be inferred.
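The principle of explosion can in fact be stated in a single line. Here is a minimal sketch in Lean 4 (my illustration, not from the post): from a proposition and its negation, any proposition whatsoever follows.

```lean
-- Ex falso quodlibet: given a proof of P and a proof of ¬P,
-- any proposition Q can be derived.
example (P Q : Prop) (hp : P) (hnp : ¬P) : Q :=
  absurd hp hnp
```

This is why testing a theorem built on premises known to be false is logically uninformative: with a contradiction among the premises, every conclusion is derivable.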

Whilst tautologies, purely existential statements and other nonfalsifiable statements assert, as it were, too little about the class of possible basic statements, self-contradictory statements assert too much. From a self-contradictory statement, any statement whatsoever can be validly deduced. Consequently, the class of its potential falsifiers is identical with that of all possible basic statements: it is falsified by any statement whatsoever.

On the question of tautology, I think it is only fair to say that the way axioms and theorems are formulated in mainstream (neoclassical) economics, they are often made tautological and informationally totally empty.

Using false assumptions, mainstream modelers can derive whatever conclusions they want. Wanting to show that ‘all economists consider austerity to be the right policy,’ simply assume, e.g., that ‘all economists are from Chicago’ and that ‘all economists from Chicago consider austerity to be the right policy.’ The conclusion follows by deduction — but is of course factually wrong. Models and theories built on that kind of reasoning are nothing but a pointless waste of time.

Why people have no faith in economics anymore

28 June, 2016 at 17:32 | Posted in Economics | Comments Off on Why people have no faith in economics anymore

In recent years the public has lost faith in the economics profession.

One reason for the lack of faith is the failure to predict the Great Recession, but the public’s dismissal of macroeconomists is based upon more than the failure to foresee the dangers the housing bubble posed for the economy. It is also due to false promises about the benefits to the working class from globalization, tax cuts for the wealthy, and trade agreements – promises that were often used to support ideological and political goals or to serve special interests.

In retrospect the evidence for the housing bubble was easy to see and a few people tried to sound the alarm but they were widely dismissed. Even when the warnings were taken seriously the belief was that the consequences from a housing bubble collapse would be relatively minor and confined to the housing sector. Very few people believed there would be a deep and long-lasting recession …

The arguments used to justify austerity policies put forth both by reputable economists and promoted by those with an ideological agenda for smaller government – the idea that reducing government deficits would create a confidence effect and stimulate the economy – turned out to be wrong. Repeated warnings about inflation due to quantitative easing from economists with standing in the profession, warnings that proved false again and again, provided ammunition to those with a vendetta against the Fed.

If we go back a bit further in time, it’s easy to find more examples. Globalization and international trade were supposed to make us all better off. There would be adjustment costs along the way that would hopefully be offset by government policies to help those who paid the cost of the transition, but in the long-run more trade would lift all boats. But that hasn’t happened. Wages for the majority of people have stagnated, there have been large job displacements that government policy has not done much to address, and the gains have gone to those at the top of the income distribution.

Tax cuts for the wealthy were supposed to stimulate growth and make everyone better off. There was dispute about this within the profession, but there were also many economists who provided intellectual support for the claim that tax cuts will create growth and widespread prosperity. The evidence from the Bush and Reagan tax cuts does not support this claim, but it is still made by some economists and this gives those who are serving wealthy interests or who want to force government to shrink by starving it of revenue the cover they need for their arguments …

We do need more humility about what we do and do not know, more willingness to change our minds when the evidence disagrees with our favorite theoretical model, and the willingness to acknowledge disagreement within the profession. But most of all we need to take a strong stand against those inside and outside the profession who misuse economic theory and empirical results for political and ideological purposes.

Mark Thoma


People who have their heads fuddled with nonsense

28 June, 2016 at 09:33 | Posted in Economics | 3 Comments

The Conservative belief that there is some law of nature which prevents men from being employed, that it is “rash” to employ men, and that it is financially ‘sound’ to maintain a tenth of the population in idleness for an indefinite period, is crazily improbable – the sort of thing which no man could believe who had not had his head fuddled with nonsense for years and years … Our main task, therefore, will be to confirm the reader’s instinct that what seems sensible is sensible, and what seems nonsense is nonsense. We shall try to show him that the conclusion, that if new forms of employment are offered more men will be employed, is as obvious as it sounds and contains no hidden snags; that to set unemployed men to work on useful tasks does what it appears to do, namely, increases the national wealth; and that the notion, that we shall, for intricate reasons, ruin ourselves financially if we use this means to increase our well-being, is what it looks like – a bogy.

John Maynard Keynes (1929)

Brexit shows the need for a reformed economics

27 June, 2016 at 23:33 | Posted in Economics, Politics & Society | 2 Comments

Brexit is about much more than frustration about the E.U. and immigration. It is about a shortage of decent and secure jobs; an impossibly precarious labour market; inexplicable inequalities in incomes and wealth; closed access to affordable education, and a terrible deficiency of affordable housing; and it is about British Chancellor of the Exchequer Osborne’s single-minded austerity economics and the rule-free and tax-free space created for big banks and corporations.

The referendum result reflects a deep-seated anger and anxiety amongst large sections of the population who are disenfranchised and feel ignored, and who can no longer bear the economic burden of living in the Thatcherite free-market wasteland (alternatively known as Cameron’s “Big Society”) that Britain has become – sadly reinforced by the New Labour governments that began with Tony Blair …

It would be a tragic mistake to read this resentment against the E.U. as only anti-migrant, racist or bigoted, because the racism and bigotry have grown in conditions of economic austerity, artificial job scarcity and crisis, rising unemployment, rising job insecurity, and exploding inequalities as social protection for workers, pensioners and families have been scaled down …

The responsibility for the economic and political mess in Britain, the E.U. and beyond weighs heavily on the shoulders of economists who insist there is no alternative to a globalized market economy (TINA!), with freedom for the rich and wealthy and unfreedom for the rest, and who out-of-hand reject serious progressive programmes to reform the system and make it more democratic and humane …

There are no easy answers – but economics urgently needs to start reforming itself, and asking the right questions.

Servaas Storm

What Brexit was all about

27 June, 2016 at 18:47 | Posted in Economics, Politics & Society | 1 Comment

Societies where we allow the inequality of incomes and wealth to increase without bounds, sooner or later implode.

In a market economy it is money that counts.

In a democracy it is your vote that counts.

If you’ve got money, you vote in.

If you haven’t got money, you vote out.

Brexit — a rejection of mainstream economics

26 June, 2016 at 13:53 | Posted in Economics, Politics & Society | 5 Comments

If, as a result of Brexit, the economy crashes, it will not vindicate the economists; it will simply illustrate once more their failure.


We, at Policy Research in Macroeconomics (PRIME) call for an urgent, independent, public inquiry into the economics profession, and its role in precipitating both the financial crisis of 2007-9, the subsequent very slow ‘recovery’; and in the British European referendum campaign …

Economists have once again proved themselves not only irrelevant, but a dangerous irrelevance.

For too long they have resisted call after call for reform. If they will not do it themselves, then it is time for others to take control. The profession should be brought to account through a public inquiry into this failure.

While it is risky to second guess public opinion, it may just be that the prospect of hardship to come might not have been very compelling for those already suffering the hardship of low wages, insecure low-skilled jobs, bad housing, high rents, an under-resourced and increasingly privatised NHS, and other forms of public sector ‘austerity’.

With this historic vote, the British people have not just rejected the EU. They have done something that should worry the British establishment, and their friends in the City of London, and internationally, far more. They have rejected economics – and in particular the dominant economic narrative …

The “experts” and the economic stories they tell, have been well and truly walloped by the result of this referendum. And rightly so, because while there is truth in the story that international co-operation and co-ordination is vital to economic activity and stability, there is no sound basis to the widely espoused economic ‘religion’ that markets – in money, trade and labour – must be unfettered, detached from democratic regulatory oversight, and must be trusted to ‘govern’ whole countries, regions and continents.

The British people have today rejected this mainstream, orthodox economics, a strain of fundamentalism that they may rightly judge has proved deleterious to their own economic interests.

Ann Pettifor

And in case you — like e.g. Simon Wren-Lewis — think this kind of critique is only coming from ‘heterodox’ economists like Ann Pettifor and yours truly — well, then maybe you should read what a former Governor of the Bank of England has to say:

Since the crisis, many have been tempted to play the game of deciding who was to blame for such a disastrous outcome … A generation of the brightest and best were lured into banking, and especially into trading, by the promise of immense financial rewards and by the intellectual challenge of the work that created such rich returns. They were badly misled. The crisis was a failure of a system and the ideas that underpinned it, not of individual policy-makers or bankers, incompetent and greedy though some of them undoubtedly were. There was a general misunderstanding of how the world economy worked …

If we don’t blame the actors, then why not the playwright? Economists have been cast by many as the villain. An abstract and increasingly mathematical discipline, economics is seen as having failed to predict the crisis. This is rather like blaming science for the occasional occurrence of a natural disaster. Yet we would blame scientists if incorrect theories made disasters more likely or created a perception that they could never occur, and one of the arguments of this book is that economics has encouraged ways of thinking that made crises more probable …

Brexit — and its disdain of the establishment — will send well-earned warnings to politicians all over the world …

Brexit has structural similarities with Trump’s rise. It is the logical outcome of the Conservative Party’s political strategy of the past twenty years. Conservatives used the European Union (EU) as a whipping boy to help smuggle in their “Thatcher – Reagan” neoliberal economic policies. The Labor Party spoke out in defense of minorities, but it did not defend the EU and nor did it adequately confront neoliberalism.

In the US, Trump is the analogue “exit” candidate. His rise is the logical outcome of thirty years, during which Republicans used dog-whistle racism and the culture war to smuggle through their neoliberal economic agenda that has wrought the destruction of shared prosperity. Democrats resisted racism and the culture war, but were complicit in the promotion of neoliberalism.

The lesson for the Clinton campaign is it must move beyond rhetoric criticizing neoliberalism and adopt serious remedies that tackle its legacy of inequality, economic insecurity and loss of hope. Neoliberalism is the ultimate cause of the establishment’s rejection. Racism, immigration and nationalism may be the match for the anti-establishment fire: wage stagnation and off-shoring of jobs are the fuel.

Thomas Palley

EU after Brexit

26 June, 2016 at 10:57 | Posted in Economics, Politics & Society | 1 Comment

There will be a lot of postmortems for the European Union (EU) after Brexit. Many will suggest that this was a victory against the neoliberal policies of the European Union …

The problem is that while it is true that the EU leaders have been part of the problem and have pursued the neoliberal policies within the framework of the union, sometimes with treaties like the Fiscal Compact, it is far from clear that Brexit and the possible demise of the union, if the fever spreads to France, Germany and other countries with their populations demanding their own referenda, will lead to the abandonment of neoliberal policies. Austerity will most likely continue …

Most of the austerity policies imposed on the peripheral countries are actually the result of the euro, and are to a great extent independent of the existence of a broader political union …

Personally, I cannot see that the disintegration of Europe would lead to a positive outcome. Sure the EU has a significant democratic deficit, and a bureaucracy that is seen as wasteful and inefficient … The same is true of American democracy.

At a minimum the European Union provided an environment in which people could move freely, in which petty nationalism gave way to acceptance of foreigners and immigrants, something particularly relevant with the refugee crisis in the neighboring region. Some may suggest that this was very little to show for. And the alternative, does it have something to show for? If the European Union really collapses, there will be very little for progressives to be happy about.

Matias Vernengo

Bad ideas never die — Greg Mankiw’s Alesina fairy tale

25 June, 2016 at 09:19 | Posted in Economics | Comments Off on Bad ideas never die — Greg Mankiw’s Alesina fairy tale

So what’s wrong with the economy? …

A 2002 study of United States fiscal policy by the economists Olivier Blanchard and Roberto Perotti found that ‘both increases in taxes and increases in government spending have a strong negative effect on private investment spending.’ They noted that this finding is ‘difficult to reconcile with Keynesian theory.’

Consistent with this, a more recent study of international data by the economists Alberto Alesina and Silvia Ardagna found that ‘fiscal stimuli based on tax cuts are more likely to increase growth than those based on spending increases.’

Greg Mankiw

From Mankiw’s perspective ‘the Alesina work suggests a still plausible hypothesis.’

Hmm …


Austerity policies not only generate substantial welfare costs due to supply-side channels, they also hurt demand — and thus worsen employment and unemployment. The notion that fiscal consolidations can be expansionary (that is, raise output and employment), in part by raising private sector confidence and investment, has been championed by, among others, Harvard economist Alberto Alesina in the academic world and by former European Central Bank President Jean-Claude Trichet in the policy arena. However, in practice, episodes of fiscal consolidation have been followed, on average, by drops rather than by expansions in output. On average, a consolidation of 1 percent of GDP increases the long-term unemployment rate by 0.6 percentage point and raises by 1.5 percent within five years the Gini measure of income inequality.

Jonathan Ostry, Prakash Loungani, and David Furceri

Why Brexit won

24 June, 2016 at 10:42 | Posted in Economics, Politics & Society | 6 Comments


The EU establishment has been held to account for the euro mess, for austerity policies that turned recession into depression, for the galloping inequality, and for the millions and millions of unemployed.

The EU austerity policies breed understandable and righteous anger — but also ugly far-right xenophobic political movements taking advantage of the frustration that austerity policies inevitably produce. Ultimately this underlines the threat that austerity policies and mass unemployment pose to society.

The neoliberal austerity policies pursued in the UK and elsewhere are deeply disturbing. When an economy is already hanging on the ropes, you can’t just cut government spending. Cutting government expenditure reduces aggregate demand. Lower aggregate demand means lower tax revenues. Lower tax revenues mean increased deficits — and calls for even more austerity. And so on, and so on.
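The feedback loop just described can be sketched numerically. In this toy illustration the multiplier and tax-rate values are my own assumptions, chosen purely for exposition: a spending cut lowers demand via the multiplier, which lowers tax revenue, which erodes most of the intended deficit reduction.

```python
def austerity_round(output, spending_cut, multiplier=1.5, tax_rate=0.4):
    """One round of cuts: return (new_output, revenue_loss)."""
    output_loss = multiplier * spending_cut   # demand falls by more than the cut
    revenue_loss = tax_rate * output_loss     # lower output means lower tax take
    return output - output_loss, revenue_loss

output = 100.0   # initial GDP (arbitrary units)
cut = 1.0        # government spending cut per round
for _ in range(3):
    output, revenue_loss = austerity_round(output, cut)
    # the deficit 'saving' is the cut minus the revenue it destroys
    print(f"output={output:.2f}, net deficit reduction={cut - revenue_loss:.2f}")
```

With these assumed parameters, each 1.0 of cuts shrinks output by 1.5 while reducing the deficit by only 0.4 — the self-defeating spiral the paragraph describes.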

Without a conscious effort to counteract the inevitable forces driving our societies towards extreme income and wealth inequality, our societies crack. It is crucial to have strong redistributive policies if we want stable economies and societies. Redistributive taxes and active fiscal policies are necessary ingredients for building a good society.

Societies where we allow the inequality of incomes and wealth to increase without bounds, sooner or later implode. The cement that keeps us together erodes and in the end we are only left with people dipped in the ice cold water of egoism and greed.

In a society with a huge shortage of homes, a precarious job market, and a marginalized and pressured working class, the EU to a large extent becomes a question of class and inequality.

In a market economy it is money that counts.

In a democracy it is your vote that counts.

If you’ve got money, you vote in. If you haven’t got money, you vote out.

Modern economics — the victory of technique over substance

23 June, 2016 at 18:18 | Posted in Economics | 1 Comment

Modern economics is sick. Economics has increasingly become an intellectual game played for its own sake and not for its practical consequences for understanding the economic world. Economists have converted the subject into a sort of social mathematics in which analytical rigour is everything and practical relevance is nothing. To pick up a copy of The American Economic Review or The Economic Journal these days is to wonder whether one has landed on a strange planet in which tedium is the deliberate objective of professional publication. Economics was once condemned as “the dismal science” but the dismal science of yesterday was a lot less dismal than the soporific scholasticism of today …

If there is such a thing as “original sin” in economic methodology, it is the worship of the idol of the mathematical rigour invented by Arrow and Debreu in 1954 and then canonized by Debreu in his Theory of Value five years later, probably the most arid and pointless book in the entire literature of economics.

The result of all this is that we now understand almost less of how actual markets work than did Adam Smith or even Léon Walras. We have forgotten that markets require market-makers, that middlemen have to hold inventories to allow markets to function, that markets need to be organized and that property rights need to be defined and enforced if markets are to get started at all. We have even forgotten that markets adjust as often in terms of quantities rather than prices, as in labour markets and customer commodity markets, as Alfred Marshall knew very well but Walras overlooked; so well have we forgotten that fact that a whole branch of economics sprang up in the 1960s and 70s to provide “microfoundations” for Keynesian macroeconomics, that is, some ad hoc explanation for the fact that a decline in aggregate demand causes unemployment at the same real wage and not falling real wages at the same level of employment …

Indeed, much of modern microeconomics might be fairly described as a kind of geography that consists entirely of images of cities but providing no maps of how to reach a city either from any other city or from the countryside.

Mark Blaug

Mark Blaug (1927-2011) did more than any other single person to establish the philosophy and methodology of economics as a respected subfield within economics. His path-breaking The Methodology of Economics (1980) is still a landmark — and the first textbook on economic methodology yours truly had to read as a student.

Mainstream — neoclassical — economics has become increasingly irrelevant to the understanding of the real world. The main reason for this irrelevance is the failure of economists to match their deductive-axiomatic methods with their subject.

It is — sad to say — a fact that within mainstream economics internal validity is everything and external validity nothing. Why anyone should be interested in those kinds of theories and models — as long as mainstream economists do not come up with any export licenses for their theories and models to the real world in which we live — is beyond comprehension. Stupid models are of little or no help in understanding the real world.

To my students

23 June, 2016 at 13:26 | Posted in Varia | Comments Off on To my students


Good Hayek vs. Bad Hayek

23 June, 2016 at 13:06 | Posted in Economics | 1 Comment

The source of confusion is that there was a Good Hayek and a Bad Hayek. The Good Hayek was a serious scholar who was particularly interested in the role of knowledge in the economy (and in the rest of society). Since knowledge—about technological possibilities, about citizens’ preferences, about the interconnections of these, about still more—is inevitably and thoroughly decentralized, the centralization of decisions is bound to generate errors and then fail to correct them. The consequences for society can be calamitous, as the history of central planning confirms. That is where markets come in. All economists know that a system of competitive markets is a remarkably efficient way to aggregate all that knowledge while preserving decentralization.

The Good Hayek also knew that unrestricted laissez-faire is unworkable. It has serious defects: successful actors reach for monopoly power, and some of them succeed in grasping it; better-informed actors can exploit the relatively ignorant, creating an inefficiency in the process; the resulting distribution of income may be grossly unequal and widely perceived as intolerably unfair; industrial market economies have been vulnerable to excessively long episodes of unemployment and underutilized capacity, not accidentally but intrinsically; environmental damage is encouraged as a way of reducing private costs—the list is long …

The Bad Hayek emerged when he aimed to convert a wider public. Then, as often happens, he tended to overreach, and to suggest more than he had legitimately argued. The Road to Serfdom was a popular success but was not a good book. Leaving aside the irrelevant extremes, or even including them, it would be perverse to read the history, as of 1944 or as of now, as suggesting that the standard regulatory interventions in the economy have any inherent tendency to snowball into “serfdom.” The correlations often run the other way.

Robert Solow

Rule of law

21 June, 2016 at 19:09 | Posted in Politics & Society | Comments Off on Rule of law


It is now almost fifteen years since Fadime Sahindal was bestially murdered by her relatives because she wanted to choose for herself how to live her life.

That kind of honour-related violence has sometimes been defended with — deeply misguided — cultural-relativist arguments, in which cultural differences are treated as, in some respect, a mitigating circumstance.

But — in Sweden, women and men have equal worth. And everyone who lives in Sweden must respect this.

Sweden should be an open country. A part of the world community.

But it should also be a country that makes clear that the achievements in terms of equality, openness and tolerance that we have fought for over centuries are not negotiable.

People who come to our country shall enjoy these rights and freedoms.

But with these rights and freedoms comes also an obligation. Everyone — without exception — must accept that in our country one law applies, equally for all.

Rule of law.

A far-reaching cultural relativism has brought with it a kind of acquired stupidity, which makes people prefer to hush up culture-related problems and pretend they do not exist rather than address them. Alternatively, they blame themselves, so as to avoid the awkward conflict with the Other.

Per Bauhn

Ayn Rand — a psychopath and perverter of American History

21 June, 2016 at 17:12 | Posted in Politics & Society | 2 Comments

Now, I don’t care to discuss the alleged complaints American Indians have against this country. I believe, with good reason, the most unsympathetic Hollywood portrayal of Indians and what they did to the white man. They had no right to a country merely because they were born here and then acted like savages. The white man did not conquer this country …

Since the Indians did not have the concept of property or property rights—they didn’t have a settled society, they had predominantly nomadic tribal “cultures”—they didn’t have rights to the land, and there was no reason for anyone to grant them rights that they had not conceived of and were not using …

What were they fighting for, in opposing the white man on this continent? For their wish to continue a primitive existence; for their “right” to keep part of the earth untouched—to keep everybody out so they could live like animals or cavemen. Any European who brought with him an element of civilization had the right to take over this continent, and it’s great that some of them did. The racist Indians today—those who condemn America—do not respect individual rights.

Ayn Rand, Address To The Graduating Class Of The United States Military Academy at West Point, 1974

It’s sickening to read this gobsmacking trash. But it’s perhaps even more sickening that people like Alan Greenspan and Paul Ryan can consider Ayn Rand an intellectual hero.

That Alan Greenspan is a bad economist we already knew. But he’s also a bad person. For what else can one think of a person who considers Ayn Rand — with the ugliest psychopathic philosophy the postwar world has produced — one of the great thinkers of the 20th century? A person who even co-edited a book with her — maintaining that unregulated capitalism is a ‘superlatively moral system.’ A person who in his memoirs tries to reduce his admiration for Rand to a youthful indiscretion — but who actually still today can’t be described as anything other than a loyal Randian disciple.

Ayn Rand and her objectivist philosophy have more disciples than Greenspan. But as Hilary Putnam rightfully noticed in The Collapse of the Fact/Value Dichotomy (Harvard University Press, 2002), it is doubtful whether it even qualifies as a real philosophy:

It cannot be the case that the only universally valid norm refers solely to discourse. It is, after all, possible for someone to recognize truth-telling as a binding norm while otherwise being guided solely by ‘enlightened egoism.’ (This is, indeed, the way of life that was recommended by the influential if amateurish philosophizer – I cannot call her a philosopher – Ayn Rand.) But such a person can violate the spirit if not the letter of the principle of communicative action at every turn. After all, communicative action is contrasted with manipulation, and as such a person can manipulate people without violating the maxims of ‘sincerity, truth-telling, and saying only what one believes to be rationally warranted.’

This blog post is in loving memory of my brother Peter ‘Uncas’ Pålsson — truly ‘a red man deep inside.’

The euro — gold cage of our time

19 June, 2016 at 09:17 | Posted in Economics | 2 Comments


The euro has taken away the possibility for national governments to manage their economies in a meaningful way — and in Greece the people have had to pay the true costs of the concomitant misguided austerity policies.


The unfolding of the Greek tragedy during the last couple of years has shown beyond any doubt that the euro is not only an economic project, but just as much a political one. What the neoliberal revolution of the 1980s and 1990s didn’t manage to accomplish, the euro shall now force on us.

But do the peoples of Europe really want to deprive themselves of economic autonomy, enforce lower wages and slash social welfare at the slightest sign of economic distress? Are increasing income inequality and a federal überstate really the stuff that our dreams are made of? I doubt it.

History ought to act as a deterrent. During the 1930s our economies didn’t come out of the depression until the folly of that time — the gold standard — was thrown into the dustbin of history. The euro will hopefully soon join it.

Mainstream economists have a tendency to get enthralled by their theories and models, and forget that behind the figures and abstractions there is a real world with real people. Real people that have to pay dearly for fundamentally flawed doctrines and recommendations.

Let’s make sure the consequences will rest on the conscience of those economists.


The confidence fairy history

18 June, 2016 at 17:01 | Posted in Economics | 1 Comment

Brad DeLong has an excellent presentation on the history of the confidence fairy up on his blog.

What I especially like about Brad’s history is that it makes crystal clear how hard it has been for mainstream economists to grasp the simple fact that no matter how much confidence you have in the policies pursued by the authorities, it cannot turn bad austerity policies into good job-creating policies. Austerity measures and an overzealous and simple-minded fixation on monetary measures and inflation are — almost without exception — not what it takes to get limping economies out of their limbo. They simply do not get us out of the ‘magneto trouble’ — and neither do budget-deficit discussions in which economists and politicians seem to think that cutting government budgets would help us out of recessions and slumps. Although the ‘credibility’ that mainstream economists talk about arguably has some impact on the economy, the confidence fairy does not create recovery or offset the negative effects of Alesina-style ‘expansionary fiscal austerity.’ In a situation where monetary policy has become more and more decrepit, the solution is not fiscal austerity, but fiscal expansion!
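The arithmetic behind self-defeating austerity is simple enough to put in a few lines. The numbers below (a multiplier of 1.5 and a tax rate of 0.4) are purely illustrative assumptions, not estimates of any actual economy:

```python
def debt_ratio_after_cut(Y=100.0, debt=90.0, cut=1.0,
                         multiplier=1.5, tax_rate=0.4):
    """One-period back-of-the-envelope effect of a spending cut.

    Output falls by multiplier * cut, tax revenue falls with output,
    so the debt falls by less than the headline cut -- and the
    debt-to-GDP ratio can end up *higher* than before.
    """
    new_Y = Y - multiplier * cut
    revenue_lost = tax_rate * multiplier * cut
    new_debt = debt - (cut - revenue_lost)
    return new_debt / new_Y

ratio_before = 90.0 / 100.0
ratio_after = debt_ratio_after_cut()
```

With these numbers a cut of 1 shaves only 0.4 off the debt while knocking 1.5 off GDP, so the debt-to-GDP ratio rises from 0.900 to roughly 0.910: austerity that improves the public finances on paper worsens the very ratio it targets.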

David Andolfatto’s DSGE flimflam

15 June, 2016 at 17:03 | Posted in Economics | 1 Comment

David Andolfatto — vice president at the Federal Reserve Bank of St. Louis — has a post up on his blog trying to defend the much criticised use of DSGE models. According to Andolfatto

‘to help organize our thinking’, it is often useful to construct mathematical representations of our theories — not as a substitute, but as a complement to the other tools in our tool kit (like basic intuition).
This is a useful exercise if for no other reason than it forces us to make our assumptions explicit, at least, for a particular thought experiment. We want to make the theory transparent (at least, for those who speak the trade language) and therefore easy to criticize.

We’ve heard this line of ‘defence’ before, and it is as unconvincing as ever. But as extra ammunition in defending DSGE for policy issues, Andolfatto refers us to an interview with Nobel laureate Tom Sargent.

So let’s see if there’s anything in that interview that would make us believe in this ‘help us organise our thinking’ fairytale. Sargent gives the following defense of ‘modern macro’ (my emphasis):

Sargent: I know that I’m the one who is supposed to be answering questions, but perhaps you can tell me what popular criticisms of modern macro you have in mind.

Rolnick: OK, here goes. Examples of such criticisms are that modern macroeconomics makes too much use of sophisticated mathematics to model people and markets; that it incorrectly relies on the assumption that asset markets are efficient in the sense that asset prices aggregate information of all individuals; that the faith in good outcomes always emerging from competitive markets is misplaced; that the assumption of “rational expectations” is wrongheaded because it attributes too much knowledge and forecasting ability to people; that the modern macro mainstay “real business cycle model” is deficient because it ignores so many frictions and imperfections and is useless as a guide to policy for dealing with financial crises; that modern macroeconomics has either assumed away or shortchanged the analysis of unemployment; that the recent financial crisis took modern macro by surprise; and that macroeconomics should be based less on formal decision theory and more on the findings of “behavioral economics.” Shouldn’t these be taken seriously?

Sargent: Sorry, Art, but aside from the foolish and intellectually lazy remark about mathematics, all of the criticisms that you have listed reflect either woeful ignorance or intentional disregard for what much of modern macroeconomics is about and what it has accomplished. That said, it is true that modern macroeconomics uses mathematics and statistics to understand behavior in situations where there is uncertainty about how the future will unfold from the past. But a rule of thumb is that the more dynamic, uncertain and ambiguous is the economic environment that you seek to model, the more you are going to have to roll up your sleeves, and learn and use some math. That’s life.

Are these the words of an ’empirical’ and ‘transparent’ macroeconomist? I’ll be dipped! To me it sounds like the same old axiomatic-deductivist mumbo-jumbo that parades as the economic science of today.

Mainstream economic theory today is in the story-telling business whereby economic theorists create make-believe analogue models of the real economic system. This modeling activity is — as both Andolfatto and Sargent give ample evidence of — considered useful and essential. Since fully-fledged experiments on a societal scale as a rule are prohibitively expensive, ethically indefensible or unmanageable, economic theorists have to substitute experimenting with something else. To understand and explain relations between different entities in the real economy the predominant strategy is to build models and make things happen in these “analogue-economy models” rather than engineering things happening in real economies.

Formalistic deductive “Glasperlenspiel” can be very impressive and seductive. But in the realm of science it ought to be considered of little or no value to simply make claims about the model and lose sight of the other part of the model-target dyad.

Mainstream economics — and especially that of the Chicago ilk — has long since given up on the real world and contents itself with proving things about thought-up worlds. Empirical evidence only plays a minor role in economic theory, where models largely function as a substitute for empirical evidence. But, hopefully humbled by the manifest failure of its theoretical pretences, the one-sided, almost religious, insistence on axiomatic-deductivist modeling as the only scientific activity worthy of pursuing in economics will give way to methodological pluralism, relevance and realism, rather than formalistic tractability.

I remember attending the first lecture in Tom Sargent’s evening macroeconomics class back when I was an undergraduate: a very smart man from whom I have learned an enormous amount, and well deserving of his Nobel Prize. But…

He said … we were going to build a rigorous, micro founded model of the demand for money: We would assume that everyone lived for two periods, worked in the first period when they were young and sold what they produced to the old, held money as they aged, and then when they were old used their money to buy the goods newly produced by the new generation of young. Tom called this “microfoundations” and thought it gave powerful insights into the demand for money that you could not get from money-in-the-utility-function models.

I thought that it was a just-so story, and that whatever insights it purchased for you were probably not things you really wanted to buy. I thought it was dangerous to presume that you understood something because you had “microfoundations” when those microfoundations were wrong. After all, Ptolemaic astronomy had microfoundations: Mercury moved more rapidly than Saturn because the Angel of Mercury beat his wings more rapidly than the Angel of Saturn and because Mercury was lighter than Saturn…

Brad DeLong
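DeLong’s description of Sargent’s two-period money model can be written down in a few lines. The following is a hypothetical sketch, with log utility and all parameter values as my own illustrative assumptions rather than anything from Sargent’s actual lecture: the young sell part of their endowment for fiat money and spend it when old, and in a stationary equilibrium the demand for money and the price level follow in closed form.

```python
import numpy as np

def olg_money_demand(y=1.0, beta=0.95, M=100.0):
    """Stationary equilibrium of a bare-bones two-period OLG model with fiat money.

    Each generation receives endowment y when young and nothing when old.
    The young sell s goods to the old for money and spend that money when old.
    With lifetime utility ln(y - s) + beta*ln(c_old) and a constant price
    level p, old-age consumption equals s (the old buy back exactly what the
    young now sell), so the young choose s to maximize ln(y - s) + beta*ln(s),
    giving s = beta*y/(1 + beta). Money-market clearing p*s = M then pins
    down the price level.
    """
    s = beta * y / (1 + beta)   # real saving = real demand for money
    p = M / s                   # price level clearing the money market
    return s, p

def best_s_numerically(y=1.0, beta=0.95):
    """Brute-force check that the closed form really maximizes utility."""
    grid = np.linspace(1e-6, y - 1e-6, 100_000)
    lifetime_utility = np.log(y - grid) + beta * np.log(grid)
    return grid[np.argmax(lifetime_utility)]
```

The point of the toy is only to make DeLong’s complaint concrete: the entire ‘microfounded’ demand for money hangs on the just-so assumption that the old can eat only by carrying fiat money across periods.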

Andolfatto — not to mention Sargent — seems to be impressed by the ‘rigour’ brought to macroeconomics by new classical DSGE models with their rational expectations, microfoundations and ‘Lucas critique’.

It is difficult to see why.

Take the rational expectations assumption, for example. Rational expectations in the mainstream economists’ world implies that relevant distributions have to be time independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality it is straining one’s beliefs to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons. In the world of the rational expectations hypothesis we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that’s also the reason why allowing for cases where agents make ‘predictable errors’ in DSGE models doesn’t take us any closer to a relevant and realist depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations than childish roulette and urn analogies.
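The ‘single realization’ point can be made concrete with a toy non-stationary process (an illustrative sketch, not a model of any actual economy): for a simple random walk, the distribution across an ensemble of parallel ‘economy-worlds’ is tightly centred on zero, yet it tells you next to nothing about where the one realization you actually inhabit has wandered.

```python
import numpy as np

rng = np.random.default_rng(42)

T, N = 1_000, 2_000   # time steps, number of parallel 'economy-worlds'

# An ensemble of N independent random walks with zero-mean steps.
steps = rng.choice([-1.0, 1.0], size=(N, T))
paths = steps.cumsum(axis=1)

# Averaging across the ensemble at the final date gives a number close to
# zero, with a tight standard error ...
ensemble_mean = paths[:, -1].mean()

# ... but a single realization typically ends up on the order of
# sqrt(T) ~ 32 away from that average: knowing the ensemble distribution
# says very little about the one path you actually live on.
typical_distance = np.abs(paths[:, -1]).mean()
```

For an ergodic, stationary process the two numbers would tell much the same story; for this non-stationary one they do not, which is exactly why an ensemble of economy-worlds is a poor stand-in for a single historical economy.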

‘Rigorous’ and ‘precise’ DSGE models cannot be considered anything other than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence has so far been presented.


No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, they do not push economic science forwards one single millimeter if they do not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not per se say anything about real world economies.

Proving things ‘rigorously’ in DSGE models is at most a starting-point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.

Mainstream broken pieces models

15 June, 2016 at 15:06 | Posted in Economics | 3 Comments

In economic modeling, people often just assume an objective function for one agent or another, throw that into a larger model, and then look only at some subset of the model’s overall implications. But that’s throwing away data …

And in doing so, it dramatically lowers the empirical bar that a model has to clear. You’re essentially tossing a ton of broken, wrong structural assumptions into a model and then calibrating (or estimating) the parameters to match a fairly small set of things, then declaring victory. But because you’ve got the structure wrong, the model will fail and fail and fail as soon as you take it out of sample, or as soon as you apply it to any data other than the few things it was calibrated to match.

Use broken pieces, and you get a broken machine …

Dani Rodrik, when he talks about these issues, says that unrealistic assumptions are only bad if they’re ‘critical’ assumptions – that is, if changing them would change the model substantially. It’s OK to have non-critical assumptions that are unrealistic, just like a car will still run fine even if the cup-holder is cracked. That sounds good. In principle I agree. But in practice, how the heck do you know in advance which assumptions are critical? You’d have to go check them all, by introducing alternatives for each and every one (actually every combination of assumptions, since model features tend to interact). No one is actually going to do that. It’s a non-starter.

The real solution, as I see it, is not to put any confidence in models with broken pieces.

Noah Smith

No indeed, there’s no reason whatsoever to trust those models and their defenders.
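Noah Smith’s ‘fail as soon as you take it out of sample’ point is easy to reproduce. In the sketch below (the data-generating process and all numbers are illustrative assumptions) a structurally wrong linear ‘model’ is calibrated to fit a nonlinear world over a small in-sample range, matches it closely there, and then breaks down badly outside it:

```python
import numpy as np

rng = np.random.default_rng(0)

def dgp(x):
    """'True' data-generating process: nonlinear in x."""
    return np.exp(0.5 * x)

# The few in-sample observations the model is calibrated to match.
x_in = np.linspace(0.0, 1.0, 20)
y_in = dgp(x_in) + rng.normal(0.0, 0.01, size=x_in.size)

# Structurally wrong model: linear. 'Calibrate' its two parameters.
slope, intercept = np.polyfit(x_in, y_in, 1)

def wrong_model(x):
    return slope * x + intercept

# In sample the broken-pieces model looks respectable ...
in_sample_err = np.abs(wrong_model(x_in) - dgp(x_in)).max()

# ... but taken out of sample it fails badly.
x_out = np.linspace(4.0, 6.0, 20)
out_sample_err = np.abs(wrong_model(x_out) - dgp(x_out)).max()
```

Declaring victory on the in-sample fit is exactly the move Smith describes: the calibration hides the broken structure until the model is asked about anything it was not tuned to match.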

Dani Rodrik’s Economics Rules describes economics as a more or less problem-free smorgasbord collection of models. Economics is portrayed as advancing through a judicious selection from a continually expanding library of models, models that are presented as ‘partial maps’ or ‘simplifications designed to show how specific mechanisms work.’

But one of the things that’s missing in Rodrik’s view of economic models is the all-important distinction between core and auxiliary assumptions. Although Rodrik repeatedly speaks of ‘unrealistic’ or ‘critical’ assumptions, he basically — as Noah Smith rightly remarks — just lumps them all together without differentiating between different types of assumptions, axioms or theorems.

Modern mainstream (neoclassical) economists ground their models on a set of core assumptions (CA) — basically describing the agents as ‘rational’ actors — and a set of auxiliary assumptions (AA). Together CA and AA make up what I will call the ur-model (M) of all mainstream neoclassical economic models. Based on these two sets of assumptions, they try to explain and predict both individual (micro) and — most importantly — social phenomena (macro).

The core assumptions typically consist of:

CA1 Completeness — rational actors are able to compare different alternatives and decide which one(s) he prefers

CA2 Transitivity — if the actor prefers A to B, and B to C, he must also prefer A to C.

CA3 Non-satiation — more is preferred to less.

CA4 Maximizing expected utility — in choice situations under risk (calculable uncertainty) the actor maximizes expected utility.

CA5 Consistent efficiency equilibria — the actions of different individuals are consistent, and the interaction between them results in an equilibrium.

When describing the actors as rational in these models, the concept of rationality used is instrumental rationality — consistently choosing the preferred alternative, the one judged to have the best consequences for the actor given his wishes/interests/goals, which are exogenously given in the model. How these preferences/wishes/interests/goals are formed is typically not considered to be within the realm of rationality, and a fortiori not constituting part of economics proper.

The picture given by this set of core assumptions (rational choice) is of a rational agent with strong cognitive capacity that knows what alternatives he is facing, evaluates them carefully, calculates the consequences and chooses the one — given his preferences — that he believes has the best consequences.

Weighing the different alternatives against each other, the actor makes a consistent optimizing (typically described as maximizing some kind of utility function) choice, and acts accordingly.
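Written out, this rational-choice core is remarkably thin. A hypothetical sketch of CA4, expected-utility maximization under risk, with the concave utility function as an illustrative assumption of my own:

```python
def utility(x: float) -> float:
    """Concave (risk-averse) utility -- an illustrative assumption."""
    return x ** 0.5

def expected_utility(lottery):
    """CA4: probability-weighted utility of a lottery of (p, outcome) pairs."""
    return sum(p * utility(x) for p, x in lottery)

def choose(lotteries):
    """Completeness (CA1) and transitivity (CA2) hold trivially here, since
    a real-valued expected utility totally orders the alternatives."""
    return max(lotteries, key=expected_utility)

safe = [(1.0, 100.0)]               # 100 for certain
risky = [(0.5, 0.0), (0.5, 200.0)]  # fair gamble with the same expected value

best = choose([safe, risky])        # a risk-averse agent picks the sure thing
```

That a handful of lines exhausts the ‘rational actor’ is precisely the sense in which the thin CA core is expected to carry so much explanatory weight in the ur-model.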

Besides the core assumptions (CA) the model also typically has a set of auxiliary assumptions (AA) spatio-temporally specifying the kind of social interaction between ‘rational actors’ that takes place in the model. These assumptions can be seen as giving answers to questions such as

AA1 who are the actors and where and when do they act

AA2 which specific goals do they have

AA3 what are their interests

AA4 what kind of expectations do they have

AA5 what are their feasible actions

AA6 what kind of agreements (contracts) can they enter into

AA7 how much and what kind of information do they possess

AA8 how do the actions of the different individuals/agents interact with each other.

So, the ur-model of all economic models basically consists of a general specification of what (axiomatically) constitutes optimizing rational agents and a more specific description of the kind of situations in which these rational actors act (making AA serve as a kind of specification/restriction of the intended domain of application for CA and its deductively derived theorems). The list of assumptions can never be complete, since there will always be unspecified background assumptions and some (often) silent omissions (like closure, transaction costs, etc., regularly based on some negligibility and applicability considerations). The hope, however, is that the ‘thin’ list of assumptions shall be sufficient to explain and predict ‘thick’ phenomena in the real, complex world.

But in Rodrik’s model depiction we are essentially given the following structure,

A1, A2, … An → Theorem,

where a set of undifferentiated assumptions is used to infer a theorem.

This is, however, too vague and imprecise to be helpful, and does not give a true picture of the usual mainstream modeling strategy, where there is a differentiation between a set of law-like hypotheses (CA) and a set of auxiliary assumptions (AA), giving the more adequate structure

CA1, CA2, … CAn & AA1, AA2, … AAn → Theorem,

or, equivalently,

CA1, CA2, … CAn → (AA1, AA2, … AAn → Theorem),

more clearly underlining the function of AA as a set of (empirical, spatio-temporal) restrictions on the applicability of the deduced theorems.

This underlines the fact that specification of AA restricts the range of applicability of the deduced theorem. In the extreme cases we get

CA1, CA2, … CAn → Theorem,

where the deduced theorems are analytical entities with universal and totally unrestricted applicability, or

AA1, AA2, … AAn → Theorem,

where the deduced theorem is transformed into an untestable tautological thought-experiment without any empirical commitment whatsoever beyond telling a coherent fictitious as-if story.

Not clearly differentiating between CA and AA means that Rodrik cannot make this all-important interpretative distinction. It also opens the door to unwarrantedly ‘saving’ or ‘immunizing’ models from almost any kind of critique, by simply equivocating between interpreting them as empirically empty, purely deductive-axiomatic analytical systems and as models with explicit empirical aspirations. Flexibility is usually deemed a good thing, but in this methodological context it is more a source of trouble than a sign of real strength. Models that are compatible with everything, or that come with unspecified domains of application, are worthless from a scientific point of view.

Mainstream macro models are nothing but broken pieces models — and as Noah Smith puts it — there is no reason for us ‘to put any confidence in models with broken pieces.’

Economics — spending time doing silly things

14 June, 2016 at 19:26 | Posted in Economics | 3 Comments

Mainstream macro theory has not really helped out the finance industry, the Fed, or coffee house discussions. The reason … is basically that DSGE models don’t work …

But [if] DSGE models really don’t work, why do so many macroeconomists spend so much time on them? …

What if it’s signaling? …

That suspicion was probably planted in 2005 … by a Japanese economist I knew … He gave me his advice on how to have an econ career: “First, do some hard math thing, like functional analysis. Then everyone will know you’re smart, and you can do easy stuff” …

I then watched a number of my grad school classmates go into macroeconomics. Their job market papers all were mainly theory papers, though – in keeping with typical macro practice – they had an empirical section that was usually closely related to the theory. The models all struck me as hopelessly unrealistic and silly, of course, and in private my classmates – the ones I talked to – agreed that this was the case, and said lots of mean things about DSGE modeling in general, basically saying “This is the game we have to play.” …

If macroeconomics research is a coordination game, and if the prevailing research paradigm is not really better than alternatives, then you probably want macroeconomists who are willing to “play the game”, as it were. So DSGE might be an expensive way of proving that you’re willing to spend a lot of time and effort doing silly stuff that the profession tells you to do.

Noah Smith

NAIRU — a long-run equilibrium slippery eel

14 June, 2016 at 17:09 | Posted in Economics | 1 Comment

The concept of equilibrium, of course, is an indispensable tool of analysis … But to use the equilibrium concept one has to keep it in place, and its place is strictly in the preliminary stages of an analytical argument, not in the framing of hypotheses to be tested against the facts, for we know perfectly well that we shall not find facts in a state of equilibrium. Yet many writers seem to conceive the long-period as a date somewhere in the future that we shall get to some day …

Long-run equilibrium is a slippery eel. Marshall evidently intended to mean by the long period a horizon which is always at a certain distance in the future, and this is a useful metaphor; but he slips into discussing a position of equilibrium which is shifted by the very process of approaching it and he got himself into a thorough tangle by drawing three-dimensional positions on a plane diagram.

No one would deny that to speak of a tendency towards equilibrium that itself shifts the position towards which it is tending is a contradiction in terms. And yet it still persists. It is for this reason that we must attribute its survival to some kind of psychological appeal that transcends reason.


The existence of long-run equilibrium is a very handy modeling assumption to use. But that does not make it easily applicable to real-world economies. Why? Because it is basically a timeless concept utterly incompatible with real historical events. In the real world it is the second law of thermodynamics and historical — not logical — time that rules.

Is long-run equilibrium really a good guide to macroeconomic policy? Friedman’s NAIRUvian long run and the more strictly classical natural rate, based on rational expectations, are certainly beguiling. But are they relevant? Information may be asymmetric. Competition may be monopolistic. Nonlinearities and even chaos are possible. Equilibria may be multiple or continuous. In such cases, the long-run equilibrium may be undetermined or incalculable or beyond achievement. To put it another way, the future may be inherently unpredictable. Here, the political scientists with their concept of “rational ignorance” may have something to teach economists.

James K. Galbraith

Paul Krugman — mistaking the map for the territory

14 June, 2016 at 09:56 | Posted in Economics | 1 Comment

Paul Krugman has — together with Robin Wells — written an economics textbook that is used all over the world. Like all the rest of the mainstream economics textbooks, it stresses from the first pages the importance of supplying the student with a systematic way of thinking through economic problems with the help of simple models.

Modeling is all about simplification …

A model is a simplified representation of reality that is used to better understand real-life situations …

The importance of models is that they allow economists to focus on the effects of only one change at a time …

For many purposes, the most effective form of economic modeling is the construction of ‘thought experiments’: simplified, hypothetical versions of real-life situations …

And these kinds of rather vacuous ‘simplicity’ and ‘understanding’ statements get repeated — almost ad nauseam — throughout the book.

For someone genuinely interested in economic methodology and science theory it is definitely difficult to swallow Krugman’s methodological stance, and especially his non-problematized acceptance of the need for simple models.

To Krugman modeling is a logical way to analytically isolate different variables/causes/mechanisms operating in an economic system. Simplifying a complex world makes it possible for him to ‘tell a story’ about the economy.

Is not the use of abstractions a legitimate tool of economics? No doubt — it is only that all abstractions are not equally correct. An abstraction consists of isolating a part of reality, not in making it disappear.

Emile Durkheim

What is missing in Krugman’s model picture is an explanation of how and in what way his simplifications increase our understanding — and of what. Whether a model is good or bad is mostly not a question of simplicity, but of whether the assumptions on which it builds are valid and sound, or just something we choose in order to make the model (mathematically) tractable.

Assumptions may make the model rigorous and consistent from a logical point of view, but that is of little avail if the consistency is bought at the price of not giving a truthful representation of the real economic system.

The model may be not only simple but oversimplified, making it quite useless for explanations and predictions.

The theories economists typically put forth about how the whole economy works are too simplistic.

George Akerlof & Robert Shiller

Throughout his discussion of models, Krugman assumes that they ‘allow economists to focus on the effects of only one change at a time.’ This assumption is of paramount importance and really ought to be argued for much more thoroughly — on both epistemological and ontological grounds — if it is to be used at all.

Limiting model assumptions in economic science always have to be closely examined. If the mechanisms or causes that we isolate and handle in our models are to tell us anything about our ‘target systems,’ we have to be able to show that they are stable, in the sense that they do not change when we ‘export’ them from the model to the real world. Otherwise they hold only under ceteris paribus conditions, and are a fortiori of merely limited value to our understanding, explanation or prediction of real economic systems.

The rather one-sided emphasis on usefulness and its concomitant instrumentalist justification cannot hide the fact that neither Krugman nor the legions of other mainstream economics textbook authors give supportive evidence for considering it fruitful to believe in the possibility of analyzing complex and interrelated economic systems ‘one part at a time.’ For although this atomistic hypothesis may have been useful in the natural sciences, it usually breaks down completely when applied to the social sciences. Dubious simplifying approximations do not take us one single iota closer to understanding or explaining open social and economic systems.

The kind of relations that Krugman and other mainstream economists establish with their ‘thought experimental’ modeling strategy are only relations about entities in models that presuppose causal mechanisms being atomistic and additive. When causal mechanisms operate in real-world social target systems they only do it in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do it (as a rule) only because we engineered them for that purpose. Outside man-made ‘nomological machines’ they are rare, or even non-existent. Unfortunately that also makes most of the mainstream modeling achievements rather useless.
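The difference between additive and interacting causal mechanisms, on which the ‘one part at a time’ strategy stands or falls, fits in three lines (a purely illustrative sketch): when mechanisms interact, the effect of a cause ‘isolated’ in one context does not export to another.

```python
# Two 'mechanisms' that interact multiplicatively rather than add up.
def outcome(x1: float, x2: float) -> float:
    return x1 * x2

# The same intervention on x1 (raise it from 1 to 2), 'isolated' while the
# rest of the system is held at two different settings:
effect_when_x2_low = outcome(2.0, 1.0) - outcome(1.0, 1.0)
effect_when_x2_high = outcome(2.0, 5.0) - outcome(1.0, 5.0)

# The isolated 'mechanism' has no stable effect of its own: what you measure
# in one context does not carry over to another, unlike in an additive
# system, where the effect of x1 would be the same at every setting of x2.
```

In an additive system (outcome = x1 + x2) the two effects would coincide, and the one-change-at-a-time strategy would be harmless; the whole question is which of the two cases real economies resemble.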

All empirical sciences use simplifying or ‘unrealistic’ assumptions in their modeling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

Theories are difficult to confront directly with reality. Economists therefore build models of their theories — representations that are directly examined and manipulated in order to indirectly say something about the target systems. But models do not only face theory. They also have to look to the world. Being able to model a ‘credible world’ — Krugman’s ‘thought experiment’ — a world that somehow could be considered real or similar to the real world, is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still possibly serve our pursuit of truth. But then they cannot be unrealistic or false in just any way. The falsehood or unrealisticness has to be qualified.

Some of the standard assumptions made in mainstream economic theory – on rationality, information handling and types of uncertainty – are not possible to make more realistic by ‘de-idealization’ or ‘successive approximations’ without altering the theory and its models fundamentally. And still there is not a single mention of this limitation in Krugman’s textbook!

From a methodological perspective, Krugman’s economics textbook — like those of Mankiw et consortes — is a rather unimpressive attempt at legitimizing the use of fictitious idealizations for reasons that have more to do with model tractability than with a genuine interest in understanding and explaining features of real economies.

Krugman’s textbook, with its simplicity preaching, shows that mainstream economics has become increasingly irrelevant to the understanding of the real world. The main reason for this irrelevance is the failure of mainstream economists to match their deductive-axiomatic methods with their subject.

It is — sad to say — a fact that within mainstream economics internal validity is everything and external validity nothing. Why anyone should be interested in that kind of theory and model — as long as mainstream economists do not come up with any export licenses for their theories and models to the real world in which we live — is beyond my imagination. Sure, the simplicity that axiomatics and analytical arguments bring to economics is attractive to most economists, but simplicity obviously has its perils. Although simplicity is great when solving models, it’s quite another thing to assume that reality conforms to that tractability prerequisite.

Krugman’s and other mainstream economists’ textbooks are sad reading. Both theoretically and methodologically they are exponents of an ideology that seems to say that as long as theories and hypotheses can be transformed into simple mathematical models, everything is just fine. As yours truly has tried to argue, there is actually no reason — other than pure hope — for believing this. The lack of methodological reflection in these books not only gets things wrong but, even worse, makes economics absolutely irrelevant when it comes to explaining and understanding real economies.

The anatomy of stock market bubbles

14 June, 2016 at 09:28 | Posted in Economics | 1 Comment


NAIRU religion

13 June, 2016 at 09:45 | Posted in Economics | 1 Comment

Having concluded seven years as chief economist at the IMF, Olivier Blanchard is now considering rewriting his undergraduate macroeconomics textbook:

How should we teach macroeconomics to undergraduates after the crisis? Here are some of my conclusions …

Turning to the supply side, the contraption known as the aggregate demand–aggregate supply model should be eliminated. It is clunky and, for good reasons, undergraduates find it difficult to understand … These difficulties are avoided if one simply uses a Phillips Curve (PC) relation to characterize the supply side. Potential output, or equivalently, the natural rate of unemployment, is determined by the interaction between wage setting and price setting. Output above potential, or unemployment below the natural rate, puts upward pressure on inflation. The nature of the pressure depends on the formation of expectations, an issue central to current developments. If people expect inflation to be the same as in the recent past, pressure takes the form of an increase in the inflation rate. If people expect inflation to be roughly constant as seems to be the case today, then pressure takes the form of higher—rather than increasing—inflation. What happens to the economy, whether it returns to its historical trend, then depends on how the central bank adjusts the policy rate in response to this inflation pressure.

Hmm …

One of the main problems with NAIRU — what Blanchard is referring to as ‘the natural rate of unemployment’ — is that if it is essentially seen as a timeless long-run equilibrium attractor to which actual unemployment (allegedly) has to adjust, then if that equilibrium is itself changing — and in ways that depend on the process of getting to the equilibrium — well, then we can’t really be sure what that equilibrium will be without contextualizing unemployment in real historical time. And when we do, we will see how seriously wrong we go if we omit demand from the analysis. Demand policy has long-run effects and matters also for structural unemployment — and governments and central banks can’t just look the other way and legitimize their passivity re unemployment by referring to NAIRU.

NAIRU does not hold water simply because it does not exist, and to base economic policy — or textbook models — on such a weak theoretical and empirical construct is nothing short of writing out a prescription for self-inflicted economic havoc.

According to the [NAIRU theory], unemployment differs from its natural rate only if expected inflation differs from actual inflation. If expectations are rational, we should see as many quarters when inflation is above expected inflation as quarters when it is below expected inflation. That suggests the following test of the [NAIRU theory].

Because a decade contains 40 quarters, the probability that average expected inflation over a decade will be different from average actual inflation should be small. If the [NAIRU theory] and rational expectations are both true simultaneously, a plot of decade averages of inflation against unemployment should reveal a vertical line at the natural rate of unemployment … This prediction fails dramatically.

There is no tendency for the points to lie around a vertical line and, if anything, the long-run Phillips curve is upward sloping, and closer to being horizontal than vertical. Since it is unlikely that expectations are systematically biased over decades, I conclude that the [NAIRU theory] is false.

Defenders of the [NAIRU theory] might choose to respond to these empirical findings by arguing that the natural rate of unemployment is time varying. But I am unaware of any theory which provides us, in advance, with an explanation of how the natural rate of unemployment varies over time. In the absence of such a theory the [NAIRU theory] has no predictive content. A theory like this, which cannot be falsified by any set of observations, is closer to religion than science.

Roger Farmer

Is ‘Cauchy logic’ applicable to economics?

12 June, 2016 at 11:26 | Posted in Statistics & Econometrics | Comments Off on Is ‘Cauchy logic’ applicable to economics?

What is 0.999 …, really?

It appears to refer to a kind of sum:

0.9 + 0.09 + 0.009 + 0.0009 + …

But what does that mean? That pesky ellipsis is the real problem. There can be no controversy about what it means to add up two, or three, or a hundred numbers. But infinitely many? That’s a different story. In the real world, you can never have infinitely many heaps. What’s the numerical value of an infinite sum? It doesn’t have one — until we give it one. That was the great innovation of Augustin-Louis Cauchy, who introduced the notion of limit into calculus in the 1820s.

The British number theorist G. H. Hardy … explains it best: “It is broadly true to say that mathematicians before Cauchy asked not, ‘How shall we define 1 – 1 + 1 – 1 + …?’ but ‘What is 1 – 1 + 1 – 1 + …?'”

No matter how tight a cordon we draw around the number 1, the sum will eventually, after some finite number of steps, penetrate it, and never leave. Under those circumstances, Cauchy said, we should simply define the value of the infinite sum to be 1.
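Cauchy’s definition is easy to check numerically. Below is a minimal sketch (my own code, not from the book): the n-th partial sum equals 1 − 10^(−n), so the sum penetrates any ‘cordon’ drawn around 1 after finitely many steps and never leaves it.

```python
# Partial sums of 0.9 + 0.09 + 0.009 + ...: the n-th sum is 1 - 10**(-n),
# so the gap to 1 drops below any epsilon after finitely many steps.
def partial_sum(n):
    return sum(9 * 10 ** (-k) for k in range(1, n + 1))

for n in (1, 5, 10):
    print(n, partial_sum(n), 1 - partial_sum(n))
```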

I have no problem with solving problems in mathematics by defining them away. But how about the real world? Maybe that ought to be a question to consider even for economists all too fond of uncritically following the mathematical way when applying their models to the real world, where indeed ‘you can never have infinitely many heaps’ …

In econometrics we often run into the ‘Cauchy logic’ — the data is treated as if it were from a larger population, a ‘superpopulation’ where repeated realizations of the data are imagined. Just imagine there could be more worlds than the one we live in and the problem is fixed …

Accepting Haavelmo’s domain of probability theory and sample space of infinite populations – just as Fisher’s ‘hypothetical infinite population,’ of which the actual data are regarded as constituting a random sample, von Mises’s ‘collective’ or Gibbs’s ‘ensemble’ – also implies that judgments are made on the basis of observations that are actually never made!

Infinitely repeated trials or samplings never take place in the real world. So that cannot be a sound inductive basis for a science with aspirations of explaining real-world socio-economic processes, structures or events. It’s — just as the Cauchy mathematical logic of defining away problems — not tenable.

In economics it’s always wise to remember C. S. Peirce’s remark that universes are not as common as peanuts …

Via con me

12 June, 2016 at 10:37 | Posted in Varia | Comments Off on Via con me


Ergodicity and the wrong way to calculate expectations (wonkish)

11 June, 2016 at 15:54 | Posted in Economics | Comments Off on Ergodicity and the wrong way to calculate expectations (wonkish)

If there was one thing I believed was a reasonable implicit assumption of economics, it was determining the expectation value upon which agents base their decisions as the “ensemble mean” of a large number of draws from a distribution …

But now I’m not so sure …

Rolling a dice is a good example. The expected distribution of outcomes from rolling a single dice in a 10,000 roll sequence is the same as the expected distribution of rolling 10,000 dice once each. That process is ergodic.

But many processes are not like this. You cannot just keep playing over time and expect to converge to the mean …

You start with a $100 balance. You flip a coin. Heads means you win 50% of your current balance. Tails means you lose 40%. Then repeat.

Taking the ensemble mean entails reasoning by way of imagining a large number of coin flips at each time period and taking the mean of these fictitious flips. That means the expectation value based on the ensemble mean of the first coin toss is (0.5 x $50 + 0.5 x (−$40)) = $5, or a 5% gain. Using this reasoning, the expectation for the second sequential coin toss is (0.5 x $52.5 + 0.5 x (−$42)) = $5.25 (another 5% gain).

The ensemble expectation is that this process will generate a 5% compound growth rate over time.

But if I start this process and keep playing long enough over time, I will never converge to that 5% expectation. The process is non-ergodic …

In fact, out of the 20,000 runs in my simulation, 17,000 lost money over the 100 time periods, having a final balance less than their $100 starting balance. Even more starkly, more than half the runs had less than $1 after 100 time periods …

So if almost everybody loses from this process, how can the ensemble mean of 5% compound growth be a reasonable expectation value? It cannot. For someone who is only going to experience a single path through a non-ergodic process, basing your behaviour on an expectation using the ensemble mean probably won’t be an effective way to navigate economic variations.

Cameron Murray
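Murray’s simulation is easy to reproduce. Below is a minimal sketch (my own code, using the parameters from the post: a $100 starting balance, +50%/−40% per flip, 100 flips, 20,000 runs):

```python
import random

def play(balance=100.0, flips=100, rng=random):
    # Heads: win 50% of the current balance; tails: lose 40% of it.
    for _ in range(flips):
        balance *= 1.5 if rng.random() < 0.5 else 0.6
    return balance

rng = random.Random(42)
final = [play(rng=rng) for _ in range(20000)]
losers = sum(b < 100 for b in final) / len(final)
# Per-flip ensemble mean: 0.5*1.5 + 0.5*0.6 = 1.05 (a 5% gain),
# but per-flip time-average growth: (1.5*0.6)**0.5 ~ 0.95 (a 5% loss),
# so the typical single path shrinks even though the ensemble mean grows.
print(f"{losers:.0%} of runs ended below their starting balance")
```

With this (arbitrary) seed roughly the same picture as Murray’s emerges: the large majority of runs end below $100.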

Cameron Murray is absolutely right — and having a clear view of the issue of ergodicity is of fundamental importance in economics.

Let’s say we have a stationary process. That does not guarantee that it is also ergodic. The long-run time average of a single output function of the stationary process may not converge to the expectation of the corresponding variables — and so the long-run time average may not equal the probabilistic (expectational) average. Say we have two coins, where coin A has a probability of 1/2 of coming up heads, and coin B has a probability of 1/4 of coming up heads. We pick either of these coins with a probability of 1/2 and then toss the chosen coin over and over again. Now let H1, H2, … be one or zero as the coin comes up heads or tails. This process is obviously stationary, but the time average — [H1 + … + Hn]/n — converges to 1/2 if coin A is chosen, and to 1/4 if coin B is chosen. These two time averages each have a probability of 1/2, and so their expectational average is 1/2 x 1/2 + 1/2 x 1/4 = 3/8, which obviously is not equal to 1/2 or 1/4. The time averages depend on which coin you happen to choose, while the probabilistic (expectational) average is calculated for the whole “system” consisting of both coin A and coin B.
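The two-coin process can be simulated directly. A short sketch (my own code): each run’s time average settles at its own coin’s bias, while the average across runs approaches the expectational average of 3/8.

```python
import random

rng = random.Random(1)

def one_run(n_tosses=10000):
    # Pick coin A (p = 1/2) or coin B (p = 1/4) once, then toss it repeatedly.
    p = 0.5 if rng.random() < 0.5 else 0.25
    time_avg = sum(rng.random() < p for _ in range(n_tosses)) / n_tosses
    return p, time_avg

runs = [one_run() for _ in range(2000)]
ensemble = sum(avg for _, avg in runs) / len(runs)
print(round(ensemble, 3))  # close to 3/8 = 0.375, matching neither coin's bias
```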

Time is what prevents everything from happening at once. To simply assume that economic processes are ergodic and concentrate on ensemble averages – and a fortiori in any relevant sense timeless – is not a sensible way for dealing with the kind of genuine uncertainty that permeates open systems such as economies.

Ergodicity and the all-important difference between time averages and ensemble averages are difficult concepts that many students of economics have problems understanding. So let me try to give yet another explanation of the meaning of these concepts by means of a couple of simple examples.

Let’s say you’re offered a gamble where on a roll of a fair die you will get €10 billion if you roll a six, and pay me €1 billion if you roll any other number.

Would you accept the gamble?

If you’re an economics student you probably would, because that’s what you’re taught to be the only thing consistent with being rational. You would arrest the arrow of time by imagining six different “parallel universes” where the independent outcomes are the numbers from one to six, and then weight them using their stochastic probability distribution. Calculating the expected value of the gamble – the ensemble average – by averaging on all these weighted outcomes you would actually be a moron if you didn’t take the gamble (the expected value of the gamble being 1/6 x €10 billion + 5/6 x (−€1 billion) ≈ €0.83 billion).

If you’re not an economist you would probably trust your common sense and decline the offer, knowing that a large risk of bankrupting one’s economy is not a very rosy perspective for the future. Since you can’t really arrest or reverse the arrow of time, you know that once you have lost the €1 billion, it’s all over. The large likelihood that you go bust weighs more heavily than the 17% chance of becoming enormously rich. By computing the time average – imagining one real universe where the six different but dependent outcomes occur consecutively – we would soon be aware of our assets disappearing, and a fortiori that it would be irrational to accept the gamble.
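A rough way to see the time perspective in code (my own sketch; the fixed one-unit stake per roll, the number of rolls, and the stop-at-bankruptcy rule are all assumptions of mine, not from the post):

```python
import random

def repeated_gamble(wealth=1, rolls=20, rng=random):
    # Wealth in units of EUR 1 billion: a six pays +10 units, any other roll costs 1.
    for _ in range(rolls):
        if wealth < 1:
            return 0  # bust: once the billion is gone, it's all over
        wealth += 10 if rng.randrange(6) == 5 else -1
    return wealth

rng = random.Random(7)
busts = sum(repeated_gamble(rng=rng) == 0 for _ in range(10000)) / 10000
print(f"{busts:.0%} of players go bust despite the positive expected value")
```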

[From a mathematical point of view you can (somewhat non-rigorously) describe the difference between ensemble averages and time averages as a difference between arithmetic averages and geometric averages. Tossing a fair coin and gaining 20% on the stake (S) if winning (heads) and having to pay 20% on the stake (S) if losing (tails), the arithmetic average of the return on the stake, assuming the outcomes of the coin toss to be independent, would be [(0.5 x 1.2S + 0.5 x 0.8S) − S]/S = 0%. If the two outcomes of the toss are considered not to be independent, the relevant time average would be a geometric average return of √(1.2 x 0.8) − 1 ≈ −2%.]
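In code, with the stake normalized to S = 1 (a trivial sketch of my own):

```python
import math

S = 1.0
up, down = 1.2, 0.8

# Ensemble (arithmetic) average return: the two outcomes treated as parallel.
arithmetic = (0.5 * up * S + 0.5 * down * S - S) / S   # 0%

# Time (geometric) average return: the two outcomes happening in sequence.
geometric = math.sqrt(up * down) - 1                   # about -2%
```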

Why is the difference between ensemble and time averages of such importance in economics? Well, basically, because when assuming the processes to be ergodic, ensemble and time averages are identical.

Assume we have a market with an asset priced at €100. Then imagine the price first goes up by 50% and later falls by 50%. The ensemble average for this asset would be €100 – because here we envision two parallel universes (markets), in one of which the asset price falls by 50% to €50, while in the other it rises by 50% to €150, giving an average of €100 ((150 + 50)/2). The time average for this asset would be €75 – because here we envision one universe (market) where the asset price first rises by 50% to €150 and then falls by 50% to €75 (0.5 x 150).
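The same arithmetic, spelled out (my own trivial sketch):

```python
price = 100.0

# Ensemble average: two parallel markets, one up 50% and one down 50%.
ensemble = (price * 1.5 + price * 0.5) / 2   # EUR 100

# Time average: one market, up 50% and then down 50%.
time_path = price * 1.5 * 0.5                # EUR 75
```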

From the ensemble perspective nothing really, on average, happens. From the time perspective lots of things really, on average, happen. Assuming ergodicity there would have been no difference at all.

The difference between ensemble and time averages also highlights — as Murray’s post shows — the problems concerning the neoclassical theory of expected utility (something I have touched upon e. g. in Why expected utility theory is wrong).

When applied to the neoclassical theory of expected utility, one thinks in terms of “parallel universes” and asks what the expected return of an investment is, calculated as an average over those “parallel universes.” In our coin-tossing example, it is as if one supposes that various “I”s are tossing a coin and that the losses of many of them will be offset by the huge profits one of these “I”s makes. But this ensemble average does not work for an individual, for whom a time average better reflects the experience made in the “non-parallel universe” in which we live.

Time averages give a more realistic answer: one thinks in terms of the only universe we actually live in and asks what the expected return of an investment is, calculated as an average over time.

Since we cannot go back in time – entropy and the arrow of time make this impossible – and the bankruptcy option is always at hand (extreme events and “black swans” are always possible) we have nothing to gain from thinking in terms of ensembles.

Actual events follow a fixed pattern of time, where events are often linked in a multiplicative process (as e. g. investment returns with “compound interest”) which is basically non-ergodic.

Instead of arbitrarily assuming that people have a certain type of utility function – as in the neoclassical theory – time-average considerations show that we can obtain a less arbitrary and more accurate picture of real people’s decisions and actions by basically assuming that time is irreversible. When your assets are gone, they are gone. The fact that in a parallel universe they could conceivably have been replenished is of little comfort to those who live in the one and only possible world that we call the real world.

Our coin toss example can be applied to more traditional economic issues. If we think of an investor, we can basically describe his situation in terms of our coin toss. What fraction of his assets should an investor – who is about to make a large number of repeated investments – bet on his feeling that he can better evaluate an investment (p = 0.6) than the market (p = 0.5)? The greater the fraction, the greater the leverage. But also – the greater the risk. Letting p be the probability that his investment valuation is correct and (1 – p) the probability that the market’s valuation is correct, he optimizes the rate of growth on his investments by investing a fraction of his assets equal to the difference between the probabilities that he will “win” or “lose”. This means that at each investment opportunity (according to the so-called Kelly criterion) he is to invest the fraction 0.6 – (1 – 0.6), i.e. about 20% of his assets (and the optimal average growth rate of investment can be shown to be about 2% (0.6 log(1.2) + 0.4 log(0.8))).
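The Kelly arithmetic in the paragraph above can be checked in a couple of lines (my own sketch, using natural logarithms):

```python
import math

p = 0.6                 # probability the investor's valuation is correct
f = p - (1 - p)         # Kelly fraction: bet 20% of assets each time

# Expected log-growth per investment when betting the Kelly fraction:
g = p * math.log(1 + f) + (1 - p) * math.log(1 - f)   # about 0.02, i.e. ~2%
```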

Time average considerations show that because we cannot go back in time, we should not take excessive risks. High leverage increases the risk of bankruptcy. This should also be a warning for the financial world, where the constant quest for greater and greater leverage – and risks – creates extensive and recurrent systemic crises. A more appropriate level of risk-taking is a necessary ingredient in a policy to come to curb excessive risk taking.

Keynes in Finland

11 June, 2016 at 12:10 | Posted in Economics | Comments Off on Keynes in Finland


Visiting one of Helsinki’s many nice cafés and restaurants the other day, I read the following inscription on a mirror and thought Keynes must have been here …



Anyhow — slides from yours truly’s keynote presentation at the Kalevi Sorsa Foundation celebration of the 80th anniversary of Keynes’ General Theory are available here.

The Spirit of Tallis

11 June, 2016 at 09:54 | Posted in Economics | Comments Off on The Spirit of Tallis


Heavenly beautiful!

Flimflam Chicago economics

9 June, 2016 at 09:21 | Posted in Economics | Comments Off on Flimflam Chicago economics

The people inside the model have much more knowledge about the system they are operating in than is available to the economist or econometrician who is using the model to try to understand their behavior. In particular, an econometrician faces the problem of estimating probability distributions and laws of motion that the agents in the model are assumed to know. Further the formal estimation and inference procedures of rational expectations econometricians assumes that the agents in the model already know many of the objects the econometrician is estimating.

Thomas Sargent

Making utterly ridiculous and restrictive model assumptions about rationality, information, and cognition makes it possible for Chicago economists to describe the world as an instantiation of a FORTRAN program.

Yes, indeed. And at asylums there are people who think they are Napoleon and that the moon is made of green cheese …

