Economics textbooks — how to get away with scientific fraud

29 November, 2013 at 14:19 | Posted in Economics | 10 Comments


As is well known, Keynes used to criticize more traditional economics for committing the fallacy of composition, which basically consists of the false belief that the whole is nothing but the sum of its parts. Keynes argued that in society and in the economy this is not the case, and that a fortiori an adequate analysis of society and economy cannot proceed by simply adding up the acts and decisions of individuals. The whole is more than the sum of its parts.

This fact shows up as soon as orthodox – neoclassical – economics tries to argue for the existence of The Law of Demand – when the price of a commodity falls, the demand for it will increase – on the aggregate level. Although The Law may be established for single individuals, it turned out – in the Sonnenschein-Mantel-Debreu theorem, firmly established by 1976 – that it is not possible to extend The Law of Demand to the market level unless one makes ridiculously unrealistic assumptions, such as all individuals having identical homothetic preferences.

This could only be conceivable if there was in essence only one actor – the (in)famous representative actor. So, yes, it was possible to generalize The Law of Demand – as long as we assumed that on the aggregate level there was only one commodity and one actor. What generalization! Does this sound reasonable? Of course not. This is pure nonsense!
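To see how knife-edged those assumptions are, consider a minimal numerical sketch (my own toy illustration, not taken from the SMD papers): with identical homothetic (here Cobb-Douglas) preferences, aggregate demand depends only on total income, so a representative agent exists; let preferences differ, and the same total income distributed differently yields different aggregate demand:

```python
# Two Cobb-Douglas consumers: consumer i spends a share a_i of income m_i on
# good x, so individual demand is x_i(p, m_i) = a_i * m_i / p -- always
# downward-sloping in p. The question is what survives aggregation.

def aggregate_demand(p, incomes, shares):
    """Total demand for good x at price p."""
    return sum(a * m / p for a, m in zip(shares, incomes))

p = 1.0

# Identical preferences (a = 0.5 for both): only TOTAL income matters,
# so a representative consumer exists.
print(aggregate_demand(p, incomes=[40, 60], shares=[0.5, 0.5]))  # 50.0
print(aggregate_demand(p, incomes=[10, 90], shares=[0.5, 0.5]))  # 50.0

# Heterogeneous preferences (a = 0.2 vs a = 0.8): the same total income,
# distributed differently, gives different aggregate demand -- the
# representative-agent construction breaks down.
print(aggregate_demand(p, incomes=[40, 60], shares=[0.2, 0.8]))  # 56.0
print(aggregate_demand(p, incomes=[10, 90], shares=[0.2, 0.8]))  # 74.0
```

And since price changes in general equilibrium redistribute income across agents, aggregate demand can misbehave even though every individual demand curve slopes downward.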

How has neoclassical economics reacted to this devastating finding? Basically by looking the other way, ignoring it, and hoping that no one notices that the emperor is naked.

Having gone through a handful of the most frequently used undergraduate economics textbooks, I can only conclude that the models presented in these modern neoclassical textbooks try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent.

That is, with something that has absolutely nothing to do with reality. And – worse still – with something that is not even amenable to the kind of general equilibrium analysis it is supposed to give a foundation for, since Hugo Sonnenschein (1972), Rolf Mantel (1976) and Gerard Debreu (1974) unequivocally showed that there are no assumptions on individuals that would guarantee either stability or uniqueness of the equilibrium solution.

So what modern economics textbooks present to students are models built on the assumption that an entire economy can be modeled as a representative actor, and that this is a valid procedure. But it isn't – as the Sonnenschein-Mantel-Debreu theorem has irrevocably shown.

Of course one could say that it is too difficult to show at the undergraduate level why the procedure is valid, and that the proof should be deferred to master's and doctoral courses. One could justifiably reason that way – if what you teach your students were true, if The Law of Demand were generalizable to the market level and the representative actor were a valid modeling abstraction. But in this case it is demonstrably known to be false, and therefore this is nothing but a case of scandalous intellectual dishonesty. It's like telling your students that 2 + 2 = 5 and hoping that they will never run into Peano's axioms of arithmetic.

Or — just to take another example — let’s see how the important macroeconomic question of wage rigidity is treated.

Among the handful of really good intermediate – neoclassical – macroeconomics textbooks, Chad Jones's Macroeconomics (2nd ed, W W Norton, 2011) stands out as perhaps one of the better alternatives. Unfortunately it also contains some utter nonsense!

In chapter 7 – on “The Labor Market, Wages, and Unemployment” – Jones writes (p. 179):

The point of this experiment is to show that wage rigidities can lead to large movements in employment. Indeed, they are the reason John Maynard Keynes gave, in The General Theory of Employment, Interest, and Money (1936), for the high unemployment of the Great Depression.

But this is pure nonsense. For although Keynes in General Theory devoted substantial attention to the subject of wage rigidities, he certainly did not hold the view that wage rigidities were “the reason … for the high unemployment of the Great Depression.”

Since unions/workers, contrary to classical assumptions, make wage bargains in nominal terms, they will – according to Keynes – accept lower real wages caused by higher prices, but resist lower real wages caused by lower nominal wages. Keynes, however, held it incorrect to attribute "cyclical" unemployment to this asymmetric behaviour. During the depression money wages fell significantly and – as Keynes noted – unemployment still grew. Thus, even when nominal wages are lowered, they do not generally lower unemployment.

In any specific labour market, lower wages could, of course, raise the demand for labour. But a general reduction in money wages would leave real wages more or less unchanged. The reasoning of the classical economists was, according to Keynes, a flagrant example of the "fallacy of composition." By assuming that since unions/workers in a specific labour market could negotiate real wage reductions by lowering nominal wages, unions/workers in general could do the same, the classics confused micro with macro.
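A toy calculation makes the composition point concrete (the numbers and the one-for-one pass-through of wage costs to prices are my own illustrative assumptions, not Keynes's):

```python
# A minimal sketch of the fallacy-of-composition argument. All numbers and the
# one-for-one cost pass-through are illustrative assumptions, not estimates.

nominal_wage = 100.0
price_level = 1.0
cut = 0.10  # a 10% nominal wage cut

# One labour market among many: the aggregate price level is unaffected.
real_wage_single_market = nominal_wage * (1 - cut) / price_level

# All labour markets at once: wages dominate costs, so prices fall in step
# (assuming full pass-through of wage costs to prices).
real_wage_economy_wide = nominal_wage * (1 - cut) / (price_level * (1 - cut))

print(real_wage_single_market)  # 90.0  -- real wage falls
print(real_wage_economy_wide)   # 100.0 -- real wage unchanged
```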

Lowering nominal wages could not – according to Keynes – clear the labour market. Lowering wages – and possibly prices – could perhaps lower interest rates and increase investment. But to Keynes it would be much easier to achieve that effect by increasing the money supply. In any case, wage reductions were not seen by Keynes as a general substitute for an expansionary monetary or fiscal policy.

Even if lowering wages may have some potentially positive impacts, there are also negative impacts that weigh more heavily – deteriorating management-union relations, expectations of continued wage cuts delaying investment, debt deflation, et cetera.

So what Keynes actually argued in the General Theory was that the classical proposition – that lowering wages would lower unemployment and ultimately take economies out of depressions – was ill-founded and basically wrong.

Where Keynes found it unproblematic to link flexible wages and prices with involuntary unemployment, modern "Keynesian" macroeconomists have turned his theory into different kinds of fix-price models. But to Keynes, flexible wages would only make things worse, by leading to erratic price fluctuations. The basic explanation for unemployment is insufficient aggregate demand, and that is mostly determined outside the labor market.

So — for almost forty years neoclassical economics has lived with a theorem that shows the impossibility of extending the microanalysis of consumer behaviour to the macro level (except under patently and admittedly unrealistic and absurd assumptions). Still, after all these years, neoclassical economists pretend in their textbooks that this theorem does not exist. Most textbooks don't even mention the existence of the Sonnenschein-Mantel-Debreu theorem. And when it comes to Keynes and wage rigidities, Jones's macroeconomics textbook is not the only one containing the kind of utter nonsense we've mentioned. But here the solution to the problem is easier. Keynes's books are still in print. Read them.

The real scientific challenge — one that also has to be reflected in textbooks — is to accept uncertainty and still try to explain why economic transactions take place, instead of simply conjuring the problem away by assuming rational expectations, representative actors and universal market clearing, and by treating uncertainty as if it were reducible to stochastic risk. That is scientific fraud. And it has been going on for too long now.


Monastery of La Rabida

28 November, 2013 at 19:18 | Posted in Varia | Comments Off on Monastery of La Rabida

 

Wren-Lewis and the chicken defence of rational expectations

28 November, 2013 at 17:11 | Posted in Economics | 4 Comments

Commenting on the critique of his earlier attempt at defending rational expectations, Simon Wren-Lewis has come up with a new line of defense — chickens:

The chicken that is fed by the farmer each morning may well have a theory that it will always be fed each morning – it becomes a ‘law’. And it works every day, until the day the chicken is instead slaughtered …

Now you might say that no chicken is an economist, but suppose that chickens were as intelligent as the farmer who keeps them, so they could be an economist … So if (the)  chicken had been an economist, they would not simply have observed that every morning the farmer brought them food, and therefore concluded that this must happen forever. Instead they would have asked a crucial additional question: why is the farmer doing this? … And of course trying to answer that question might have led them to the unfortunate truth …

You can see why the habit of introspection would make economists predisposed to assume rationality generally, and rational expectations in particular … It only works to use your own thought processes as a guide to how people in general might behave, if you think other people are essentially like yourself. So if your own thoughts lead you to postulate some theory about how the economy behaves, then others similar to yourself might be able to do something like the same thing …

Economists may also be fooled into thinking their introspection is representative, because they are surrounded by other economists. So this conjecture about introspection does little to show that assuming agents have rational expectations is right (or wrong), but it may be one reason why most economists find the concept of rational expectations so attractive.

This is actually the second example yours truly has come across this month of a mainstream economist using story-telling to try to defend rational expectations.

Earlier this month Mark Thoma — in an article in The Fiscal Times — argued like this (emphasis added):

The rationality assumption is reasonable in some cases. For example, even young children have rational expectations in the sense that economists use the term. Think, for example, of a game where a parent is tickling a child and following a fixed rule. Tickle the armpit, tickle the knee, tickle the armpit, tickle the knee, and so on in a repeating pattern.

If the child is following the simplest type of adaptive expectations in trying to cover up and avoid being tickled, i.e. expect whatever happened last period, he or she will always be one step behind and will never block a tickle. But if the child understands the rule the parent is following and also fully understands the nature of the game, it is easy to rationally anticipate where the next tickle attempt will be and take evasive action.
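Thoma's game is simple enough to put in code. A minimal sketch (my own illustration, not from his article) of the two forecasting rules:

```python
# The parent alternates targets: armpit, knee, armpit, knee, ...
# The adaptive child expects whatever happened last period; the child who
# understands the rule anticipates the next move.

targets = ["armpit", "knee"] * 5   # the parent's fixed repeating rule

adaptive_blocks = rational_blocks = 0
last = None
for t, target in enumerate(targets):
    adaptive_guess = last                       # adaptive: expect last outcome
    rational_guess = ["armpit", "knee"][t % 2]  # rule-aware: know the pattern
    adaptive_blocks += adaptive_guess == target
    rational_blocks += rational_guess == target
    last = target

print(adaptive_blocks)  # 0  -- always one step behind
print(rational_blocks)  # 10 -- blocks every tickle
```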

Although there is some healthy skepticism about rational expectations in both Wren-Lewis's and Thoma's storytelling, I still think that their picture of the extent to which the assumption of rational expectations is useful and valid is inadequate and unwarranted.

When John Muth first developed the concept of rational expectations — in an Econometrica article in 1961 — he framed rational expectations in terms of probability distributions:

Expectations of firms (or, more generally, the subjective probability distribution of outcomes) tend to be distributed, for the same information set, about the prediction of the theory (or the “objective” probability distributions of outcomes).

To Muth the hypothesis of rational expectations was useful because it was general and applicable to all sorts of situations, irrespective of the concrete and contingent circumstances at hand. The concept was later picked up by New Classical Macroeconomics, where it soon became the dominant model-assumption and has continued to be a standard assumption made in many neoclassical (macro)economic models – most notably in the fields of (real) business cycles and finance (being a cornerstone of the “efficient market hypothesis”).

The rational expectations hypothesis basically says that people on the average hold expectations that will be fulfilled — which of course makes the economist’s analysis enormously simple, since it means that the model used by the economist is the same as the one people use to make decisions and forecasts of the future.
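In model terms, the hypothesis says that agents' subjective forecasts coincide with the model's own conditional expectation, so that forecast errors are pure noise. A minimal sketch (my own illustration, using an arbitrary AR(1) process) of what "fulfilled on average" means:

```python
# Agents forecast next period's x with the true model x' = mu + rho*(x - mu) + e.
# Under rational expectations their forecast error is just the shock e:
# zero on average and unpredictable from anything known at forecast time.

import random

random.seed(42)
mu, rho = 2.0, 0.8
x = mu
errors = []
for _ in range(100_000):
    forecast = mu + rho * (x - mu)   # the model-consistent expectation
    x = forecast + random.gauss(0, 1)
    errors.append(x - forecast)      # realized error = the shock

print(sum(errors) / len(errors))     # ~0.0: expectations fulfilled on average
```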

The perhaps most problematic part of Wren-Lewis’s and Thoma’s argument is that they both maintain that chicken-economists and young children (emphasis added)

have rational expectations in the sense that economists use the term.

This is the heart of darkness. Of course Wren-Lewis's and Thoma's portrayal of the economists' meaning of rational expectations — which is actually far from real people's common-sense meaning — is not as innocent as it may look. Rational expectations in the neoclassical economists' world implies that the relevant distributions have to be time-independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events.

In reality, it strains belief to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. It also strains the imagination to see any similarity between these modelling assumptions and the expectations of rational chicken-economists or children playing the "tickling game." In the world of the rational expectations hypothesis we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel.
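The difference between the ensemble view and the single-realization view is not a philosophical nicety; it can be simulated. Here is a sketch (my own illustration, with made-up parameters) of a multiplicative process whose ensemble average grows steadily while almost every individual history decays:

```python
# Wealth doubles or shrinks to 40% with equal probability each period.
# Ensemble ("parallel worlds") view: E[growth factor] = 0.5*2.0 + 0.5*0.4 = 1.2,
# so mean wealth grows 20% per period. Single-realization view: the typical
# path compounds at sqrt(2.0 * 0.4) = sqrt(0.8) < 1 per period and decays.

import random

random.seed(1)

def trajectory(periods=20):
    w = 1.0
    for _ in range(periods):
        w *= 2.0 if random.random() < 0.5 else 0.4
    return w

paths = sorted(trajectory() for _ in range(100_000))
print(sum(paths) / len(paths))  # mean ~ 1.2**20 ~ 38, driven by rare lucky paths
print(paths[len(paths) // 2])   # median ~ 0.8**10 ~ 0.11 -- the typical history
```

An economy resembles the single history in the last line, not the ensemble mean in the line before it; assuming known, time-independent distributions quietly swaps the one for the other.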

So — the rational expectations modeled by economists are not at all the kind of expectations real youngsters — and chickens — have. That also means that — if we want to have anything of interest to say about real economies, financial crises and the decisions and choices real people make — we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations. And that goes for chickens and children too …

Krugman disses students that want to rethink economics

27 November, 2013 at 19:12 | Posted in Economics | 14 Comments

Paul Krugman today rides out — like his brother-in-arms Simon Wren-Lewis — to defend mainstream economics. According to Krugman, yours truly and others of that ilk are wrong to blame mainstream economics for not being relevant and for failing to foresee the crisis. To Krugman there is nothing wrong with "standard theory" and "economics textbooks." If only policy makers and economists stuck to "standard economic analysis" everything would be just fine.

I'll be dipped! If there's anything the last five years have shown us, it is that economists have gone astray with their toolkit. Krugman's "standard theory" — neoclassical economics — has contributed to causing today's economic crisis rather than to solving it.

Reading Krugman's post, I guess a lot of the young economics students in the UK and US who today are looking for alternatives to the autistic mainstream neoclassical theory are deeply disappointed. Rightly so. But — although Krugman, especially on his blog, certainly tries to present himself as a kind of radical and anti-establishment economics guy — when it really counts, he shows what he is: a die-hard teflon-coated neoclassical economist.

Perhaps this is less perplexing when one considers what Krugman said already in 1996, when he was invited to speak to the European Association for Evolutionary Political Economy. So here – right from the horse's mouth – I quote from the speech (emphasis added):

I like to think that I am more open-minded about alternative approaches to economics than most, but I am basically a maximization-and-equilibrium kind of guy. Indeed, I am quite fanatical about defending the relevance of standard economic models in many situations.

I won’t say that I am entirely happy with the state of economics. But let us be honest: I have done very well within the world of conventional economics. I have pushed the envelope, but not broken it, and have received very widespread acceptance for my ideas. What this means is that I may have more sympathy for standard economics than most of you. My criticisms are those of someone who loves the field and has seen that affection repaid. I don’t know if that makes me morally better or worse than someone who criticizes from outside, but anyway it makes me different.

To me, it seems that what we know as economics is the study of those phenomena that can be understood as emerging from the interactions among intelligent, self-interested individuals. Notice that there are really four parts to this definition. Let’s read from right to left.

1. Economics is about what individuals do: not classes, not “correlations of forces”, but individual actors. This is not to deny the relevance of higher levels of analysis, but they must be grounded in individual behavior. Methodological individualism is of the essence.

2. The individuals are self-interested. There is nothing in economics that inherently prevents us from allowing people to derive satisfaction from others’ consumption, but the predictive power of economic theory comes from the presumption that normally people care about themselves.

3. The individuals are intelligent: obvious opportunities for gain are not neglected. Hundred-dollar bills do not lie unattended in the street for very long.

4. We are concerned with the interaction of such individuals: Most interesting economic theory, from supply and demand on, is about “invisible hand” processes in which the collective outcome is not what individuals intended.

Personally, I consider myself a proud neoclassicist. By this I clearly don’t mean that I believe in perfect competition all the way. What I mean is that I prefer, when I can, to make sense of the world using models in which individuals maximize and the interaction of these individuals can be summarized by some concept of equilibrium. The reason I like that kind of model is not that I believe it to be literally true, but that I am intensely aware of the power of maximization-and-equilibrium to organize one’s thinking – and I have seen the propensity of those who try to do economics without those organizing devices to produce sheer nonsense when they imagine they are freeing themselves from some confining orthodoxy.

So now, all you young economics students who want to see a real change in economics and the way it's taught — now you know where Krugman & Co. stand. If you really want something other than the same old neoclassical catechism, if you really don't want to be force-fed neoclassical mumbo jumbo, you have to look elsewhere.

No reality, please. We’re economists!

26 November, 2013 at 18:22 | Posted in Theory of Science & Methodology | 4 Comments

Ever since the Enlightenment various economists had been seeking to mathematise the study of the economy. In this, at least prior to the early years of the twentieth century, economists keen to mathematise their discipline felt constrained in numerous ways, and not least by pressures by (non-social) natural scientists and influential peers to conform to the ‘standards’ and procedures of (non-social) natural science, and thereby abandon any idea of constructing an autonomous tradition of mathematical economics. Especially influential, in due course, was the classical reductionist programme, the idea that all mathematical disciplines should be reduced to or based on the model of physics, in particular on the strictly deterministic approach of mechanics, with its emphasis on methods of infinitesimal calculus …

However, in the early part of the twentieth century changes occurred in the interpretation of the very nature of mathematics, changes that caused the classical reductionist programme itself to fall into disarray. With the development of relativity theory and especially quantum theory, the image of nature as continuous came to be re-examined in particular, and the role of infinitesimal calculus, which had previously been regarded as having almost ubiquitous relevance within physics, came to be re-examined even within that domain.

The outcome, in effect, was a switch away from the long-standing emphasis on mathematics as an attempt to apply the physics model, and specifically the mechanics metaphor, to an emphasis on mathematics for its own sake.

Mathematics, especially through the work of David Hilbert, became increasingly viewed as a discipline properly concerned with providing a pool of frameworks for possible realities. No longer was mathematics seen as the language of (non-social) nature, abstracted from the study of the latter. Rather, it was conceived as a practice concerned with formulating systems comprising sets of axioms and their deductive consequences, with these systems in effect taking on a life of their own. The task of finding applications was henceforth regarded as being of secondary importance at best, and not of immediate concern.

This emergence of the axiomatic method removed at a stroke various hitherto insurmountable constraints facing those who would mathematise the discipline of economics. Researchers involved with mathematical projects in economics could, for the time being at least, postpone the day of interpreting their preferred axioms and assumptions. There was no longer any need to seek the blessing of mathematicians and physicists or of other economists who might insist that the relevance of metaphors and analogies be established at the outset. In particular it was no longer regarded as necessary, or even relevant, to economic model construction to consider the nature of social reality, at least for the time being. Nor, it seemed, was it possible for anyone to insist with any legitimacy that the formulations of economists conform to any specific model already found to be successful elsewhere (such as the mechanics model in physics). Indeed, the very idea of fixed metaphors or even interpretations, came to be rejected by some economic ‘modellers’ (albeit never in any really plausible manner).

The result was that in due course deductivism in economics, through morphing into mathematical deductivism on the back of developments within the discipline of mathematics, came to acquire a new lease of life, with practitioners (once more) potentially oblivious to any inconsistency between the ontological presuppositions of adopting a mathematical modelling emphasis and the nature of social reality. The consequent rise of mathematical deductivism has culminated in the situation we find today.

Tony Lawson

Is economics a science?

25 November, 2013 at 20:41 | Posted in Economics | 10 Comments

As yours truly has reported repeatedly during the last months, university students all over Europe are increasingly beginning to question whether the kind of economics they are taught — mainstream neoclassical economics — really is of any value. Some have even started to question whether economics is a science at all.


Two Nobel laureates in economics — Robert Shiller and Paul Krugman — have lately tried to respond.

This is Robert Shiller‘s answer:

I am one of the winners of this year’s Nobel Memorial Prize in Economic Sciences, which makes me acutely aware of criticism of the prize by those who claim that economics – unlike chemistry, physics, or medicine, for which Nobel Prizes are also awarded – is not a science. Are they right? …

The problem is that once we focus on economic policy, much that is not science comes into play. Politics becomes involved, and political posturing is amply rewarded by public attention. The Nobel Prize is designed to reward those who do not play tricks for attention, and who, in their sincere pursuit of the truth, might otherwise be slighted …

Critics of “economic sciences” sometimes refer to the development of a “pseudoscience” of economics, arguing that it uses the trappings of science, like dense mathematics, but only for show. For example, in his 2004 book Fooled by Randomness, Nassim Nicholas Taleb said of economic sciences: “You can disguise charlatanism under the weight of equations, and nobody can catch you since there is no such thing as a controlled experiment” …

My belief is that economics is somewhat more vulnerable than the physical sciences to models whose validity will never be clear, because the necessity for approximation is much stronger than in the physical sciences, especially given that the models describe people rather than magnetic resonances or fundamental particles. People can just change their minds and behave completely differently. They even have neuroses and identity problems, complex phenomena that the field of behavioral economics is finding relevant to understanding economic outcomes.

But all the mathematics in economics is not, as Taleb suggests, charlatanism. Economics has an important quantitative side, which cannot be escaped. The challenge has been to combine its mathematical insights with the kinds of adjustments that are needed to make its models fit the economy’s irreducibly human element.

The advance of behavioral economics is not fundamentally in conflict with mathematical economics, as some seem to think, though it may well be in conflict with some currently fashionable mathematical economic models. And, while economics presents its own methodological problems, the basic challenges facing researchers are not fundamentally different from those faced by researchers in other fields. As economics develops, it will broaden its repertory of methods and sources of evidence, the science will become stronger, and the charlatans will be exposed.

Robert Shiller

And this is what Paul Krugman says on the question:

[S]omething is deeply wrong with economics. While economists using textbook macro models got things mostly and impressively right, many famous economists refused to use those models — in fact, they made it clear in discussion that they didn’t understand points that had been worked out generations ago. Moreover, it’s hard to find any economists who changed their minds when their predictions, say of sharply higher inflation, turned out wrong.

Nor is this a new thing. My take on the history of macro is that the notion of equilibrium business cycles had, by any normal scientific standard, definitively failed by 1990 at the latest … Yet there was no backing off on this approach. On the contrary, it actually increased its hold on the profession.

So, let’s grant that economics as practiced doesn’t look like a science. But that’s not because the subject is inherently unsuited to the scientific method. Sure, it’s highly imperfect — it’s a complex area, and our understanding is in its early stages. And sure, the economy itself changes over time, so that what was true 75 years ago may not be true today …

No, the problem lies not so much in the inherent unsuitability of economics for scientific thinking as in the sociology of the economics profession — a profession that somehow, at least in macro, has ceased rewarding research that produces successful predictions and rewards research that fits preconceptions and uses hard math instead.

Paul Krugman

My own take on the issue is that economics – and especially mainstream neoclassical economics – has as a science lost immensely in status and prestige in recent years. Not least because of its manifest inability to foresee the latest financial and economic crisis – and its lack of constructive and sustainable policies for taking us out of the crisis.


We all know that many activities, relations, processes and events are uncertain and that the data do not unequivocally single out one decision as the only “rational” one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

Neoclassical economists, however, have wanted to use their hammer, and so decided to pretend that the world looks like a nail. Pretending that uncertainty can be reduced to risk, and constructing models on that assumption, has only contributed to financial crises and economic havoc.

How do we put an end to this intellectual cataclysm? How do we re-establish credence and trust in economics as a science? Five changes are absolutely decisive.

(1) Stop pretending that we have exact and rigorous answers on everything. Because we don’t. We build models and theories and tell people that we can calculate and foresee the future. But we do this based on mathematical and statistical assumptions that often have little or nothing to do with reality. By pretending that there is no really important difference between model and reality we lull people into thinking that we have things under control. We haven’t! This false feeling of security was one of the factors that contributed to the financial crisis of 2008.

(2) Stop the childish and exaggerated belief in mathematics giving answers to important economic questions. Mathematics gives exact answers to exact questions. But the relevant and interesting questions we face in the economic realm are rarely of that kind. Questions like “Is 2 + 2 = 4?” are never posed in real economies. Instead of a fundamentally misplaced reliance on abstract mathematical-deductive-axiomatic models having anything of substance to contribute to our knowledge of real economies, it would be far better if we pursued “thicker” models and relevant empirical studies and observations.

(3) Stop pretending that there are laws in economics. There are no universal laws in economics. Economies are not like planetary systems or physics labs. The most we can aspire to in real economies is establishing possible tendencies with varying degrees of generalizability.

(4) Stop treating other social sciences as poor relations. Economics has long suffered from hubris. A more broad-minded and multifarious science would enrich today’s altogether too autistic economics.

(5) Stop building models and making forecasts of the future based on totally unreal micro-founded macromodels with intertemporally optimizing robot-like representative actors equipped with rational expectations. This is pure nonsense. We have to build our models on assumptions that are not so blatantly in contradiction to reality. Assuming that people are green and come from Mars is not a good modeling strategy – not even as a "successive approximation."

“Free markets” — a poisonous term

25 November, 2013 at 18:57 | Posted in Economics | Comments Off on “Free markets” — a poisonous term

 

Swedish higher education on a downhill slide

25 November, 2013 at 10:11 | Posted in Education & School | Comments Off on Swedish higher education on a downhill slide

Does widespread laxity matter? Well, if the point of higher education is to contribute to the intellectual and professional qualification of the population and to give students a cognitive push upwards, it takes course content and workloads that, at least for the average student – ill-prepared by weak primary and secondary schools – would warrant more than 40 rather than barely 20 hours of study effort per week.

Recently an American study, "Academically Adrift", was published, showing that roughly 40 per cent of all university students are not intellectually improved by their education. One may question whether the education is meaningful for all of them. They are hardly likely to get jobs in line with what their degree certificates suggest. And the situation is hardly much better in Sweden …

Many university and college administrations seem mainly interested in throughput and in making everything look formally good in relation to UKÄ (the Swedish Higher Education Authority) and other stakeholders. And in avoiding conflicts. As a former vice-chancellor I spoke with put it: everyone was aware of weak study efforts in many subjects and departments, but nothing was done about it – too sensitive to deal with, was the motive. A laissez-faire mentality is being cultivated …

None of this means, of course, that satisfied students, resources, good pedagogy or labour-market relevance should be belittled. These are obviously important, but they are already strongly (over-)emphasized. And satisfaction cannot compensate for higher education's core task: developing intellectual capability. That requires content that is not too easily chewed, and a good deal of work. The university administrations and teachers who cannot or will not take responsibility for this should perhaps do something else instead.

More and more people are probably seeing through higher education's downhill slide. It is part of an education system in decay. A crisis of legitimacy and status for higher education waits around the corner. Hopefully there is still time to turn the development around.

Mats Alvesson

Novemberrevolutionen

24 November, 2013 at 19:51 | Posted in Economics, Politics & Society | 1 Comment

Dan Josefsson's brilliant documentary about Sweden's unknown coup d'état — carried out on 21 November 1985 by a group of conspiring neoliberal market fundamentalists, headed by Riksbank governor Bengt Dennis, finance minister Kjell-Olof Feldt and his right-hand man Erik Åsbrink — is now available to watch at SvT Öppet arkiv.


The particle physics of social science (wonkish)

24 November, 2013 at 15:34 | Posted in Statistics & Econometrics | Comments Off on The particle physics of social science (wonkish)

When you can’t see the forest because of the trees — that’s when a Factor Analysis might help you get on the right track:
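For readers who want to see the idea rather than just read about it, here is a minimal factor-analysis sketch (my own example): six observed indicators are generated from two latent factors, and the analysis recovers the two-factor structure from the observed data:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 500
latent = rng.normal(size=(n, 2))   # two unobserved common factors: the "forest"

# Each observed variable (a "tree") loads mainly on one of the two factors.
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
X = latent @ loadings.T + 0.3 * rng.normal(size=(n, 6))  # plus noise

fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
print(np.round(fa.components_, 2))  # estimated loadings recover the structure
                                    # (up to rotation and sign)
```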

Krugman on Sweden — as informative as junk mail before election day

24 November, 2013 at 11:10 | Posted in Economics, Politics & Society | 8 Comments


In a post up yesterday, Paul Krugman writes about “Mysterious Swedes” (emphasis added):

The Riksbank raised rates sharply even though inflation was below target and falling, and has only partially reversed the move even though the country is now flirting with Japanese-style deflation. Why? Because it fears a housing bubble.

This kind of fits the H.L. Mencken definition of Puritanism: "The haunting fear that someone, somewhere, may be happy." … The underlying deficiency of demand will call for pedal-to-the-metal monetary policy as a norm. But bubbles will happen — and central bankers, always looking for reasons to snatch away punch bowls, will use them as excuses to tighten.

This is, however, rather too simplistic a view of the problems facing the Swedish economy today. And just pooh-poohing deeply felt concerns and fears of a housing bubble as conspiracy theories ("excuses") is debating economic policy the way one would play tennis with the net down. Has Krugman already forgotten what happened in the US back in the 00s? I recommend consulting another Nobel laureate, Robert Shiller, who could probably freshen up his memory.

So let’s try to get the picture right.

Lars E. O. Svensson – former deputy governor of the Riksbank – has during the last year repeatedly lambasted the Swedish Riksbank for having pursued, over the last fifteen years, a policy that has increased unemployment in Sweden:

The conclusion from the analysis is thus that the actual monetary policy has led to substantially lower inflation than the target and substantially higher unemployment than a policy that would have kept the policy rate unchanged at 0.25 percent.

The Riksbank has more recently justified the tight policy by maintaining that a lower policy rate would have increased the household debt ratio (debt relative to disposable income) and would have increased any risks connected with the debt. But, as I have shown … this is not true. A lower policy rate would have led to a lower debt ratio, not a higher one. This is because a lower policy rate increases the denominator (nominal disposable income) faster than the numerator (nominal debt). Then the debt ratio falls …

In summary, the Riksbank has conducted a monetary policy that has led to far too low inflation, far too high unemployment, and a somewhat higher debt ratio compared to if the policy rate had been left at 0.25 percent from the summer of 2010 until now. This is not a good result.

By the way, the latest report The Swedish Economy by the National Institute of Economic Research includes a very interesting special study, ”The Riksbank has systematically overestimated inflation,” which may be important in this context. In an analysis of the Riksbank’s inflation forecasts, the NIER shows that Riksbank forecasts have systematically overestimated inflation. The NIER concludes that “[t]he Riksbank’s overestimation of inflation has contributed to overly tight monetary policy with higher unemployment and lower inflation than would have been the case if, on average, its inflation forecasts had been on the mark.”

Why the majority of the Executive Board has so systematically exaggerated inflation risks is a question that may be worth returning to.
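Svensson's debt-ratio point is simple arithmetic, easy to check with a back-of-the-envelope sketch (the numbers below are my own illustrative assumptions, not his estimates): if a lower policy rate raises nominal disposable income proportionally more than it raises nominal debt, the ratio falls.

```python
# Household debt ratio = nominal debt / nominal disposable income.
debt, income = 1_700, 1_000    # a debt ratio of 170% (illustrative)

# Hypothetical cumulative responses to a lower policy rate (assumptions):
debt_low = debt * 1.02         # nominal debt up 2%
income_low = income * 1.04     # nominal disposable income up 4%

print(f"baseline ratio: {debt / income:.1%}")          # 170.0%
print(f"low-rate ratio: {debt_low / income_low:.1%}")  # 166.7%
```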

The Swedish Riksbank has, according to Lars E. O. Svensson, been pursuing a policy during the last fifteen years that has in reality made inflation on average more than half a percentage point lower than the Riksbank's own target. The Phillips curve he estimates shows that, as a result of this overly "austere" inflation level, unemployment has been almost 1 percentage point higher than if the Riksbank had stuck to its stated inflation target of 2%.

What Svensson is saying, without so many words, is that the Swedish Fed for no reason at all has made people unemployed. As a consequence of a faulty monetary policy, unemployment is considerably higher than it would have been if the Swedish Fed had done its job adequately.

So far, so good — I have no problem with Svensson’s argument about the inadequacy of the Swedish inflation targeting policies.

However, what makes the picture more complicated than Krugman — and Svensson — want to admit is that we do have a housing bubble in Sweden — it's not just a figment of the imagination that the "bad guys" use to intimidate us. [That said, I of course in no way want to imply that central bank interest rate targeting (and/or accommodation) is the best way to counteract housing bubbles. Far from it.]

The increase in house loans – and house prices – in Sweden has for many years been among the steepest in the world.

Sweden's house price boom started in the mid-1990s, and looking at the development of real house prices since 1986, there are obvious reasons to be deeply worried:

Source: Statistics Sweden

The indebtedness of the Swedish household sector has also risen to alarmingly high levels, as indicated by the figure below (based on new data published earlier this year by Statistics Sweden, showing the development of household debts/disposable income 1990 – 2012):

Source: Statistics Sweden

As a result yours truly has been trying to argue with “very serious people” that it’s really high time to “take away the punch bowl.”

Sad to say, this is not the first time I have seen people like Krugman take a warped view of what is really going on in Sweden.

In a recent post on Sweden and rising inequality, Paul Krugman writes:

[Y]ou have no business talking about international income distribution if you don’t know about the invaluable World Top Incomes Database. What does this database tell us about Sweden versus America?


Hey, it looks just the same — or, actually, not.

Yes, the top one percent has risen a bit in Sweden. But how anyone could look at this and see the story as similar boggles the mind.

Although it's not that often that yours truly disagrees with Krugman on empirical matters, here I definitely think he's wrong. It is indeed possible to see the story as similar.

Look at the figure below, which shows how the mean income and wealth (expressed in 2009 prices) of the top 0.1% and the bottom 90% have changed in Sweden over the last 30 years:


Source: The World Top Incomes Database

I would say the development in Sweden is deeply problematic and going in the wrong direction. The main difference compared to the UK and the US is really that the increasing inequality in Sweden (going on continuously for 30 years now) took off from a lower starting point.

The rising inequality probably has to do with income and wealth increasingly being concentrated in the hands of a very small and privileged elite – in Sweden as well as in the UK and the US.

So, I'm sorry Paul, but Sweden is no longer the model country it once was, in the heyday of the 60s and 70s. It's no longer the country of Olof Palme. Neoliberal ideologies, economists and politicians have crushed the Swedish dream that once was. It's sad. But it's a fact.

Rethinking economics curriculum

23 November, 2013 at 10:01 | Posted in Economics | Comments Off on Rethinking economics curriculum


For economists 2008 was a nightmare. The people who teach and research the discipline mocked by Thomas Carlyle, a 19th-century polemicist, as “the dismal science”, not only failed to spot the precipice, many forecast exactly the opposite—a tranquil stability they called the “great moderation”. While the global economy is slowly healing, the subject is still in a state of flux, with students eager to learn what went wrong, but frustrated by what they are taught. Some bold new projects to retune economics aim to change this …

For a start there is fresh blood coming into the subject. The numbers taking economics A-level—over 26,000 in June 2013—are at an all-time high. Many university students, however, are disappointed by what follows. At Manchester University, a student society has been set up to challenge the current syllabus. Among the demands are fewer lectures bogged down in detailed maths, and more time discussing important historical thinkers …

A new group of teachers is now listening. Led by Wendy Carlin, an economist at University College London, they are designing a new university-level curriculum. The project, which aims to launch for the 2014-15 academic year, will change things in a number of ways …

The academic contributors to Ms Carlin’s project are spread across nine countries … No one following the course will have to buy expensive books: the materials will be distributed to university departments without charge …

Brevan Howard, a hedge fund with assets of $40 billion, has founded a new financial stability research centre at Imperial College. On November 11th it announced that Franklin Allen and Douglas Gale, professors at Wharton Business School and New York University respectively, will be co-heads …

Ms Carlin’s project has benefited from hedge fund money too, with cash coming from the Institute for New Economic Thinking (INET) a think-tank set up by George Soros, an investor, in 2009. INET now funds $4m in economics projects per year, including a new research centre at Oxford University. Keynes too was an active investor who thought the role of economics was to protect the good things in life—music, art and intellectual life. He would have thoroughly approved.

The Economist

Jesus’ Blood Never Failed Me Yet

23 November, 2013 at 09:28 | Posted in Varia | Comments Off on Jesus’ Blood Never Failed Me Yet

 

My thoughts take me home to Leningrad again

23 November, 2013 at 09:26 | Posted in Varia | Comments Off on My thoughts take me home to Leningrad again

 

Hansson De Wolfe United

23 November, 2013 at 09:02 | Posted in Varia | Comments Off on Hansson De Wolfe United

 

Old love never rusts …

Read my lips — Olof Palme

22 November, 2013 at 22:22 | Posted in Varia | Comments Off on Read my lips — Olof Palme

 

Post-Crash Economics

22 November, 2013 at 00:05 | Posted in Economics | 2 Comments

The Manchester students' proposals (Report, 25 October) are the latest in a long line of appeals by student bodies for a more pluralist and relevant curriculum, following actions by students at Harvard, Cambridge and Paris. In the early 1970s an active protest by students at the University of Sydney led to the reform of the university curriculum, to include "political economy", ie economics that takes into account power, politics and a contest of ideas. What all of these student protests share is the demand for economics that illuminates the real world, takes in multiple perspectives and engages them. Moreover, as the recent protests have shown, students recognise the value of it to them. Fortunately, even within the monoculture of the UK economics curriculum, there are institutions in which pluralist, realistic, relevant economics is taught.
Prof Victoria Chick, University College London
Prof Bruce Cronin, University of Greenwich
Prof Alan Freeman, London Metropolitan University
Dr Andrew Mearman, University of the West of England
Dr Jamie Morgan, Leeds Metropolitan University
Dr Ioana Negru, Anglia Ruskin University
Dr Wendy Olsen, University of Manchester
Dr Bruce Philp, Nottingham Trent University
Prof Molly Scott Cato, Roehampton University
Dr Pritam Singh, Oxford Brookes University

The Guardian, Thursday 21 November 2013

A free rider in Berlin

21 November, 2013 at 15:34 | Posted in Politics & Society | Comments Off on A free rider in Berlin

 

[In case your German is a little bit rusty, here’s the movie with English subtitles]

Neoclassical economics — a religious belief?

21 November, 2013 at 08:22 | Posted in Economics | Comments Off on Neoclassical economics — a religious belief?

From any rational point of view, orthodox economics is in serious trouble. Its champions not only failed to foresee the greatest crash for 80 years, but insisted such crises were a thing of the past …

Acclaimed figures in a discipline that claims to be scientific hailed a "great moderation" of market volatility in the runup to an explosion of unprecedented volatility. Others, such as the Nobel prizewinner Robert Lucas, insisted that economics had solved the "central problem of depression prevention".

Any other profession that had proved so spectacularly wrong and caused such devastation would surely be in disgrace. You might even imagine the free-market economists who dominate our universities and advise governments and banks would be rethinking their theories and considering alternatives …

Eugene Fama, architect of the “efficient markets hypothesis” underpinning financial deregulation, concedes he doesn’t know what “causes recessions” – but insists his theory has been vindicated anyway. Most mainstream economists have carried on as if nothing had happened.

Many of their students, though, have had enough. A revolt against the orthodoxy has been smouldering for years and now seems to have gone critical. Fed up with parallel universe theories that have little to say about the world they’re interested in, students at Manchester University have set up a post-crash economics society with 800 members, demanding an end to monolithic neoclassical courses and the introduction of a pluralist curriculum.

They want other schools of economic thought taught in parallel, from Keynesian to more radical theories – with a better record on predicting and connecting with the real world economy – along with green and feminist economics …

Neoclassical economics has also provided the underpinning for the diet of deregulated markets, privatisation, low taxes on the wealthy and free trade we were told for 30 years was now the only route to prosperity.

Its supporters have an “almost religious mentality”, as Ha-Joon Chang – one of the last surviving independent economists at Keynes’s Cambridge – puts it. Although claiming to favour competition, the neoclassicals won’t tolerate any themselves …

But change it must. The free-market orthodoxy of the past three decades not only helped create the crisis we’re living through, but gave credibility to policies that have led to slower growth, deeper inequality, greater insecurity and environmental degradation all over the world. Its continued dominance after the crash, like the neoliberal model it underpins, is about power not credibility. If we are to escape this crisis, both will have to go.

Seumas Milne/The Guardian

[h/t Jan Milch]

Post-Keynesian Syll vs. New Keynesian Krugman & Co.

20 November, 2013 at 20:35 | Posted in Economics | Comments Off on Post-Keynesian Syll vs. New Keynesian Krugman & Co.

Enter Lars Syll, an economist at Malmö University in Sweden who specializes in economic philosophy and methodology and seems to be a kind of Post-Keynesian, not a New Keynesian like Krugman and Wren-Lewis. Post-Keynesians have long tried to return to Keynes' original thought, particularly on uncertainty, and not to the disciples that followed.

Syll went hard after Wren-Lewis' post with one of his own titled, "Wren-Lewis Drivelling on Macroeconomics." Syll does not mince words: "Again, this self-congratulatory attitude. All macroeconomists share the same (mainstream neoclassical) basic theory, so when we discuss and argue it's only about which policy and model to choose. All the more or less licensed and shared models and policies are already there on the shelf and we just have to decide which one to pick for today's problem solving." Syll declares a pox on both the Classical and the New Keynesians, both of which, he says, use unrealistic and unrepresentative models.

Again, Krugman steps in to argue Wren-Lewis' point: In research, playing with models, even if they are unrealistic, can help clarify thinking. It's when you apply these models to policy issues that you usher politics into the game. Krugman makes the distinction between "gadgets," that is some "brilliantly silly" theoretical models he has found useful, and fundamentals. A gadget is an aid to thought; a "fundamental" is some inevitably inadequate representation of reality. Krugman ends with a sigh. "But I guess not everyone on the sensible side [his side, of course] of macro sees it that way. And that is a problem. A gadget is a gadget, and you should not let it define your field."

In a long post a day later, Syll describes himself as a heterodox economist (he now calls Krugman a “sort-kinda New Keynesian”) who draws his inspiration from Keynes, but not from those who followed him and tried to formalize his thinking, like John Hicks and his IS-LM model, which Krugman defended as an aid in his work on liquidity traps. Again, it’s all about the models–it’s all about how we even begin to think about the foundations, or the microfoundations, of economics. “On most macroeconomic policy discussions I find myself in agreement with Krugman,” writes Syll. “To me that just shows Krugman is right in spite of and not thanks to those models he ultimately refers to. When he is discussing austerity measures, ricardian equivalence or problems with the euro, he is actually not using those models, but rather simpler and more adequate and relevant thought constructions in the vein of Keynes … A gadget is just a gadget–and brilliantly silly models do not help us working with the fundamental issue of modern economics.”

OK, for anyone without an economics degree, this back-and-forth may seem like the nattering of some lost tribe in the jungle: Hicks and Keynes; Lucas and Krugman; views of dynamic stochastic general equilibrium models or the Dixit-Stiglitz model of monopolistic competition; gadgets and fundamentals; New Keynesians and Post-Keynesians; and a dash of intertemporal optimization. Where will it all end? Again, while the debate gets quickly technical, the larger question is sitting right there. What kind of endeavor is economics? Is it a science like physics? What are its limits? Can we accurately model human behavior? How much certainty can we reasonably expect from economists? This is not just ivory tower philosophizing. We live in an age when economics, for better or worse, rules. Nations rise and fall because of economics. Elections hang on judgments of economists, and vast forces are either restrained or unleashed (look at British austerity policies). They are the technocratic agents par excellence; they even get op-ed columns in major newspapers. It is useful to recall these debates over methodologies (which continue unabated, with newish posts from nearly everyone) when economists declare this or that with ineffable certainty. It may be wonkish, but it may be the most important debate of all.

Robert Teitelman/The Deal Pipeline

Krugman on math and models in economics

19 November, 2013 at 23:03 | Posted in Economics | 4 Comments

Paul Krugman had a post up on his blog a while ago where he argued that "Keynesian" macroeconomics more than anything else "made economics the model-oriented field it has become." In Krugman's eyes, Keynes was a "pretty klutzy modeler," and it was only thanks to Samuelson's famous 45-degree diagram and Hicks's IS-LM that things got into place. Although admitting that economists have a tendency to use "excessive math" and "equate hard math with quality," he still vehemently defends — and always has — the mathematization of economics:

I’ve seen quite a lot of what economics without math and models looks like — and it’s not good.

Sure, “New Keynesian” economists like Krugman — and their forerunners, “Keynesian” economists like Paul Samuelson and the young John Hicks — certainly have contributed to making economics more mathematical and “model-oriented.”

But if these math-is-the-message modelers aren't able to show that the mechanisms or causes they isolate and handle in their mathematically formalized macromodels are stable — in the sense that they do not change when we "export" them to our "target systems" — these mathematical models only hold under ceteris paribus conditions and are consequently of limited value for our understanding, explanation and prediction of real economic systems. Or as the eminently quotable Keynes wrote already in the Treatise on Probability (1921):

The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be [that] the system of the material universe must consist of bodies … such that each of them exercises its own separate, independent, and invariable effect, a change of the total state being compounded of a number of separate changes each of which is solely due to a separate portion of the preceding state … Yet there might well be quite different laws for wholes of different degrees of complexity, and laws of connection between complexes which could not be stated in terms of laws connecting individual parts … If different wholes were subject to different laws qua wholes and not simply on account of and in proportion to the differences of their parts, knowledge of a part could not lead, it would seem, even to presumptive or probable knowledge as to its association with other parts … These considerations do not show us a way by which we can justify induction … (p. 427) No one supposes that a good induction can be arrived at merely by counting cases. The business of strengthening the argument chiefly consists in determining whether the alleged association is stable, when accompanying conditions are varied … (p. 468) In my judgment, the practical usefulness of those modes of inference … on which the boasted knowledge of modern science depends, can only exist … if the universe of phenomena does in fact present those peculiar characteristics of atomism and limited variety which appears more and more clearly as the ultimate result to which material science is tending.

According to Keynes, science should help us penetrate "the true process of causation lying behind current events" and disclose "the causal forces behind the apparent facts." We should look out for causal relations. But models — mathematical, econometric, or what have you — can never be more than a starting point in that endeavour. There is always the possibility that there are other (non-quantifiable) variables — of vital importance, and, though perhaps unobservable and non-additive, not necessarily epistemologically inaccessible — that were not included in the formalized mathematical model.

These fundamental and radical problems are akin to those Keynes talked about when he launched his critique against the “atomistic fallacy” already in the 1920s:

The atomic hypothesis which has worked so splendidly in Physics breaks down in Psychics. We are faced at every turn with the problems of Organic Unity, of Discreteness, of Discontinuity – the whole is not equal to the sum of the parts, comparisons of quantity fail us, small changes produce large effects, the assumptions of a uniform and homogeneous continuum are not satisfied. Thus the results of Mathematical Psychics turn out to be derivative, not fundamental, indexes, not measurements, first approximations at the best; and fallible indexes, dubious approximations at that, with much doubt added as to what, if anything, they are indexes or approximations of.

The kinds of laws and relations that “modern” economics has established are laws and relations about mathematically formalized entities in models that presuppose causal mechanisms to be atomistic and additive. When causal mechanisms operate in real-world social target systems, they do so only in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so (as a rule) only because we engineered them for that purpose. Outside man-made mathematical-statistical “nomological machines” they are rare, or even non-existent. Unfortunately that also makes most of contemporary mainstream neoclassical endeavours of mathematical economic modeling rather useless. And that also goes for Krugman and the rest of the “New Keynesian” family.

A day of glory — we showed them what could be done

19 November, 2013 at 08:33 | Posted in Politics & Society | Comments Off on A day of glory — we showed them what could be done

 

Krugman & Co. — totally flabbergasting neoclassical apologetics

18 November, 2013 at 21:01 | Posted in Economics | 6 Comments

Is academic (mainstream neoclassical) macroeconomics flourishing? “New Keynesian” macroeconomist Simon Wren-Lewis had a post up not that long ago on his blog, answering the question affirmatively:

Consider monetary policy. I would argue that we have made great progress in both the analysis and practice of monetary policy over the last forty years … However, it has to be acknowledged that policymakers who look at the evidence day in and day out believe that New Keynesian theory is the most useful framework currently around. I have no problem with academics saying ‘I know this is the consensus, but I think it is wrong’. However to say ‘the jury is still out’ on whether prices are sticky is wrong. The relevant jury came to a verdict long ago …

It is obvious that when it comes to using fiscal policy in short term macroeconomic stabilisation there can be no equivalent claim to progress or consensus. The policy debates we have today do not seem to have advanced much since when Keynes was alive …

What has been missing with fiscal policy has been the equivalent of central bank economists whose job depends on taking an objective view of the evidence and doing the best they can with the ideas that academic macroeconomics provides…

The contrast between monetary and fiscal policy tells us that this failure is not an inevitable result of the paucity of evidence in macroeconomics. I think it has a lot more to do with the influence of ideology …

And today another sorta-kinda “New Keynesian” — Paul Krugman — has a post up arguing that the problem with the academic profession is that some macroeconomists aren’t “bothered to actually figure out” how the New Keynesian model with its Euler conditions — “based on the assumption that people have perfect access to capital markets, so that they can borrow and lend at the same rate” — really works. According to Krugman, this shouldn’t be hard at all — “at least it shouldn’t be for anyone with a graduate training in economics.”

Hmm. This doesn’t seem convincing at all.

1 So, according to Wren-Lewis, macroeconomics has really made progress on monetary issues thanks to central bank economists, and because we have finally been able to conclude that wages are “sticky”. Wow! That’s really impressive! (And the earth still isn’t flat?) Keynes argued that conclusively more than 75 years ago …

2 And are central bank economists really “taking an objective view”? How is it even possible to entertain that thought today, when the “relevant jury” has for at least four years known that these guys were to a large extent the culprits of the latest financial and economic crisis? Devoted Ayn Randian Alan Greenspan “taking an objective view”? I’ll be dipped!

3 Is ideology only playing a role when it comes to fiscal policies? Hard to believe. As Gunnar Myrdal argued already 80 years ago, ideology permeates the work of all economists. Whether they are into monetary or fiscal policies is immaterial.

4 And — perhaps most disturbing of all — in both Krugman and Wren-Lewis we see a rather unbecoming self-congratulatory attitude, according to which all macroeconomists (allegedly) share the same basic mainstream neoclassical theory, so when we discuss and argue it’s only about which policy and model to choose. All the more or less licensed and shared models and policies are already there on the shelf and we just have to decide which one — with or without Euler conditions — to pick for today’s monetary or fiscal problem solving. This is of course nothing but pure nonsense. Today’s macroeconomic debates are about so much more than selecting models and giving policy advice. And I think this is obvious to all who look further than the seminar rooms of the economics departments at Oxford University or MIT.

Why statistics can never be the most important part of science

18 November, 2013 at 15:49 | Posted in Statistics & Econometrics | Comments Off on Why statistics can never be the most important part of science

‘There’s so much that goes on with data that is about computing, not statistics. I do think it would be fair to consider statistics (which includes sampling, experimental design, and data collection as well as data analysis (which itself includes model building, visualization, and model checking as well as inference)) as a subset of data science. . . .

The tech industry has always had to deal with databases and coding; that stuff is a necessity. The statistical part of data science is more of an option.

To put it another way: you can do tech without statistics but you can’t do it without coding and databases.’

This came up because I was at a meeting the other day … where people were discussing how statistics fits into data science. Statistics is important—don’t get me wrong—statistics helps us correct biases from nonrandom samples (and helps us reduce the bias at the sampling stage), statistics helps us estimate causal effects from observational data (and helps us collect data so that causal inference can be performed more directly), statistics helps us regularize so that we’re not overwhelmed by noise (that’s one of my favorite topics!), statistics helps us fit models, statistics helps us visualize data and models and patterns. Statistics can do all sorts of things. I love statistics! But it’s not the most important part of data science, or even close.

Andrew Gelman

Krugman and the rest of the family

18 November, 2013 at 15:33 | Posted in Economics | Comments Off on Krugman and the rest of the family

I simply can’t let Krugman … get away with writing off a large part of contemporary economic discourse (not to mention of the history of economic thought) and with his declaration that Larry Summers has “laid down what amounts to a very radical manifesto” (not to mention the fact that I was forced to waste the better part of a quarter of an hour this morning listening to Summers’s talk in honor of Stanley Fischer at the IMF Economic Forum, during which he announces that he’s finally discovered the possibility that the current level of economic stagnation may persist for some time).

Krugman may want to curse Summers out of professional jealousy. Me, I want to curse the lot of them—not only the MIT family but mainstream economists generally—for their utter cluelessness when it comes to making sense of (and maybe, eventually, actually doing something about) the current crises of capitalism.

So, what is he up to? Basically, Krugman showers Summers in lavish praise for his belated, warmed-over, and barely intelligible argument that attains what little virtue it has about the economic challenges we face right now by vaguely resembling the most rudimentary aspects of what people who read and build on the ideas of Marx, Kalecki, Minsky, and others have been saying and writing for years. The once-and-former-failed candidate for head of the Federal Reserve begins with the usual mainstream conceit that they successfully solved the global financial crash of 2008 and that current economic events bear no resemblance to the First Great Depression. But then reality sinks in: since in their models the real interest-rate consistent with full employment is currently negative (and therefore traditional monetary policy doesn’t amount to much more than pushing on a string), we may be in for a rough ride (with high output gaps and persistent unemployment) for some unknown period of time. And, finally, an admission that the conditions for this “secular stagnation” may actually have characterized the years of bubble and bust leading up to the crisis of 2007-08.

That’s where Krugman chimes in, basking in the glow of his praise for Summers, expressing for the umpteenth time the confidence that his simple Keynesian model of the liquidity trap and zero lower bound has been vindicated. The problem is, Summers can’t even give Alvin Hansen, the first American economist to explicate and domesticate Keynes’s ideas, and the one who first came up with the idea of secular stagnation based on the Bastard Keynesian IS-LM model, his due … I guess it’s simply too much to expect they actually recognize, read, and learn from other traditions within economics …

And things only go down from there. Because the best Summers and Krugman can do by way of attempting to explain the possibility of secular stagnation is not to analyze the problems embedded in and created by existing economic institutions but, instead, to invoke that traditional deus ex machina, demography.

“Now look forward. The Census projects that the population aged 18 to 64 will grow at an annual rate of only 0.2 percent between 2015 and 2025. Unless labor force participation not only stops declining but starts rising rapidly again, this means a slower-growth economy, and thanks to the accelerator effect, lower investment demand.”

You would think that a decent economist, not even a particularly left-wing one, might be able to imagine the possibility that a labor shortage might cause higher real wages, which might have myriad other effects, many of them really, really good—not only for people who continue to be forced to have the freedom to sell their ability to work but also for their families, their neighbors, and for lots of other participants in the economy. But, apparently, stagnant wages (never mind supply-and-demand) are just as “natural” as Wicksell’s natural interest rate …

It is no wonder, then, that mainstream economists—even the best of them—are so painfully inarticulate and hamstrung when it comes to making sense of the current economic malaise.

I’ll admit, it wouldn’t be so bad if it was just a matter of professional jealousy and their not being able to analyze what is going on except through the workings of a small number of familiar assumptions and models. They talk as if it’s only their academic reputations that are on the line. But we can’t forget there are millions and millions of people, young and old, in the United States and around the world, whose lives hang in the balance—well-intentioned and hard-working people who are being made to pay the costs of economists like Krugman attempting to keep things all in the family.

David Ruccio (Professor of Economics)

Top 10 Heterodox Economics Books

17 November, 2013 at 17:10 | Posted in Economics | 11 Comments


  • Karl Marx, Das Kapital (1867)
  • John Kenneth Galbraith, The Affluent Society (1958)
  • Piero Sraffa, Production of Commodities by Means of Commodities (1960)
  • Nicholas Georgescu-Roegen, The Entropy Law and the Economic Process (1971)
  • Michal Kalecki, Selected Essays on the Dynamics of the Capitalist Economy (1971)
  • Paul Davidson, Money and the Real World (1972)
  • Hyman Minsky, John Maynard Keynes (1975)
  • Tony Lawson, Economics and Reality (1997)
  • Steve Keen, Debunking Economics (2001)
  • John Quiggin, Zombie Economics (2010)

John Tavener (1944 — 2013)

17 November, 2013 at 11:07 | Posted in Varia | Comments Off on John Tavener (1944 — 2013)

 

Transmogrification of truth in economics textbooks — expected utility theory

16 November, 2013 at 15:09 | Posted in Economics | 7 Comments

Although expected utility theory is obviously both theoretically and descriptively inadequate, colleagues and microeconomics textbook writers all over the world gladly continue to use it, as though its deficiencies were unknown or unheard of.

Not even Robert Frank — in one of my favourite intermediate textbooks on microeconomics — manages to get it quite right on this issue:

As a general rule, human nature obviously prefers certainty to risk. At the same time, however, risk is an inescapable part of the environment. People naturally want the largest possible gain and the smallest possible risk, but most of the time we are forced to trade risk and gain off against one another. When choosing between two risky alternatives, we are forced to recognize this trade-off explicitly. In such cases, we cannot escape the cognitive effort required to reach a sensible decision. But when one of the alternatives is riskless, it is often easier simply to choose it and not waste too much effort on the decision. What this pattern of behavior fails to recognize, however, is that choosing a sure win of $30 over an 80 percent chance to win $45 does precious little to reduce any of the uncertainty that really matters in life.

On the contrary, when only small sums of money are at stake, a compelling case can be made that the only sensible strategy is to choose the alternative with the highest expected value. The argument for this strategy … rests on the law of large numbers. Here, the law tells us that if we take a large number of independent gambles and pool them, we can be very confident of getting almost exactly the sum of their expected values. As a decision maker, the trick is to remind yourself that each small risky choice is simply part of a much larger collection. After all, it takes the sting out of an occasional small loss to know that following any other strategy would have led to a virtually certain large loss.

To illustrate, consider again the choice between the sure gain of $30 and the 80 percent chance to win $45, and suppose you were confronted with the equivalent of one such choice each week. Recall that the gamble has an expected value of $36, $6 more than the sure thing. By always choosing the “risky” alternative, your expected gain — over and beyond the gain from the sure alternative — will be $312 each year. Students who have had an introductory course in probability can easily show that the probability you would have come out better by choosing the sure alternative in any year is less than 1 percent. The long-run opportunity cost of following a risk-averse strategy for decisions involving small outcomes is an almost sure LOSS of considerable magnitude. By thinking of your problem as that of choosing a policy for dealing with a large number of choices of the same type, a seemingly risky strategy is transformed into an obviously very safe one.
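Frank’s arithmetic, at least, checks out. Here is a minimal sketch in Python (my own verification, not Frank’s; I assume one independent choice per week over a 52-week year, as his example suggests):

```python
from math import comb

n, p = 52, 0.8            # one choice per week; each gamble pays $45 with 80% odds
sure, prize = 30, 45

# Expected weekly edge of the gamble over the sure thing: 0.8*45 - 30 = $6
extra_per_year = n * (p * prize - sure)
print(f"expected extra gain per year: ${extra_per_year:.0f}")      # $312

# The sure-thing strategy yields 52*30 = $1560 with certainty. Gambling every
# week beats it unless fewer than 35 of the 52 gambles pay off (45*34 < 1560).
p_sure_better = sum(comb(n, w) * p**w * (1 - p)**(n - w)
                    for w in range(n + 1)
                    if prize * w < n * sure)
print(f"P(the sure-thing strategy ends the year ahead): {p_sure_better:.4f}")
```

Running this confirms the $312 figure and puts the probability of the risk-averse strategy coming out ahead well below one percent, just as Frank claims.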

What Frank — and other textbook authors — tries to do in the face of the obvious behavioural inadequacies of expected utility theory is to mend it marginally. But that cannot be the right attitude when facing scientific anomalies. When models are plainly wrong, you’d better replace them! As Matthew Rabin and Richard Thaler have it in Risk Aversion:

It is time for economists to recognize that expected utility is an ex-hypothesis, so that we can concentrate our energies on the important task of developing better descriptive models of choice under uncertainty.

If a friend of yours offered you a gamble on the toss of a coin where you could lose €100 or win €200, would you accept it? Probably not. But if you were offered one hundred such bets, you would probably be willing to accept, since most of us see that the aggregated gamble of one hundred 50–50 lose €100/gain €200 bets has an expected return of €5000 (and a quick probabilistic calculation – see the sketch below – shows that there is only a 0.04% risk of losing any money).

Unfortunately – at least if you want to adhere to the standard neoclassical expected utility maximization theory – you are then considered irrational! A neoclassical utility maximizer that rejects the single gamble should also reject the aggregate offer.
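Both the arithmetic and the consistency claim are easy to check numerically. Below is a minimal sketch: the binomial computation follows directly from the setup, while the CARA utility function u(w) = −exp(−aw) and the curvature a = 0.006 are my own assumptions, chosen because CARA’s risk attitudes do not depend on wealth, which is exactly what makes the single-bet/aggregate-bet link watertight:

```python
from math import comb, exp

# One hundred independent 50-50 bets: lose 100 or gain 200 (euros).
n = 100
expected = n * (0.5 * 200 - 0.5 * 100)        # 100 * 50 = 5000
# The net result with w winning bets is 200*w - 100*(n - w) = 300*w - 10000,
# which is negative exactly when w <= 33.
p_lose_money = sum(comb(n, w) * 0.5**n for w in range(34))
print(f"expected return: {expected}, P(losing money): {p_lose_money:.4%}")

# With CARA utility u(w) = -exp(-a*w), independence gives
# E[u(sum of n bets)] = -(E[exp(-a*X)])^n, so an agent who rejects one bet
# (i.e. E[exp(-a*X)] > 1) must reject the whole bundle as well.
a = 0.006                                      # hypothetical curvature
m = 0.5 * exp(a * 100) + 0.5 * exp(-a * 200)   # E[exp(-a*X)] for a single bet
print(f"rejects the single bet: {m > 1}; rejects all hundred: {m**100 > 1}")
```

Run as-is, this reproduces the €5000 expected return, the roughly 0.04% chance of losing money, and the awkward consistency requirement: whoever turns down one bet on these terms has to turn down the hundred.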

In his modern classic Risk Aversion and Expected-Utility Theory: A Calibration Theorem, Matthew Rabin writes:

Using expected-utility theory, economists model risk aversion as arising solely because the utility function over wealth is concave. This diminishing-marginal-utility-of-wealth theory of risk aversion is psychologically intuitive, and surely helps explain some of our aversion to large-scale risk: We dislike vast uncertainty in lifetime wealth because a dollar that helps us avoid poverty is more valuable than a dollar that helps us become very rich.

Yet this theory also implies that people are approximately risk neutral when stakes are small. Arrow (1971, p. 100) shows that an expected-utility maximizer with a differentiable utility function will always want to take a sufficiently small stake in any positive-expected-value bet. That is, expected-utility maximizers are (almost everywhere) arbitrarily close to risk neutral when stakes are arbitrarily small. While most economists understand this formal limit result, fewer appreciate that the approximate risk-neutrality prediction holds not just for negligible stakes, but for quite sizable and economically important stakes. Economists often invoke expected-utility theory to explain substantial (observed or posited) risk aversion over stakes where the theory actually predicts virtual risk neutrality. While not broadly appreciated, the inability of expected-utility theory to provide a plausible account of risk aversion over modest stakes has become oral tradition among some subsets of researchers, and has been illustrated in writing in a variety of different contexts using standard utility functions.

In this paper, I reinforce this previous research by presenting a theorem which calibrates a relationship between risk attitudes over small and large stakes. The theorem shows that, within the expected-utility model, anything but virtual risk neutrality over modest stakes implies manifestly unrealistic risk aversion over large stakes. The theorem is entirely “non-parametric”, assuming nothing about the utility function except concavity. In the next section I illustrate implications of the theorem with examples of the form “If an expected-utility maximizer always turns down modest-stakes gamble X, she will always turn down large-stakes gamble Y.” Suppose that, from any initial wealth level, a person turns down gambles where she loses $100 or gains $110, each with 50% probability. Then she will turn down 50-50 bets of losing $1,000 or gaining any sum of money. A person who would always turn down 50-50 lose $1,000/gain $1,050 bets would always turn down 50-50 bets of losing $20,000 or gaining any sum. These are implausible degrees of risk aversion. The theorem not only yields implications if we know somebody will turn down a bet for all initial wealth levels. Suppose we knew a risk-averse person turns down 50-50 lose $100/gain $105 bets for any lifetime wealth level less than $350,000, but knew nothing about the degree of her risk aversion for wealth levels above $350,000. Then we know that from an initial wealth level of $340,000 the person will turn down a 50-50 bet of losing $4,000 and gaining $635,670.

The intuition for such examples, and for the theorem itself, is that within the expected-utility framework turning down a modest-stakes gamble means that the marginal utility of money must diminish very quickly for small changes in wealth. For instance, if you reject a 50-50 lose $10/gain $11 gamble because of diminishing marginal utility, it must be that you value the 11th dollar above your current wealth by at most 10/11 as much as you valued the 10th-to-last-dollar of your current wealth.

Iterating this observation, if you have the same aversion to the lose $10/gain $11 bet if you were $21 wealthier, you value the 32nd dollar above your current wealth by at most 10/11 × 10/11 ≈ 5/6 as much as your 10th-to-last dollar. You will value your 220th dollar by at most 3/20 as much as your last dollar, and your 880th dollar by at most 1/2000 of your last dollar. This is an absurd rate for the value of money to deteriorate — and the theorem shows the rate of deterioration implied by expected-utility theory is actually quicker than this. Indeed, the theorem is really just an algebraic articulation of how implausible it is that the consumption value of a dollar changes significantly as a function of whether your lifetime wealth is $10, $100, or even $1,000 higher or lower. From such observations we should conclude that aversion to modest-stakes risk has nothing to do with the diminishing marginal utility of wealth.

Expected-utility theory seems to be a useful and adequate model of risk aversion for many purposes, and it is especially attractive in lieu of an equally tractable alternative model. “Extremely concave expected utility” may even be useful as a parsimonious tool for modeling aversion to modest-scale risk. But this and previous papers make clear that expected-utility theory is manifestly not close to the right explanation of risk attitudes over modest stakes. Moreover, when the specific structure of expected-utility theory is used to analyze situations involving modest stakes — such as in research that assumes that large-stake and modest-stake risk attitudes derive from the same utility-for-wealth function — it can be very misleading. In the concluding section, I discuss a few examples of such research where the expected-utility hypothesis is detrimentally maintained, and speculate very briefly on what set of ingredients may be needed to provide a better account of risk attitudes. In the next section, I discuss the theorem and illustrate its implications …

Expected-utility theory makes wrong predictions about the relationship between risk aversion over modest stakes and risk aversion over large stakes. Hence, when measuring risk attitudes maintaining the expected-utility hypothesis, differences in estimates of risk attitudes may come from differences in the scale of risk comprising data sets, rather than from differences in risk attitudes of the people being studied. Data sets dominated by modest-risk investment opportunities are likely to yield much higher estimates of risk aversion than data sets dominated by larger-scale investment opportunities. So not only are standard measures of risk aversion somewhat hard to interpret given that people are not expected-utility maximizers, but even attempts to compare risk attitudes so as to compare across groups will be misleading unless economists pay due attention to the theory’s calibrational problems …

Indeed, what is empirically the most firmly established feature of risk preferences, loss aversion, is a departure from expected-utility theory that provides a direct explanation for modest-scale risk aversion. Loss aversion says that people are significantly more averse to losses relative to the status quo than they are attracted by gains, and more generally that people’s utilities are determined by changes in wealth rather than absolute levels. Preferences incorporating loss aversion can reconcile significant small-scale risk aversion with reasonable degrees of large-scale risk aversion … Variants of this or other models of risk attitudes can provide useful alternatives to expected-utility theory that can reconcile plausible risk attitudes over large stakes with non-trivial risk aversion over modest stakes.
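Rabin’s theorem is non-parametric, but its flavour is easy to reproduce with a concrete concave utility function. The sketch below uses CRRA utility and a wealth level of $300,000 — both assumptions of mine, made only for the illustration, since the theorem itself assumes nothing but concavity. It finds the curvature needed to reject the lose $100/gain $110 bet at that wealth, and then shows that the same curvature makes the agent turn down a 50-50 lose $1,000 bet no matter how astronomical the prize:

```python
def crra(r, rho):
    """CRRA utility of the wealth ratio r = w / w0 (assumes rho > 1)."""
    return r ** (1.0 - rho) / (1.0 - rho)

def bet_utility(w0, loss, gain, rho):
    """Expected utility of a 50-50 lose-`loss`/gain-`gain` bet at wealth w0."""
    return 0.5 * crra(1.0 - loss / w0, rho) + 0.5 * crra(1.0 + gain / w0, rho)

w0 = 300_000.0   # hypothetical lifetime wealth

# Bisect for the curvature at which the agent just rejects lose $100/gain $110.
lo, hi = 1.001, 2000.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if bet_utility(w0, 100, 110, mid) > crra(1.0, mid):
        lo = mid        # the bet is still accepted: more curvature needed
    else:
        hi = mid
rho = hi
print(f"lose $100/gain $110 is rejected once rho exceeds about {rho:.0f}")

# With that curvature, no finite prize compensates a 50-50 risk of losing $1,000.
for gain in (1e4, 1e6, 1e9, 1e15):
    accepts = bet_utility(w0, 1000, gain, rho) > crra(1.0, rho)
    print(f"lose $1,000 / gain ${gain:,.0f}: accepts = {accepts}")
```

The choice of CRRA is incidental: any concave utility calibrated to reject the small gamble at all relevant wealth levels behaves this way, which is precisely Rabin’s point about the absurd rate at which expected-utility theory forces the marginal utility of money to deteriorate.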

In a similar vein, Daniel Kahneman writes — in Thinking, Fast and Slow — that expected utility theory is seriously flawed since it doesn’t take into consideration the basic fact that people’s choices are influenced by changes in their wealth. Where standard microeconomic theory assumes that preferences are stable over time, Kahneman and other behavioural economists have again and again forcefully shown that preferences aren’t fixed, but vary with different reference points. How can a theory that doesn’t allow for people having different reference points from which they consider their options have an almost axiomatic status within economic theory?

The mystery is how a conception of the utility of outcomes that is vulnerable to such obvious counterexamples survived for so long. I can explain it only by a weakness of the scholarly mind … I call it theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking it is extraordinarily difficult to notice its flaws … You give the theory the benefit of the doubt, trusting the community of experts who have accepted it … But they did not pursue the idea to the point of saying, “This theory is seriously wrong because it ignores the fact that utility depends on the history of one’s wealth, not only present wealth.”

On a more economic-theoretical level, information theory – and especially the so-called Kelly criterion – also highlights the problems with the neoclassical theory of expected utility.

Suppose I want to play a game. Let’s say we are tossing a coin. If heads comes up, I win a dollar, and if tails comes up, I lose a dollar. Suppose further that I believe I know that the coin is asymmetrical and that the probability of getting heads (p) is greater than 50% – say 60% (0.6) – while the bookmaker assumes that the coin is totally symmetric. How much of my bankroll (T) should I optimally invest in this game?

A strict neoclassical utility-maximizing economist would suggest that my goal should be to maximize the expected value of my bankroll (wealth), and according to this view, I ought to bet my entire bankroll.

Does that sound rational? Most people would answer no to that question. The risk of losing is so high that – the expected number of games until my first loss being 1/(1 – p), which here equals 2.5 – I would most likely be bankrupt within a handful of games. The expected-value maximizing economist does not seem to have a particularly attractive approach.

So what’s the alternative? One possibility is to apply the so-called Kelly strategy – named after the American physicist and information theorist John L. Kelly, who in the article A New Interpretation of Information Rate (1956) suggested this criterion for optimizing the size of a bet – under which the optimum is to invest a specific fraction (x) of wealth (T) in each game. How do we arrive at this fraction?

When I win, my bankroll grows to (1 + x) times its previous size, and when I lose it shrinks to (1 – x) times its previous size. After n rounds, of which I have won v and lost n – v, my new bankroll (W) is

(1) W = (1 + x)^v (1 – x)^(n – v) T

The bankroll increases multiplicatively – “compound interest” – and the long-term average growth rate of my wealth can then easily be calculated by taking the logarithm of (1), which gives

(2) log (W/T) = v log (1 + x) + (n – v) log (1 – x).

If we divide both sides by n we get

(3) [log (W / T)] / n = [v log (1 + x) + (n – v) log (1 – x)] / n

The left-hand side now represents the average growth rate (g) in each game. On the right-hand side, the ratio v/n equals the fraction of bets that I won, and when n is large this fraction will be close to p. Similarly, (n – v)/n is close to (1 – p). When the number of bets is large, the average growth rate is

(4) g = p log (1 + x) + (1 – p) log (1 – x).

Now we can easily determine the value of x that maximizes g:

(5) d [p log (1 + x) + (1 – p) log (1 – x)]/d x = p/(1 + x) – (1 – p)/(1 – x) =>
p/(1 + x) – (1 – p)/(1 – x) = 0 =>

(6) x = p – (1 – p)

Since p is the probability that I will win, and (1 – p) is the probability that I will lose, the Kelly strategy says that to optimize the growth rate of your bankroll (wealth) you should invest a fraction of the bankroll equal to the difference between the probabilities of winning and losing. In our example, this means that in each game I should bet the fraction x = 0.6 – (1 – 0.6) = 0.2 – that is, 20% of my bankroll. The optimal average growth rate becomes

(7) g = 0.6 log (1.2) + 0.4 log (0.8) ≈ 0.02 (using natural logarithms).

If I bet 20% of my wealth on each toss of the coin, I will after 10 games on average have 1.02^10 (≈ 1.22) times more than when I started.

This game strategy will give us an outcome in the long run that is better than if we use a strategy built on the neoclassical economic theory of choice under uncertainty (risk) – expected value maximization. If we bet all our wealth in each game we will most likely lose our fortune, but because with low probability we end up with a very large fortune, the expected value is still high. For a real-life player – for whom there is very little to gain from this kind of ensemble average – it is more relevant to look at the time average of what he may be expected to win (in our game the two averages coincide only if we assume that the player has a logarithmic utility function). What good does it do me if my tossing the coin maximizes an expected value when I might have gone bankrupt after four games played? If I try to maximize the expected value, the probability of bankruptcy soon gets close to one. Better then to invest 20% of my wealth in each game and maximize my long-term average wealth growth!
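The difference between ensemble average and time average is easy to see in a small simulation. The sketch below is my own, using the example’s numbers (p = 0.6, even-money payoffs, ten games, 100,000 simulated players); it compares betting the whole bankroll each game with betting the Kelly fraction:

```python
import math
import random

random.seed(1)
p, n_games, n_players = 0.6, 10, 100_000

def final_bankroll(frac):
    """Bankroll after n_games, staking `frac` of the bankroll on each toss."""
    w = 1.0
    for _ in range(n_games):
        w *= (1 + frac) if random.random() < p else (1 - frac)
    return w

kelly = p - (1 - p)                                    # x = 0.2
g = p * math.log(1 + kelly) + (1 - p) * math.log(1 - kelly)
print(f"Kelly fraction {kelly:.1f}, growth rate g = {g:.4f}")   # roughly 0.02

all_in = [final_bankroll(1.0) for _ in range(n_players)]
kelly_bets = [final_bankroll(kelly) for _ in range(n_players)]

print(f"all-in: theoretical ensemble mean {1.2 ** n_games:.2f}, "
      f"simulated mean {sum(all_in) / n_players:.2f}, "
      f"bankrupt {sum(w == 0 for w in all_in) / n_players:.1%}")
print(f"Kelly : simulated mean {sum(kelly_bets) / n_players:.2f}, "
      f"median {sorted(kelly_bets)[n_players // 2]:.2f}")
```

The all-in strategy’s high expected value is carried entirely by the roughly 0.6% of “parallel” players who win all ten tosses; everyone else is bankrupt. The typical Kelly player, by contrast, ends up close to the 1.22 factor computed above.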

When applied to the neoclassical theory of expected utility, one thinks in terms of “parallel universes” and asks what the expected return of an investment is, calculated as an average over those parallel universes. In our coin toss example, it is as if one supposes that numerous copies of “me” are tossing the coin, and that the losses of most of them are offset by the huge profits of a lucky few. But this ensemble average does not work for an individual, for whom a time average better reflects the experience made in the one, non-parallel universe in which we actually live.

The Kelly strategy gives a more realistic answer: one thinks in terms of the only universe we actually live in, and asks what the expected return of an investment is, calculated as an average over time.

Since we cannot go back in time – entropy and the “arrow of time” make this impossible – and the bankruptcy option is always at hand (extreme events and “black swans” are always possible), we have nothing to gain from thinking in terms of ensembles.

Actual events unfold in a fixed temporal order, and they are often linked in a multiplicative process (as e.g. investment returns with “compound interest”) which is fundamentally non-ergodic.

Instead of arbitrarily assuming that people have a certain type of utility function – as in the neoclassical theory – the Kelly criterion shows that we can obtain a less arbitrary and more accurate picture of real people’s decisions and actions by basically assuming that time is irreversible. When the bankroll is gone, it’s gone. The fact that in a parallel universe it could conceivably have been refilled is of little comfort to those who live in the one and only possible world that we call the real world.

Our coin toss example can be applied to more traditional economic issues. If we think of an investor, we can basically describe his situation in terms of our coin toss. What fraction (x) of his assets (T) should an investor – who is about to make a large number of repeated investments – bet on his conviction that he can evaluate an investment better (p = 0.6) than the market (p = 0.5)? The greater the x, the greater the leverage – but also the greater the risk. Since p is the probability that his investment valuation is correct and (1 – p) the probability that the market’s valuation is correct, the Kelly strategy says that he optimizes the growth rate of his investments by investing a fraction of his assets equal to the difference between these probabilities. In our example this means that at each investment opportunity he should invest the fraction x = 0.6 – (1 – 0.6) = 0.2, i.e. 20% of his assets. The optimal average growth rate of his investments is then, as in (7), about 2% per round (0.6 log (1.2) + 0.4 log (0.8) ≈ 0.02).

Kelly’s criterion shows that because we cannot go back in time, we should not take excessive risks. High leverage increases the risk of bankruptcy. This should also be a warning for the financial world, where the constant quest for greater and greater leverage – and risks – creates extensive and recurrent systemic crises. A more appropriate level of risk-taking is a necessary ingredient in any policy aiming to curb excessive risk-taking.

The works of people like Rabin, Thaler, Kelly, and Kahneman show that expected utility theory is indeed a transmogrification of truth. It’s an “ex-hypothesis” — or as Monty Python has it:

This parrot is no more! He has ceased to be! ‘E’s expired and gone to meet ‘is maker! ‘E’s a stiff! Bereft of life, ‘e rests in peace! If you hadn’t nailed ‘im to the perch ‘e’d be pushing up the daisies! ‘Is metabolic processes are now ‘istory! ‘E’s off the twig! ‘E’s kicked the bucket, ‘e’s shuffled off ‘is mortal coil, run down the curtain and joined the bleedin’ choir invisible!! THIS IS AN EX-PARROT!!

On causality and the dangers of eating ice cream …

15 November, 2013 at 17:24 | Posted in Statistics & Econometrics | 2 Comments

Reforming economics education

14 November, 2013 at 21:33 | Posted in Economics, Education & School | Comments Off on Reforming economics education


Evidence can be of many kinds, and different types of course have different requirements … For economics, data handling has featured prominently in the discussions; in addition, it is important for students to have an appreciation of where data come from, and how this may affect their quality — the strengths and weaknesses of key datasets. In substantive analyses attention should be directed more at what is driving the data, causally, than on quantification of estimates.

There are important sources of evidence other than statistical data, for example surveys, qualitative as well as quantitative. Evidence need not consist only of generalisations; specific events and case studies can also be instructive. So can descriptive studies, including on how particular sub-systems of the economy work. Secure knowledge comes from bringing different approaches together, and if necessary addressing any inconsistencies …

Behavioural economics is now part of the mainstream. This signifies a shift in emphasis towards how things actually happen, away from a focus on how they might happen in an ideal world …

Economic history is substantively important. It is the indispensable record of how economies actually behave, the particular structures they have, and how they change. In addition, economic historians pay a great deal of attention to the above-mentioned issues such as data quality, and the discipline encompasses specific events and narrative as well as quantitative methods and consideration of general historical processes.

The reader could be forgiven for responding that all this is straightforward commonsense. However, there is a problem … The unfortunate truth is that ‘mainstream’ economics still contains elements that are contradicted by the evidence, but which are presented as if they were true …

Falsity is not the same as simplification. For specific modelling purposes, a particular assumption may be useful, even though it is known to be a gross over-simplification. The classic case is rationality: it is clear that economic behaviour does not always accord with the tenets of classical rationality, but there may well be situations in which the rationality assumption is both useful and harmless. In such cases, it would be explicit that its inclusion is justified pragmatically, and should not be taken to imply that it is a realistic description of the actually-existing psychological process …

We need to encourage an attitude in our future economists that they are as aware of the many features and forces of real situations as they are of the simplified, and hopefully elegant, model. Thus, if a macro model omits the financial sector, implicitly assuming that it plays only a passive role, then this omission would be visible. Students need to have a broad and multi-faceted understanding of the real situation, so that they can see which elements have been selected for the model, and which have been omitted. Again it is a question of being able to see what is not there …

For the curriculum … a better balance is needed. In particular, the mutual dependence of theoretical categories and evidence should be emphasised. The discourse should focus less on competing theories and models in the abstract, and more on the relation between each of these theories/models and the evidence. Students should be judged as much on knowledge of the real world as on knowledge of models.

Michael Joffe/Royal Economic Society

