Probability and rationality — trickier than you may think

15 Oct, 2016 at 23:05 | Posted in Statistics & Econometrics | 40 Comments

The Coin-tossing Problem

My friend Ben says that on the first day he got the following sequence of Heads and Tails when tossing a coin:
H H H H H H H H H H

And on the second day he says that he got the following sequence:
H T T H H T T H T H

Which day's report makes you suspicious?
Most people I ask say the first day's report looks suspicious.

But actually both days are equally probable! Every time you toss a (fair) coin there is the same probability (50%) of getting H or T. Ben makes the same number of tosses on both days, and every particular sequence of ten tosses is equally probable!
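The arithmetic is easy to check: any one specific sequence of ten independent tosses of a fair coin has probability (1/2)^10 = 1/1024, the all-heads sequence no less than the mixed one. A minimal Python sketch (the two sequences are Ben's reports from above):

```python
from fractions import Fraction

def sequence_probability(seq, p_heads=Fraction(1, 2)):
    """Probability of one specific H/T sequence from independent tosses."""
    prob = Fraction(1)
    for toss in seq:
        prob *= p_heads if toss == "H" else 1 - p_heads
    return prob

day1 = "HHHHHHHHHH"  # Ben's first-day report
day2 = "HTTHHTTHTH"  # Ben's second-day report

# Both come out to exactly 1/1024.
print(sequence_probability(day1), sequence_probability(day2))
```

What makes the first report feel suspicious is not its probability as a particular sequence but that it belongs to the tiny class of highly 'orderly' sequences; a test of the coin's fairness has to be framed in terms of such classes (e.g. the number of heads), not individual sequences.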

The Linda Problem

Linda is 40 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which of the following two alternatives is more probable?

A. Linda is a bank teller.
B. Linda is a bank teller and active in the feminist movement.

‘Rationally,’ alternative B cannot be more likely than alternative A. Nonetheless, Amos Tversky and Daniel Kahneman — in ‘Judgments of and by representativeness,’ published in D. Kahneman, P. Slovic & A. Tversky (eds.), Judgment under Uncertainty: Heuristics and Biases (Cambridge, UK: Cambridge University Press, 1982) — reported that more than 80 percent of respondents said that it was.

Why do we make such ‘irrational’ judgments in both these cases? Tversky and Kahneman argued that in making this kind of judgment we seek the closest resemblance between causes and effects (in The Linda Problem, between Linda’s personality and her behaviour), rather than calculating probability, and that this makes alternative B seem preferable. By using a heuristic called representativeness, statement B in The Linda Problem seems more ‘representative’ of Linda based on the description of her, although from a probabilistic point of view it is clearly less likely.
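The conjunction rule itself, P(A and B) ≤ P(A), holds in any population whatsoever, since everyone satisfying the conjunction also satisfies the first conjunct. A small Monte Carlo sketch (the marginal rates below are made-up numbers for illustration only, not data about bank tellers or feminists):

```python
import random

random.seed(42)

# Hypothetical population; the rates 0.05 and 0.50 are illustrative assumptions.
people = [
    (random.random() < 0.05, random.random() < 0.50)  # (bank_teller, feminist)
    for _ in range(100_000)
]

n = len(people)
p_teller = sum(teller for teller, _ in people) / n
p_both = sum(teller and feminist for teller, feminist in people) / n

# The conjunction picks out a subset of the tellers, so the inequality
# holds for every draw, whatever rates are plugged in above.
print(p_teller, p_both, p_both <= p_teller)
```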

Microfoundational angels

14 Oct, 2016 at 17:15 | Posted in Economics | Comments Off on Microfoundational angels

Amongst the several problems/disadvantages of this current consensus is that, in order to make a rational expectations, micro-founded model mathematically and analytically tractable, it has been necessary in general to impose some (absurdly) simplifying assumptions, notably the existence of representative agents, who never default. This latter (nonsensical) assumption goes under the jargon term ‘the transversality condition.’

This makes all agents perfectly creditworthy. Over any horizon there is only one interest rate facing all agents, i.e. no risk premia. All transactions can be undertaken in capital markets; there is no role for banks. Since all IOUs are perfectly creditworthy, there is no need for money. There are no credit constraints. Everyone is angelic; there is no fraud; and this is supposed to be properly micro-founded!

Charles Goodhart

Swedish schools — a total breakdown

14 Oct, 2016 at 16:56 | Posted in Education & School | Comments Off on Swedish schools — a total breakdown

Since 2014, the share of top grades has doubled from 0.25 percent to today's 0.5 percent (just under 500 pupils). This share is almost 40 (!) times as large as the share with maximum grades (5.0) under the relative grading system of the early 1990s …

There are several problems with this development. When grades rise and more pupils hit the ceiling, selection for further studies becomes more difficult, and every single grade becomes decisive for whether a pupil ends up in the growing group of top-graders. It is also well known that knowledge levels fall when high grades are easy to get. An even bigger problem, however, is the lack of equivalence: the same level of knowledge can yield completely different grades depending on which school a pupil attends and which teacher the pupil has …

To any reasonably well-informed observer it should by now be clear that the Swedish grading system lacks all safeguards and that grades are rising without restraint. Despite this, no effective measures have been taken against either grade inflation or the lack of equivalence in grading. Cross-bloc agreements are often an advantage in school policy, but it is regrettable that the consensus never seems as strong as when it comes to letting the grading breakdown run its course.

Jonas Vlachos

As Swedish education research has been able to show for quite some time now, grade inflation is a large and serious problem for Swedish schools today. But unfortunately not the only one.

Year after year, alarming reports appear about how bad things are in Swedish schools. PISA and other studies show unequivocally that Swedish pupils are performing worse and worse. And those of us who work in the university world notice that our students increasingly lack the prior knowledge needed to pursue serious studies.

Year after year we see the desire to become a teacher decline. In the early 1980s there were almost eight applicants per place in primary-school teacher training. Today there is one applicant per place. This is a social catastrophe that we ought to be talking about. In a world where everything depends on knowledge, making the teaching profession attractive again is, in the long run, crucial for the Swedish economy.

Year after year we see teachers' salaries being eroded. A couple of years ago the OECD presented a report claiming to show that successful school nations tend to prioritize high teacher salaries. Teacher salaries as a share of GDP per capita are substantially lower in Sweden than in the countries at the top of the PISA studies.

Year after year we see inequality increasing in many areas, not least in incomes and wealth. Differences in life conditions between groups with respect to class, ethnicity and gender are unacceptably large.

Year after year we can observe that in the world of schooling, family background evidently still matters greatly for pupils' performance. Obviously, this can only be seen as a capital failure for a school system with compensatory aspirations.


Year after year we can note that, contrary to all the promises of progressive pedagogy, it is mainly children from homes without a tradition of study who have lost out in the change in how schooling has been viewed over the past half-century. Today, with school vouchers, free school choice and independent schools, the development has, contrary to all compensatory promises, only further strengthened highly educated parents' ability to steer their own children's schooling and future. It is hard to see who, in today's school system, will be able to make the kind of 'class journey' that so many of my generation made.

Higher teacher salaries are not a sufficient condition for once again giving Sweden a world-class school system. But they are a necessary one, not least in order to attract the really talented young to a teaching career. Extensive education research has convincingly shown that municipal control is one of the most important reasons behind the decline of teachers' salaries and of Swedish schools over recent decades.

The political parties must drop their ideological blinkers and realize that the odd sacred cow will have to be slaughtered if we are to put Swedish schools right. When the facts about schools kick, you have to change course, even if that should happen to conflict with ideology. When will the political parties jointly dare to take that step? Must we really wait until the next PISA study once again points to the catastrophic downhill slide of Swedish schools?

I have said it before, and I say it again: the municipalization of schools is the biggest flop ever in Swedish education-policy history. But mistakes can be corrected. As the great English economist John Maynard Keynes used to say: "When I'm wrong, I change my mind."

Renationalize Swedish schools!

Ricardian equivalence — hopelessly unrealistic

14 Oct, 2016 at 13:36 | Posted in Economics | Comments Off on Ricardian equivalence — hopelessly unrealistic

According to the Ricardian equivalence hypothesis the public sector basically finances its expenditures through taxes or by issuing bonds, and bonds must sooner or later be repaid by raising taxes in the future.

If the public sector runs extra spending through deficits, taxpayers will, according to the hypothesis, anticipate that they will have to pay higher taxes in the future, and will therefore increase their savings and reduce their current consumption to be able to do so, the consequence being that aggregate demand would be no different from what it would be if taxes were raised today.

Robert Barro attempted to give the proposition a firm theoretical foundation in the 1970s.

So let us get the facts straight from the horse’s mouth.

Describing the Ricardian Equivalence in 1989 Barro writes (emphasis added):

Suppose now that households’ demands for goods depend on the expected present value of taxes—that is, each household subtracts its share of this present value from the expected present value of income to determine a net wealth position. Then fiscal policy would affect aggregate consumer demand only if it altered the expected present value of taxes. But the preceding argument was that the present value of taxes would not change as long as the present value of spending did not change. Therefore, the substitution of a budget deficit for current taxes (or any other rearrangement of the timing of taxes) has no impact on the aggregate demand for goods. In this sense, budget deficits and taxation have equivalent effects on the economy — hence the term, “Ricardian equivalence theorem.” To put the equivalence result another way, a decrease in the government’s saving (that is, a current budget deficit) leads to an offsetting increase in desired private saving, and hence to no change in desired national saving.

Since desired national saving does not change, the real interest rate does not have to rise in a closed economy to maintain balance between desired national saving and investment demand. Hence, there is no effect on investment, and no burden of the public debt …

Ricardian equivalence basically means that financing government expenditures through taxes or debt is equivalent, since debt financing must be repaid with interest, and agents — equipped with rational expectations — would only increase savings in order to be able to pay the higher taxes in the future, thus leaving total expenditures unchanged.
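The mechanics of the claim can be seen in a stripped-down two-period consumer problem (a textbook sketch under the hypothesis's own assumptions: log utility, a discount factor of 1/(1+r), and free borrowing and lending at the government's rate; it is not Barro's full intergenerational model). Shifting taxes between periods while holding their present value fixed leaves the consumption path unchanged:

```python
def consumption_path(y1, y2, t1, t2, r):
    """Optimal consumption for a two-period consumer with log utility,
    discount factor beta = 1/(1+r), and free borrowing/lending at r.
    Only the present value of after-tax income matters."""
    beta = 1 / (1 + r)
    wealth = (y1 - t1) + (y2 - t2) / (1 + r)
    c1 = wealth / (1 + beta)
    c2 = beta * (1 + r) * wealth / (1 + beta)
    return c1, c2

r = 0.05
# Balanced budget: tax 100 today.
balanced = consumption_path(1000, 1000, 100, 0, r)
# Deficit today, repaid with interest by taxing 100*(1+r) tomorrow.
deficit = consumption_path(1000, 1000, 0, 100 * (1 + r), r)
print(balanced, deficit)  # identical consumption paths
```

Buiter's critique quoted next is precisely about why the assumptions baked into this sketch (one borrowing rate for everyone, effectively infinite horizons, lump-sum taxes) fail in practice.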

To most people this probably sounds like nothing but witless gibberish, and Willem Buiter was, indeed, in no gracious mood when he commented on it a couple of years ago:

Barro (1974) has shown that, given perfect foresight, debt neutrality will obtain when three conditions are met: (a) private agents can lend and borrow on the same terms as the government, (b) private agents are able and willing to undo any government scheme to redistribute spending power between generations, and (c) all taxes and transfer payments are lump sum, by which we mean that their basis of assessment is independent of private agents’ decisions about production, labour supply, consumption, or asset accumulation. Under these extreme assumptions, any change in government financing (government saving or dissaving) is offset one-for-one by a corresponding change in private saving itself financed by the accompanying tax changes.

All three assumptions are of course hopelessly unrealistic. Condition (a) fails because credit rationing, liquidity constraints, large spreads between lending and borrowing rates of interest, and private borrowing rates well in excess of those enjoyed by the government are an established fact in most industrial countries. These empirical findings are underpinned by the new and burgeoning theoretical literature on asymmetric information and the implications of moral hazard and adverse selection for private financial markets; and by game-theoretic insights of how active competition in financial markets can yield credit rationing as the equilibrium outcome.

Condition (b) fails because it requires either that agents must live for ever or else effectively do so through the account they take of their children and parents in making gifts and bequests. In reality, private decision horizons are finite and frequently quite short …

Condition (c) fails because in practice taxes and subsidies are rarely lump sum …

I conclude that the possible neutrality of public debt and deficits is little more than a theoretical curiosum.

Willem Buiter

It is difficult not to agree with that verdict.

And how about the empirics? Let’s have a look:

In a series of recent papers … I and co-authors measure the impact of the receipt of an economic stimulus payment in the US in 2008 on a household’s spending by comparing the spending patterns of households that receive their payments at different times. The randomisation implies that the spending of a household when it receives a payment relative to the spending of a household that does not receive a payment at the same time reveals the additional spending caused by the receipt of the stimulus payment.

So how do consumers respond to cash flow from stimulus in recessions?

First, we find that the arrival of a payment causes a large spike in spending the week that the payment arrives: 10% of spending on household goods in response to payments averaging $900 in the US in 2008 (Broda and Parker 2014).

Second, this effect decays over time, but remains present, so that cumulative effects are economically quite large – in the order of 2 to 4% of spending on household goods over the three months following arrival.

On broader measures of spending, Parker et al. (2013) find that households spend 25% of payments during the three months in which they arrive on a broad measure of nondurable goods, and roughly three-quarters of the payment in total. Interestingly, the difference between the two measures largely reflects spending on new cars.

Finally, the majority of spending is done by households with insufficient liquid assets to cover two months of expenditures (about 40% of households). These households spend at a rate six times that of households with sufficient liquid wealth.

Jonathan Parker

As one Nobel Prize laureate had it:

Ricardian equivalence is taught in every graduate school in the country. It is also sheer nonsense.

Joseph E. Stiglitz, Twitter

Things sure have changed …

13 Oct, 2016 at 18:12 | Posted in Varia | 1 Comment

 

Sherlock Holmes of the year — ‘Nobel prize’ winner Bengt Holmström

13 Oct, 2016 at 16:15 | Posted in Economics | 1 Comment

Oliver Hart and Bengt Holmström won this year’s ‘Nobel Prize’ in economics for work on applying contract theory to questions ranging from how best to reward executives to whether we should have privately owned schools and prisons or not.

Their work has according to the prize committee been “incredibly important, not just for economics, but also for other social sciences.”

Asked at a news conference about the high levels of executive pay, Holmström said,

It is somehow demand and supply working its magic.

Wooh! Who would have thought anything like that.

What we see happening in the US, the UK, Sweden and elsewhere is deeply disturbing. The rising inequality is outrageous – not least because it has to a large extent to do with income and wealth increasingly being concentrated in the hands of a very small and privileged elite.

Societies where we allow the inequality of incomes and wealth to increase without bounds sooner or later implode. The cement that keeps us together erodes, and in the end we are left with nothing but people dipped in the ice-cold water of egoism and greed.

And all this ‘Nobel prize’ laureate manages to come up with is demand and supply ‘magic.’

Impressive indeed …

Loanable funds theory is inconsistent with data

10 Oct, 2016 at 18:14 | Posted in Economics | 1 Comment

Loanable funds doctrine dates back to the early nineteenth century and was forcefully restated by the Swedish economist Knut Wicksell around the turn of the twentieth (with implications for inflation not pursued here). It was repudiated in 1936 by John Maynard Keynes in his General Theory. Before that he was merely a leading post-Wicksellian rather than the greatest economist of his and later times.

Like Keynes, Wicksell recognized that saving and investment have different determining factors, and also thought that households provide most saving. Unlike Keynes, he argued that “the” interest rate as opposed to the level of output can adjust to assure macro balance. If potential investment falls short of saving, then the rate will, maybe with some help from inflation and the central bank, decrease. Households will save less and firms seek to invest more. The supply of loanable funds will go down and demand up, until the two flows equalize with the interest rate at its “natural” level …

Wicksell and Keynes planted red herrings for future economists by concentrating on household saving and business investment. They did not observe that trade and government injections and leakages play bigger roles in determining effective demand. Keynes made a major contribution by switching the emphasis from interest rate adjustments to changes in income as the key macroeconomic adjustment mechanism. In so doing, he argued that the interest rate and asset prices must be set in markets for stocks, not flows, of financial claims …

Today’s New “Keynesians” have tremendous intellectual firepower. The puzzle is why they revert to Wicksell on loanable funds and the natural rate while ignoring Keynes’s innovations. Maybe, as he said in the preface to the General Theory, “The difficulty lies, not in the new ideas, but in escaping from the old ones …”

Lance Taylor

De Niro says it all!

9 Oct, 2016 at 23:05 | Posted in Politics & Society | 2 Comments

 

That a country that has given us presidents like George Washington, Thomas Jefferson, Abraham Lincoln, and Franklin D. Roosevelt, should even have to consider the possibility of being run by a witless clown like Donald Trump is an absolute disgrace.

Paul Romer — favourite candidate for ‘Nobel prize’ 2016

9 Oct, 2016 at 15:51 | Posted in Economics | 5 Comments

Among Swedish economists, Paul Romer is the favourite candidate for receiving the ‘Nobel Prize’ in economics 2016. Let’s hope the prediction turns out right this time.

The ‘Nobel prize’ in economics has almost exclusively gone to mainstream economists, and most often to Chicago economists. So how refreshing it would be if for once we had a winner who has been brave enough to openly criticize the ‘post-real’ things that emanate from the Chicago ivory tower!

Adam Smith once wrote that a really good explanation is “practically seamless.”

Is there any such theory within one of the most important fields of social science – economic growth?

Paul Romer’s theory, presented in Endogenous Technological Change (1990) – where knowledge is made the most important driving force of growth – is probably as close as we get.

Knowledge – or ideas – are according to Romer the locomotive of growth. But as Allyn Young, Piero Sraffa and others had already shown in the 1920s, knowledge is also something that has to do with increasing returns to scale, and is therefore not really compatible with neoclassical economics and its emphasis on decreasing returns to scale.

Increasing returns generated by nonrivalry between ideas is simply not compatible with pure competition and the simplistic invisible hand dogma. That is probably also the reason why neoclassical economists have been so reluctant to embrace the theory whole-heartedly.

Neoclassical economics has tried to save itself by more or less substituting human capital for knowledge/ideas. But Romer’s pathbreaking ideas should not be confused with human capital. Although some have problems with the distinction between ideas and human capital in modern endogenous growth theory, this passage from Romer’s article The New Kaldor Facts: Ideas, Institutions, Population, and Human Capital gives a succinct and accessible account of the difference:

Of the three state variables that we endogenize, ideas have been the hardest to bring into the applied general equilibrium structure. The difficulty arises because of the defining characteristic of an idea, that it is a pure nonrival good. A given idea is not scarce in the same way that land or capital or other objects are scarce; instead, an idea can be used by any number of people simultaneously without congestion or depletion.

Because they are nonrival goods, ideas force two distinct changes in our thinking about growth, changes that are sometimes conflated but are logically distinct. Ideas introduce scale effects. They also change the feasible and optimal economic institutions. The institutional implications have attracted more attention but the scale effects are more important for understanding the big sweep of human history.

The distinction between rival and nonrival goods is easy to blur at the aggregate level but inescapable in any microeconomic setting. Picture, for example, a house that is under construction. The land on which it sits, capital in the form of a measuring tape, and the human capital of the carpenter are all rival goods. They can be used to build this house but not simultaneously any other. Contrast this with the Pythagorean Theorem, which the carpenter uses implicitly by constructing a triangle with sides in the proportions of 3, 4 and 5. This idea is nonrival. Every carpenter in the world can use it at the same time to create a right angle.

Of course, human capital and ideas are tightly linked in production and use. Just as capital produces output and forgone output can be used to produce capital, human capital produces ideas and ideas are used in the educational process to produce human capital. Yet ideas and human capital are fundamentally distinct. At the micro level, human capital in our triangle example literally consists of new connections between neurons in a carpenter’s head, a rival good. The 3-4-5 triangle is the nonrival idea. At the macro level, one cannot state the assertion that skill-biased technical change is increasing the demand for education without distinguishing between ideas and human capital.

Romer’s idea about ideas is well worth a ‘Nobel Prize.’

Crash and learn?

9 Oct, 2016 at 14:07 | Posted in Economics | 22 Comments

The case for changing the way we teach economics is—or should be—obvious …

But as anyone who teaches or studies economics these days knows full well, the mainstream that has long dominated economics … is not even beginning to let go of their almost-total control over the curriculum of undergraduate and graduate programs.


That’s clear from a recent article in the Financial Times, in which David Pilling asks the question, “should we change the way we teach economics?”

Me, I’ve heard the excuses not to change economics for decades now …

Here’s one—the idea that heterodox economics is like creationism, in disputing the “immutable laws” captured by mainstream theory:

Pontus Rendahl teaches macroeconomic theory at Cambridge. He doesn’t disagree that students should be exposed to economic history and to ideas that challenge neoclassical thinking … He is wary, however, of moving to a pluralist curriculum in which different schools of thought are given similar weight.

“Pluralism is a nicely chosen word,” he says. “But it’s the same argument as the creationists in the US who say that natural selection is just a theory.” Since mainstream economics has “immutable laws”, he argues, it would be wrong to teach heterodox theories as though they had equal validity. “In the same way, I don’t think heterodox engineering or alternative medicine should be taught.”

Rendahl also argues that students are too critical of the models they encounter as undergraduates:

“When we start teaching economics, we have to teach the nuts and bolts.” He introduces first-year students to the Robinson Crusoe model, in which there is only one “representative agent”. Later on, Friday is brought on the scene so the two can start trading, although no money changes hands since transactions are solely by barter …

The assumptions built into each and every one of these defenses of mainstream economics and attacks on heterodox economic theories as well as any hint of pluralism in the teaching of economics are, at best, outdated—the leftovers from positivism and other forms of post-Enlightenment scientism. They comprise the “spontaneous philosophy” of mainstream economists who have exercised hegemony in the practice and teaching of economics throughout the postwar period.

And, yes, Pilling is right, when that hegemony is challenged, as it has been by economics students and many economists in recent years, “the clash of ideas gets nasty.”

David Ruccio

Once Cambridge was known for its famous economists. People like John Maynard Keynes. Nowadays it’s rather infamous for its economists inhabiting a neoclassical world of delusion. People like Pontus Rendahl.

Rethinking philosophy of economics

8 Oct, 2016 at 11:13 | Posted in Theory of Science & Methodology | Comments Off on Rethinking philosophy of economics


In this book Gustavo Marqués, one of our discipline’s most dexterous and acute minds, calmly investigates in depth economics’ most persistent methodological enigmas. Chapter Three alone is sufficient reason for owning this book.
Edward Fullbrook, University of the West of England

Is ‘mainstream philosophy of economics’ only about models and imaginary worlds created to represent economic theories? Marqués questions this epistemic focus and calls for the ontological examination of real world economic processes. This book is a serious challenge to standard thinking and an alternative program for a pluralist philosophy of economics.
John Davis, Marquette University

Exposing the ungrounded pretensions of the mainstream philosophy of economics, Marqués’ carefully argued book is a major contribution to the ongoing debate on contemporary mainstream economics and its methodological and philosophical underpinnings. Even those who disagree with his conclusions will benefit from his thorough and deep critique of the modeling strategies used in modern economics.
Lars P Syll, Malmö University

Fact and fiction in economics

8 Oct, 2016 at 10:31 | Posted in Economics | 3 Comments

The idea that we can safely neglect the aggregate demand function is fundamental to [classical] economics …

But although the doctrine itself has remained unquestioned by orthodox economists up to a late date, its signal failure for purposes of scientific prediction has greatly impaired, in the course of time, the prestige of its practitioners. For professional economists, after Malthus, were apparently unmoved by the lack of correspondence between the results of their theory and the facts of observation;—a discrepancy which the ordinary man has not failed to observe, with the result of his growing unwillingness to accord to economists that measure of respect which he gives to other groups of scientists whose theoretical results are confirmed by observation when they are applied to the facts …

It may well be that the classical theory represents the way in which we should like our economy to behave. But to assume that it actually does so is to assume our difficulties away.

The main problem with mainstream economics

7 Oct, 2016 at 17:38 | Posted in Theory of Science & Methodology | Comments Off on The main problem with mainstream economics

Many economists have over time tried to diagnose the problem behind the ‘intellectual poverty’ that characterizes modern mainstream economics. Rationality postulates, rational expectations, market fundamentalism, general equilibrium, atomism and over-mathematisation are some of the things critics have pointed at. But although these assumptions/axioms/practices are deeply problematic, they are mainly reflections of a deeper and more fundamental problem.

The main problem with mainstream economics is its methodology.

The fixation on constructing models showing the certainty of logical entailment has been detrimental to the development of a relevant and realist economics. Insisting on formalistic (mathematical) modeling forces the economist to give up on realism and substitute axiomatics for real-world relevance. The price for rigour and precision is far too high for anyone who is ultimately interested in using economics to pose and (hopefully) answer real-world questions and problems.

This deductivist orientation is the main reason behind the difficulty that mainstream economics has in terms of understanding, explaining and predicting what takes place in our societies. But it has also given mainstream economics much of its discursive power – at least as long as no one starts asking tough questions on the veracity of – and justification for – the assumptions on which the deductivist foundation is erected. Asking these questions is an important ingredient in a sustained critical effort to show how nonsensical the embellishing of a smorgasbord of models founded on wanting (often hidden) methodological foundations really is.

The mathematical-deductivist straitjacket used in mainstream economics presupposes atomistic closed systems – i.e., something that we find very little of in the real world, a world significantly at odds with an (implicitly) assumed logic world where deductive entailment rules the roost. Ultimately, then, the failings of modern mainstream economics have their root in a deficient ontology. The kind of formal-analytical and axiomatic-deductive mathematical modeling that makes up the core of mainstream economics is hard to make compatible with a real-world ontology. It is also the reason why so many critics find mainstream economic analysis patently and utterly unrealistic and irrelevant.

Although there has been a clearly discernible increase and focus on “empirical” economics in recent decades, the results in these research fields have not fundamentally challenged the main deductivist direction of mainstream economics. They are still mainly framed and interpreted within the core “axiomatic” assumptions of individualism, instrumentalism and equilibrium that make up even the “new” mainstream economics. Although, perhaps, a sign of an increasing – but highly path-dependent – theoretical pluralism, mainstream economics is still, from a methodological point of view, mainly a deductive project erected on a foundation of empty formalism.

If we want theories and models to confront reality there are obvious limits to what can be said “rigorously” in economics. For although it is generally a good aspiration to search for scientific claims that are both rigorous and precise, we have to accept that the chosen level of precision and rigour must be relative to the subject matter studied. An economics that is relevant to the world in which we live can never achieve the same degree of rigour and precision as logic, mathematics or the natural sciences. Collapsing the gap between model and reality in that way will never give us anything other than empty formalist economics.

In mainstream economics, with its addiction to the deductivist approach of formal-mathematical modeling, model consistency trumps coherence with the real world. That is surely getting the priorities wrong. Creating models for their own sake is not an acceptable scientific aspiration – impressive-looking formal-deductive models should never be mistaken for truth.

For many people, deductive reasoning is the mark of science: induction – in which the argument is derived from the subject matter – is the characteristic method of history or literary criticism. But this is an artificial, exaggerated distinction. Scientific progress … is frequently the result of observation that something does work, which runs far ahead of any understanding of why it works.

Not within the economics profession. There, deductive reasoning based on logical inference from a specific set of a priori deductions is "exactly the right way to do things". What is absurd is not the use of the deductive method but the claim to exclusivity made for it. This debate is not simply about mathematics versus poetry. Deductive reasoning necessarily draws on mathematics and formal logic: inductive reasoning, based on experience and above all careful observation, will often make use of statistics and mathematics …

The belief that models are not just useful tools but are capable of yielding comprehensive and universal descriptions of the world blinded proponents to realities that had been staring them in the face. That blindness made a big contribution to our present crisis, and conditions our confused responses to it.

John Kay

It is still a fact that within mainstream economics internal validity is everything and external validity nothing. Why anyone should be interested in those kinds of theories and models is beyond my imagination. As long as mainstream economists do not come up with any export licenses for their theories and models to the real world in which we live, they really should not be surprised if people say that this is not science, but autism!

Studying mathematics and logic is interesting and fun. It sharpens the mind. In pure mathematics and logic we do not have to worry about external validity. But economics is not pure mathematics or logic. It's about society. The real world. Forgetting that, economics is really in dire straits.


When applying deductivist thinking to economics, economists usually set up "as if" models based on a set of tight axiomatic assumptions from which consistent and precise inferences are made. The beauty of this procedure is of course that if the axiomatic premises are true, the conclusions necessarily follow. The snag is that if the models are to be relevant, we also have to argue that their precision and rigour still hold when they are applied to real-world situations. They often don't. When addressing real economies, the idealizations necessary for the deductivist machinery to work simply don't hold.

So how should we evaluate the search for ever greater precision and the concomitant arsenal of mathematical and formalist models? To a large extent, the answer hinges on what we want our models to do and how we basically understand the world.

For Keynes the world in which we live is inherently uncertain and quantifiable probabilities are the exception rather than the rule. To every statement about it is attached a “weight of argument” that makes it impossible to reduce our beliefs and expectations to a one-dimensional stochastic probability distribution. If “God does not play dice” as Einstein maintained, Keynes would add “nor do people”. The world as we know it, has limited scope for certainty and perfect knowledge. Its intrinsic and almost unlimited complexity and the interrelatedness of its organic parts prevent the possibility of treating it as constituted by “legal atoms” with discretely distinct, separable and stable causal relations. Our knowledge accordingly has to be of a rather fallible kind.

To search for precision and rigour in such a world is self-defeating, at least if precision and rigour are supposed to assure external validity. The only way to defend such an endeavour is to turn a blind eye to ontology and restrict oneself to proving things in closed model-worlds. Why we should care about these, rather than asking questions of relevance, is hard to see. We have to at least justify our disregard for the gap between the nature of the real world and our theories and models of it.

Keynes once wrote that economics “is a science of thinking in terms of models joined to the art of choosing models which are relevant to the contemporary world.” Now, if the real world is fuzzy, vague and indeterminate, then why should our models build upon a desire to describe it as precise and predictable? Even if there always has to be a trade-off between theory-internal validity and external validity, we have to ask ourselves if our models are relevant.

Models preferably ought to somehow reflect/express/correspond to reality. I'm not saying that the answers are self-evident, but at least you have to do some philosophical under-labouring to rest your case. Too often that is wanting in modern economics, just as it was when Keynes in the 1930s complained about Tinbergen's and other econometricians' lack of justification for their chosen models and methods.

“Human logic” has to supplant the classical, formal, logic of deductivism if we want to have anything of interest to say of the real world we inhabit. Logic is a marvellous tool in mathematics and axiomatic-deductivist systems, but a poor guide for action in real-world systems, in which concepts and entities are without clear boundaries and continually interact and overlap. In this world I would say we are better served with a methodology that takes into account that “the more we know the more we know we don’t know”.

The models and methods we choose to work with have to be in conjunction with the economy as it is situated and structured. Epistemology has to be founded on ontology. Deductivist closed-system theories, such as all the varieties of the Walrasian general equilibrium kind, could perhaps adequately represent an economy showing closed-system characteristics. But since the economy clearly has more in common with an open-system ontology, we ought to look out for other theories – theories that are rigorous and precise in the sense that they can be deployed to enable us to detect important causal mechanisms, capacities and tendencies pertaining to deep layers of the real world.

Rigour, coherence and consistency have to be defined relative to the entities to which they are supposed to apply. Too often they have been restricted to questions internal to the theory or model. But clearly the nodal point has to concern external questions, such as how our theories and models relate to real-world structures and relations. Applicability rather than internal validity ought to be the arbiter of taste.

So – if we want to develop a new and better economics we have to give up on the deductivist straitjacket methodology. To focus scientific endeavours on proving things in models is a gross misapprehension of what an economic theory ought to be about. Deductivist models and methods disconnected from reality are not relevant for predicting, explaining or understanding real-world economies.


If economics is going to be useful, it has to change its methodology. Economists have to get out of their deductivist theoretical ivory towers and start asking questions about the real world. A relevant economics science presupposes adopting methods suitable to the object it is supposed to predict, explain or understand.

Economics departments need to install smoke detectors …

6 Oct, 2016 at 19:32 | Posted in Economics | Comments Off on Economics departments need to install smoke detectors …

Balliol Croft, Cambridge
27. ii. 06
My dear Bowley …

I know I had a growing feeling in the later years of my work at the subject that a good mathematical theorem dealing with economic hypotheses was very unlikely to be good economics: and I went more and more on the rules — (1) Use mathematics as a short-hand language, rather than as an engine of inquiry. (2) Keep to them till you have done. (3) Translate into English. (4) Then illustrate by examples that are important in real life. (5) Burn the mathematics. (6) If you can't succeed in 4, burn 3. This last I did often …

Your emptyhandedly,

Alfred Marshall

There ought to be an enormous amount of burning going on at economics departments today. The market for smoke detectors must be peaking …

Olivier Blanchard’s second thoughts

5 Oct, 2016 at 18:01 | Posted in Economics | 4 Comments

Olivier Blanchard has had some Further Thoughts on DSGE Models and now comes up with the view that

Macroeconomics is about general equilibrium …

The specific role of DSGEs in the panoply of general equilibrium models is to provide a basic macroeconomic Meccano set, i.e. a formal, analytical platform for discussion and integration of new elements …

The only way in which DSGEs can play this role is if they are built on explicit micro foundations.

Well, if Blanchard were right on this, then economics really is in serious trouble.

Rather than being necessary requisites for progress, general equilibrium and microfoundations are the main barriers to progress in macroeconomics!

Almost a century and a half after Léon Walras founded general equilibrium theory, economists still have not been able to show that markets lead economies to equilibria. We do know that — under very restrictive assumptions — equilibria do exist, are unique and are Pareto-efficient. But one has to ask oneself — what good does that do?

As long as we cannot show that there are convincing reasons to suppose there are forces which lead economies to equilibria — the value of general equilibrium theory is nil. As long as we cannot really demonstrate that there are forces operating — under reasonable, relevant and at least mildly realistic conditions — to move markets towards equilibria, there cannot be any sustainable reason for anyone to take an interest in or pay attention to this theory.
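The convergence worry is easy to make concrete in a toy simulation. The sketch below is a stylized illustration only — the excess-demand field and every parameter are invented for the example and not derived from any actual preference structure. It runs Walrasian tâtonnement (raise the price of a good in excess demand, lower it otherwise) on a two-good economy whose equilibrium exists and is unique, yet prices spiral away from it instead of converging:

```python
import numpy as np

# Hypothetical excess-demand field z(p) = A (p - p*): equilibrium at p*,
# but the Jacobian A has eigenvalues 0.05 ± i (positive real part),
# so the tatonnement dynamics dp/dt = z(p) spiral *away* from p*.
A = np.array([[0.05, -1.0],
              [1.0,  0.05]])
p_star = np.array([1.0, 1.0])          # the unique equilibrium price vector

def excess_demand(p):
    return A @ (p - p_star)

# Discretized tatonnement: p <- p + h * z(p)
p = p_star + np.array([0.1, 0.0])      # start close to equilibrium
h = 0.01
path = [np.linalg.norm(p - p_star)]
for _ in range(2000):
    p = p + h * excess_demand(p)
    path.append(np.linalg.norm(p - p_star))

print(f"initial distance from p*: {path[0]:.3f}")
print(f"final distance from p*:   {path[-1]:.3f}")   # grows: no convergence
```

Existence of an equilibrium, in other words, says nothing about whether the adjustment process finds it — which is exactly the gap pointed to here.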

A stability that can only be proved by assuming Santa Claus conditions is of no avail. Most people do not believe in Santa Claus anymore. And for good reasons. Santa Claus is for kids. And real scientists are grown-ups.

Continuing to model a world full of agents behaving as economists — “often wrong, but never uncertain” — and still not being able to show that the system under reasonable assumptions converges to equilibrium (or simply assume the problem away), is a gross misallocation of intellectual resources and time.

And then, of course, there is Sonnenschein-Mantel-Debreu!

Sonnenschein-Mantel-Debreu ultimately explains why "modern neoclassical economics" with its microfounded DSGE macromodels is such a bad substitute for real macroeconomic analysis.

These models try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality. And — worse still — something that is not even amenable to the kind of general equilibrium analysis it is thought to give a foundation for, since Hugo Sonnenschein (1972), Rolf Mantel (1976) and Gérard Debreu (1974) unequivocally showed that no assumptions on individuals would guarantee either stability or uniqueness of the equilibrium solution.

Opting for cloned representative agents that are all identical is of course not a real solution to the fallacy of composition that the Sonnenschein-Mantel-Debreu theorem points to. Representative agent models are rather an evasion whereby issues of distribution, coordination, heterogeneity — everything that really defines macroeconomics — are swept under the rug.
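The aggregation problem behind Sonnenschein-Mantel-Debreu can be glimpsed even in a much simpler setting. The sketch below is a minimal illustration of the fallacy of composition, not the SMD theorem itself — the square-root demand function and the income figures are invented for the example. It shows that a "representative" consumer endowed with the average income does not reproduce the average demand of heterogeneous consumers whenever individual demand is nonlinear in income:

```python
import math

# Hypothetical individual demand, nonlinear (concave) in income m:
def demand(m, p=1.0):
    return math.sqrt(m) / p

incomes = [1.0, 9.0]                      # two heterogeneous consumers

avg_of_demands = sum(demand(m) for m in incomes) / len(incomes)   # (1 + 3)/2 = 2.0
demand_of_avg = demand(sum(incomes) / len(incomes))               # sqrt(5) ≈ 2.236

print(avg_of_demands, demand_of_avg)      # 2.0 vs ≈2.236 — not the same
```

By Jensen's inequality the representative agent here overstates aggregate demand; the income distribution matters, just as the paragraph above says.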

Microfoundations – and a fortiori rational expectations and representative agents – serve a particular theoretical purpose. And as the history of macroeconomics during the last thirty years has shown, this Lakatosian microfoundations programme for macroeconomics is only methodologically consistent within the framework of a (deterministic or stochastic) general equilibrium analysis. In no other context has it been possible to incorporate microfoundations of this kind, with their "forward-looking optimizing individuals," into macroeconomic models.

This is of course not by accident. General equilibrium theory is basically nothing else than an endeavour to consistently generalize the microeconomics of individuals and firms on to the macroeconomic level of aggregates.

But it obviously doesn’t work!

The analogy between microeconomic behaviour and macroeconomic behaviour is misplaced. Empirically, science-theoretically and methodologically, neoclassical microfoundations for macroeconomics are defective. Tenable foundations for macroeconomics really have to be sought for elsewhere.

Instead of basing macroeconomics on unreal and unwarranted generalizations of microeconomic behaviour and relations, it is far better to accept the ontological fact that the future to a large extent is uncertain, and rather conduct macroeconomics on this fact of reality.

The real macroeconomic challenge is to accept uncertainty and still try to explain why economic transactions take place – instead of simply conjuring the problem away by assuming uncertainty to be reducible to stochastic risk. That is scientific cheating. And it has been going on for too long now.

The sooner we are intellectually honest and ready to admit that the microfoundationalist programme has come to way's end – the sooner we can redirect our aspirations and knowledge to more fruitful endeavours.

The main fallacy in this kind of thinking is that the reductionist hypothesis does not by any means imply a "constructionist" one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe … At each stage entirely new laws, concepts, and generalizations are necessary, requiring inspiration and creativity to just as great a degree as in the previous one …

In closing I offer two examples from economics of what I hope to have said. Marx said that quantitative differences become qualitative ones, but a dialogue in Paris in the 1920’s sums it up even more clearly:

FITZGERALD: The rich are different from us.

HEMINGWAY: Yes, they have more money.

P.W. Anderson (Nobel Prize winner in physics 1977)

Economics — a contested space

5 Oct, 2016 at 16:16 | Posted in Economics | Comments Off on Economics — a contested space

Neoliberals try to close down the space of political debate and social possibility by excluding all except neoliberal ideas. The tragedy of the past 40 years is they have been succeeding. In the academy there is a neoclassical monopoly, and in politics Labor and Social Democratic parties have been captured by the Trojan horse of the Third Way, creating a neoliberal political monopoly.

Reversing this state of affairs is a massive challenge. The academy is a club that will refuse to include those who disagree, and politics has been significantly captured by the one percent owing to the importance of money in politics. That is a toxic combination: the academy delegitimizes ideas opposed to neoliberalism, while the neoliberal political monopoly blocks alternative ideas getting on to the political table …

I am a great fan of the student movement for change in economics. Their case is right. However, I fear the club of academic economists will either belittle the students, ignore them, or deceptively disarm them by appointing milquetoast critical economists who produce “gattopardo” change (i.e. change that keeps things the same).

Tom Palley

My favourite girls (personal)

5 Oct, 2016 at 12:57 | Posted in Economics | Comments Off on My favourite girls (personal)


Hedda (3), Linnea (17), and Tora (23)

Why economists are useless at forecasting

4 Oct, 2016 at 10:08 | Posted in Economics | 4 Comments

We forget – or willfully ignore – that our models are simplifications of the world …

One of the pervasive risks that we face in the information age … is that even if the amount of knowledge in the world is increasing, the gap between what we know and what we think we know may be widening. This syndrome is often associated with very precise-seeming predictions that are not at all accurate … This is like claiming you are a good shot because your bullets always end up in about the same place — even though they are nowhere near the target …

Financial crises – and most other failures of prediction – stem from this false sense of confidence. Precise forecasts masquerade as accurate ones, and some of us get fooled and double-down our bets …
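Silver's bullet analogy is easy to make concrete. In this sketch (synthetic data; all parameters invented for the example) one "forecaster" is precise but biased — tightly clustered shots far from the target — while another is unbiased but noisy. Precision alone tells you nothing about accuracy:

```python
import numpy as np

rng = np.random.default_rng(0)
target = 0.0
n = 10_000

precise_but_biased = target + 5.0 + rng.normal(0.0, 0.5, n)   # tight cluster, wrong place
noisy_but_unbiased = target + rng.normal(0.0, 2.0, n)          # scattered around the target

for name, shots in [("precise-but-biased", precise_but_biased),
                    ("noisy-but-unbiased", noisy_but_unbiased)]:
    spread = shots.std()                       # "precision": how tight the cluster is
    bias = abs(shots.mean() - target)          # "accuracy": how far from the target
    print(f"{name}: spread={spread:.2f}, bias={bias:.2f}")
```

The precise forecaster has by far the smaller spread and by far the larger error — bullets in about the same place, nowhere near the target.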

Now consider what happened in November 2007. It was just one month before the Great Recession officially began …

Economists in the Survey of Professional Forecasters, a quarterly poll put out by the Federal Reserve Bank of Philadelphia, nevertheless foresaw a recession as relatively unlikely. Instead, they expected the economy to grow at a just slightly below average rate of 2.4 percent in 2008 … This was a very bad forecast: GDP actually shrank by 3.3 percent once the financial crisis hit. What may be worse is that the economists were extremely confident in their prediction. They assigned only a 3 percent chance to the economy's shrinking by any margin over the whole of 2008 …

Indeed, economists have for a long time been much too confident in their ability to predict the direction of the economy … Their predictions have not just been overconfident but also quite poor in a real-world sense … Economic forecasters get more feedback than people in most other professions, but they haven't chosen to correct for their bias toward overconfidence.
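That 3-percent episode is an instance of mis-calibration, which is straightforward to check in simulation. In the hedged sketch below (all distributions and parameters invented for the example), forecasters issue nominal 90% intervals that are too narrow because they underestimate the true volatility, and the empirical coverage comes out far below 90%:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

true_sigma = 2.0          # actual volatility of outcomes
believed_sigma = 1.0      # forecasters' (overconfident) volatility estimate

outcomes = rng.normal(0.0, true_sigma, n)
half_width = 1.645 * believed_sigma   # nominal 90% interval around the point forecast

# Fraction of outcomes that actually fall inside the stated intervals:
coverage = np.mean(np.abs(outcomes) <= half_width)
print(f"nominal coverage: 90%, empirical coverage: {coverage:.0%}")  # roughly 59%
```

The intervals look precise; they are just wrong far more often than advertised — precise forecasts masquerading as accurate ones.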

The wisdom of crowds

3 Oct, 2016 at 20:50 | Posted in Theory of Science & Methodology | 1 Comment
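The post's title can be given a one-screen illustration. In the sketch below (synthetic data; the true value, noise level and crowd size are invented for the example), many independent, unbiased but individually noisy guesses of a quantity are averaged; by the law of large numbers the crowd mean lands far closer to the truth than a typical individual does:

```python
import numpy as np

rng = np.random.default_rng(7)
truth = 100.0
n_guessers = 1_000

# Each guesser is unbiased but individually quite noisy (sd = 20, invented).
guesses = truth + rng.normal(0.0, 20.0, n_guessers)

typical_individual_error = np.mean(np.abs(guesses - truth))   # ≈ 16 (≈ 20·√(2/π))
crowd_error = abs(guesses.mean() - truth)                     # shrinks like 20/√1000

print(f"typical individual error: {typical_individual_error:.1f}")
print(f"crowd-average error:      {crowd_error:.1f}")
```

The averaging trick only works, of course, when the errors are independent and unbiased — assumptions that are themselves rarely guaranteed in the wild.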

 

‘Rational expectations’ is wrong

3 Oct, 2016 at 17:53 | Posted in Economics | 2 Comments

Lynn Parramore: It seems obvious that both fundamentals and psychology matter. Why haven’t economists developed an approach to modeling stock-price movements that incorporates both?

Roman Frydman: It took a while to realize that the reason is relatively straightforward. Economists have relied on models that assume away unforeseeable change. As different as they are, rational expectations and behavioral-finance models represent the market with what mathematicians call a probability distribution – a rule that specifies in advance the chances of absolutely everything that will ever happen.

In a world in which nothing unforeseen ever happened, rational individuals could compute precisely whatever they had to know about the future to make profit-maximizing decisions. Presuming that they do not fully rely on such computations and resort to psychology would mean that they forego profit opportunities.

LP: So this is why I often hear that supporters of the Rational Expectations Hypothesis imagine people as autonomous agents that mechanically make decisions in order to maximize profits?

RF: Yes! What has been misunderstood is that this purely computational notion of economic rationality is an artifact of assuming away unforeseeable change.

Imagine that I have a probabilistic model for stock prices and dividends, and I hypothesize that my model shows how prices and dividends actually unfold. Now I have to suppose that rational people will have exactly the same interpretation as I do — after all, I’m right and I have accounted for all possibilities … This is essentially the idea underpinning the Rational Expectations Hypothesis …

LP: So the only truth is the non-existence of the one true model?

RF: It’s the genuine openness that makes our ideas – and education – more exciting. Students can think about things in an open, yet structured way. We don’t lose the structure; we just renounce the pretense of exact knowledge …

Economists may fear that acknowledging this limit would make economic analysis unscientific. But that fear is rooted in a misconception of what the social scientific enterprise should be. Scientific knowledge generates empirically relevant regularities that are likely to be durable. In economics, that knowledge can only be qualitative, and grasping this insight is essential to its scientific status.  Until now, we have been wasting time looking for a model that would tell us exactly how the market works.

LP: Chasing the Holy Grail?

RF: Yes. It’s an illusion. We’ve trained generation after generation in this fruitless task, and it leads to extreme thinking.

Huffington Post

Roman Frydman is Professor of Economics at New York University and a long-time critic of the rational expectations hypothesis. In his seminal 1982 American Economic Review article Towards an Understanding of Market Processes: Individual Expectations, Learning, and Convergence to Rational Expectations Equilibrium — an absolute must-read for anyone with a serious interest in understanding what the issues are in the present discussion on rational expectations as a modeling assumption — he showed that models founded on the rational expectations hypothesis are inadequate as representations of economic agents' decision making.

Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies and institutions are those based on rational expectations and representative actors. As yours truly has tried to show in On the use and misuse of theories and models in economics there is really no support for this conviction at all. On the contrary. If we want to have anything of interest to say on real economies, financial crisis and the decisions and choices real people make, it is high time to place macroeconomic models building on representative actors and rational expectations-microfoundations in the dustbin of pseudo-science.

For if this microfounded macroeconomics has nothing to say about the real world and the economic problems out there, why should we care about it? The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging de facto is made, macroeconomic model building is little more than hand waving that gives us little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

The real macroeconomic challenge is to accept uncertainty and still try to explain why economic transactions take place – instead of simply conjuring the problem away by assuming rational expectations and treating uncertainty as if it was possible to reduce it to stochastic risk. That is scientific cheating. And it has been going on for too long now.

Cassidy: What about the rational-expectations hypothesis, the other big theory associated with modern Chicago? How does that stack up now?

Heckman: I could tell you a story about my friend and colleague Milton Friedman. In the nineteen-seventies, we were sitting in the Ph.D. oral examination of a Chicago economist who has gone on to make his mark in the world. His thesis was on rational expectations. After he’d left, Friedman turned to me and said, “Look, I think it is a good idea, but these guys have taken it way too far.”

It became a kind of tautology that had enormously powerful policy implications, in theory. But the fact is, it didn't have any empirical content. When Tom Sargent, Lars Hansen, and others tried to test it using cross-equation restrictions, and so on, the data rejected the theories. There was a certain section of people that really got carried away. It became quite stifling.

Cassidy: What about Robert Lucas? He came up with a lot of these theories. Does he bear responsibility?

Heckman: Well, Lucas is a very subtle person, and he is mainly concerned with theory. He doesn’t make a lot of empirical statements. I don’t think Bob got carried away, but some of his disciples did. It often happens. The further down the food chain you go, the more the zealots take over.

John Cassidy/The New Yorker

