Econometrics and the axiom of correct specification

20 Oct, 2016 at 17:22 | Posted in Statistics & Econometrics | 4 Comments

Most work in econometrics and regression analysis is — still — based on the assumption that the researcher has a theoretical model that is ‘true.’ Having settled on this belief in a correctly specified econometric model or regression, one proceeds as if the only remaining problems have to do with measurement and observation.

When things sound too good to be true, they usually aren’t. And that goes for econometric wet dreams too. The snag is, of course, that there is precious little to support the perfect-specification assumption. Looking around in social science and economics we don’t find a single regression or econometric model that lives up to the standards set by the ‘true’ theoretical model — and there is precious little that gives us reason to believe things will be different in the future.

To think that we are able to construct a model in which all relevant variables are included and the functional relationships between them are correctly specified is not only a belief without support, but a belief impossible to support.

The theories we work with when building our econometric regression models are insufficient. No matter what we study, there are always some variables missing, and we don’t know the correct way to functionally specify the relationships between the variables.

Every regression model constructed is misspecified. There is always an endless list of possible variables to include and endless possible ways to specify the relationships between them. So every applied econometrician comes up with his own specification and ‘parameter’ estimates. The econometric Holy Grail of consistent and stable parameter values is nothing but a dream.

In order to draw inferences from data as described by econometric texts, it is necessary to make whimsical assumptions. The professional audience consequently and properly withholds belief until an inference is shown to be adequately insensitive to the choice of assumptions. The haphazard way we individually and collectively study the fragility of inferences leaves most of us unconvinced that any inference is believable. If we are to make effective use of our scarce data resource, it is therefore important that we study fragility in a much more systematic way. If it turns out that almost all inferences from economic data are fragile, I suppose we shall have to revert to our old methods …

Ed Leamer

A rigorous application of econometric methods in economics really presupposes that the phenomena of our real-world economies are ruled by stable causal relations between variables. Parameter values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption, however, one has to convincingly establish that the targeted acting causes are stable and invariant, so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.

That models should correspond to reality is, after all, a useful but not totally straightforward idea – with some history to it. Developing appropriate models is a serious problem in statistics; testing the connection to the phenomena is even more serious …

In our days, serious arguments have been made from data. Beautiful, delicate theorems have been proved, although the connection with data analysis often remains to be established. And an enormous amount of fiction has been produced, masquerading as rigorous science.

The theoretical conditions that have to be fulfilled for regression analysis and econometrics to really work are nowhere near being met in reality. Making outlandish statistical assumptions does not provide a solid ground for doing relevant social science and economics. Although regression analysis and econometrics have become the most used quantitative methods in social sciences and economics today, it is still a fact that the inferences made from them are invalid.

Regression models have some serious weaknesses. Their ease of estimation tends to suppress attention to features of the data that matching techniques force researchers to consider, such as the potential heterogeneity of the causal effect and the alternative distributions of covariates across those exposed to different levels of the cause. Moreover, the traditional exogeneity assumption of regression … often befuddles applied researchers … As a result, regression practitioners can too easily accept their hope that the specification of plausible control variables generates as-if randomized experiment.

Econometrics — and regression analysis — is basically a deductive method. Given the assumptions (such as manipulability, transitivity, separability, additivity, linearity, etc.) it delivers deductive inferences. The problem, of course, is that we will never completely know when the assumptions are right. Conclusions can only be as certain as their premises — and that also applies to econometrics and regression analysis.
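
To see how much the choice of specification alone can move ‘parameter’ estimates around, here is a minimal sketch on simulated data (my own toy illustration, not an empirical result): the estimated coefficient on one and the same regressor changes substantially depending on whether a correlated variable is included or left out.

```python
# Toy illustration (made-up data): how the estimated coefficient on x shifts
# with the chosen specification when a correlated variable z is omitted.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
z = rng.normal(size=n)                       # variable correlated with x
x = 0.8 * z + rng.normal(size=n)             # regressor of interest
y = 1.0 * x + 2.0 * z + rng.normal(size=n)   # 'true' process includes z

def ols_coefs(y, X):
    """OLS coefficients of y on the columns of X (intercept added)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

b_short = ols_coefs(y, x)                        # specification omitting z
b_long = ols_coefs(y, np.column_stack([x, z]))   # specification including z

print(f"coefficient on x, z omitted:  {b_short[1]:.2f}")   # roughly 2, badly biased
print(f"coefficient on x, z included: {b_long[1]:.2f}")    # close to the 'true' 1.0
```

The same data and two defensible-looking specifications yield two very different ‘parameter’ estimates, which is exactly the kind of fragility Leamer is talking about.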

DSGE modeling – a statistical critique

19 Oct, 2016 at 16:36 | Posted in Statistics & Econometrics | Comments Off on DSGE modeling – a statistical critique

As Paul Romer’s recent assault on ‘post-real’ macroeconomics showed, yours truly is not the only one who questions the validity and relevance of DSGE modeling. After having read one of my posts on the issue, eminent statistician Aris Spanos kindly sent me a working paper in which he discusses the validity of DSGE models and shows that the calibrated structural models are often at odds with observed data, and that many of the ‘deep parameters’ used are not even identifiable.

Interesting reading. And confirming, once again, that DSGE models do not marry particularly well with real-world data. This should come as no surprise — microfounded general equilibrium modeling with inter-temporally optimizing representative agents seldom does.

This paper brings out several weaknesses of the traditional DSGE modeling, including statistical misspecification, non-identification of deep parameters, substantive inadequacy, weak forecasting performance and potentially misleading policy analysis. It is argued that most of these weaknesses stem from failing to distinguish between statistical and substantive adequacy and secure the former before assessing the latter. The paper untangles the statistical from the substantive premises of inference with a view to delineate the above mentioned problems and suggest solutions. The critical appraisal is based on the Smets and Wouters (2007) DSGE model using US quarterly data. It is shown that this model is statistically misspecified …

Lucas’s (1980) argument: “Any model that is well enough articulated to give clear answers to the questions we put to it will necessarily be artificial, abstract, patently ‘unreal’” (p. 696), is misleading because it blurs the distinction between substantive and statistical adequacy. There is nothing wrong with constructing a simple, abstract and idealised theory model aiming to capture key features of the phenomenon of interest, with a view to shed light on (understand, explain, forecast) economic phenomena of interest, as well as gain insight concerning alternative policies. Unreliability of inference problems arise when the statistical model implicitly specified by the theory model is statistically misspecified, and no attempt is made to reliably assess whether the theory model does, indeed, capture the key features of the phenomenon of interest; see Spanos (2009a). As argued by Hendry (2009):

“This implication is not a tract for mindless modeling of data in the absence of economic analysis, but instead suggests formulating more general initial models that embed the available economic theory as a special case, consistent with our knowledge of the institutional framework, historical record, and the data properties … Applied econometrics cannot be conducted without an economic theoretical framework to guide its endeavours and help interpret its findings. Nevertheless, since economic theory is not complete, correct, and immutable, and never will be, one also cannot justify an insistence on deriving empirical models from theory alone.” (p. 56-7)

Statistical misspecification is not the inevitable result of abstraction and simplification, but it stems from imposing invalid probabilistic assumptions on the data.

Niraj Poudyal & Aris Spanos
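
As a rough illustration of what ‘securing statistical adequacy before assessing substantive adequacy’ can amount to in practice, here is a minimal sketch on made-up data (my own example, not the Poudyal-Spanos procedure itself): before reading anything substantive into the estimates, one tests whether the probabilistic assumptions imposed on the errors (normality, homoskedasticity, no autocorrelation) are consistent with the residuals.

```python
# Illustrative only: checking some probabilistic assumptions behind a simple
# regression before interpreting its estimates. Data are simulated.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import jarque_bera, durbin_watson
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
y = 0.5 + 1.5 * x + rng.normal(size=n)       # made-up data generating process

X = sm.add_constant(x)
res = sm.OLS(y, X).fit()

jb_stat, jb_p, _, _ = jarque_bera(res.resid)          # normality of errors
bp_stat, bp_p, _, _ = het_breuschpagan(res.resid, X)  # homoskedasticity
dw = durbin_watson(res.resid)                         # about 2 if no autocorrelation

print(f"Jarque-Bera p-value:   {jb_p:.3f}")
print(f"Breusch-Pagan p-value: {bp_p:.3f}")
print(f"Durbin-Watson:         {dw:.2f}")
```

If diagnostics like these reject the assumed probability model, the estimated ‘deep parameters’ and any policy conclusions built on them inherit that misspecification.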

Rational choice theory …

19 Oct, 2016 at 08:59 | Posted in Economics | 3 Comments

In economics it is assumed that people make rational choices

Econometric objectivity …

18 Oct, 2016 at 10:16 | Posted in Statistics & Econometrics | 2 Comments


It is clearly the case that experienced modellers could easily come up with significantly different models based on the same set of data, thus undermining claims to researcher-independent objectivity. This has been demonstrated empirically by Magnus and Morgan (1999), who conducted an experiment in which an apprentice had to try to replicate the analysis of a dataset that might have been carried out by three different experts (Leamer, Sims, and Hendry) following their published guidance. In all cases the results were different from each other, and different from that which would have been produced by the expert, thus demonstrating the importance of tacit knowledge in statistical analysis.

Magnus and Morgan conducted a further experiment which involved eight expert teams, from different universities, analysing the same sets of data each using their own particular methodology. The data concerned the demand for food in the US and in the Netherlands and was based on a classic study by Tobin (1950) augmented with more recent data. The teams were asked to estimate the income elasticity of food demand and to forecast per capita food consumption. In terms of elasticities, the lowest estimates were around 0.38 whilst the highest were around 0.74 – clearly vastly different especially when remembering that these were based on the same sets of data. The forecasts were perhaps even more extreme – from a base of around 4000 in 1989 the lowest forecast for the year 2000 was 4130 while the highest was nearly 18000!

John Mingers

Sweden’s growing housing bubble

16 Oct, 2016 at 16:18 | Posted in Economics | 5 Comments

House prices are increasing fast in the EU, and more so in Sweden than in any other member state, as the Eurostat graph below shows. It plots the percentage increase in the annually deflated house price index by member state for 2015:

[Eurostat graph: percentage increase in annually deflated house price index, by member state, 2015]

Sweden’s house price boom started in the mid-1990s, and looking at the development of real house prices during the last three decades, there are reasons to be deeply worried. The indebtedness of the Swedish household sector has also risen to alarmingly high levels:

[Graph: Swedish household debt as a share of GDP]

Yours truly has been trying to argue with ‘very serious people’ that it’s really high time to ‘take away the punch bowl.’ Mostly I have felt like the voice of one calling in the desert.


Where do housing bubbles come from? There are of course many different explanations, but one of the fundamental mechanisms at work is that people expect house prices to increase, which makes them willing to keep on buying houses at steadily increasing prices. It is this kind of self-generating cumulative process à la Wicksell-Myrdal that is the core of the housing bubble. Unlike the usual commodity markets, where demand curves typically slope downwards, demand curves on asset markets often slope upwards and therefore give rise to this kind of instability. And the greater the leverage, the greater the increase in prices.
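
A toy simulation, with entirely made-up numbers and no claim to realism, can illustrate the mechanism: when buyers extrapolate recent appreciation and leverage amplifies the resulting demand push, the price path becomes explosive once the combined feedback is strong enough.

```python
# Toy sketch of a self-reinforcing ('cumulative') price process: buyers
# extrapolate recent appreciation, leverage amplifies the demand push, and
# prices accelerate once feedback * leverage > 1. Parameters are made up.
def bubble_path(p0=100.0, feedback=0.8, leverage=1.5, periods=12):
    prices = [p0, p0 * 1.02]                               # a small initial price rise
    for _ in range(periods):
        expected_gain = prices[-1] / prices[-2] - 1        # extrapolated appreciation
        demand_push = feedback * leverage * expected_gain  # leverage amplifies it
        prices.append(prices[-1] * (1 + demand_push))
    return prices

print([round(p) for p in bubble_path()])   # an accelerating, ever-steeper price path
```

With feedback times leverage below one the process peters out; above one it feeds on itself, which is the point about leverage in the paragraph above.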

What is especially worrying is that although the aggregate net asset position of the Swedish households is still on the solid side, an increasing proportion of those assets is illiquid. When the inevitable drop in house prices hits the banking sector and the rest of the economy, the consequences will be enormous. It hurts when bubbles burst …

Probability and rationality — trickier than you may think

15 Oct, 2016 at 23:05 | Posted in Statistics & Econometrics | 40 Comments

The Coin-tossing Problem

My friend Ben says that on the first day he got the following sequence of Heads and Tails when tossing a coin:
H H H H H H H H H H

And on the second day he says that he got the following sequence:
H T T H H T T H T H

Which day’s report makes you suspicious?
Most people I ask say that the first day’s report looks suspicious.

But actually both days are equally probable! Every time you toss a (fair) coin there is the same probability (50 %) of getting H or T. On both days Ben makes the same number of tosses, and every particular sequence is equally probable!
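
A quick way to convince yourself is to enumerate all possible ten-toss sequences and check that each of them, the all-heads sequence included, occurs exactly once among the 2^10 equally likely outcomes (a minimal sketch, nothing more):

```python
from itertools import product

# Every one of the 2**10 possible sequences of ten fair-coin tosses is equally
# likely, so each specific sequence -- all heads included -- has probability 1/1024.
sequences = list(product("HT", repeat=10))
p_each = 1 / len(sequences)

day1 = tuple("HHHHHHHHHH")
day2 = tuple("HTTHHTTHTH")
print(len(sequences))                        # 1024
print(day1 in sequences, day2 in sequences)  # True True: each appears exactly once
print(p_each)                                # 0.0009765625 for either day's report
```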

The Linda Problem

Linda is 40 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which of the following two alternatives is more probable?

A. Linda is a bank teller.
B. Linda is a bank teller and active in the feminist movement.

‘Rationally,’ alternative B cannot be more likely than alternative A. Nonetheless Amos Tversky and Daniel Kahneman reported — ‘Judgments of and by representativeness.’ In D. Kahneman, P. Slovic & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases. Cambridge, UK: Cambridge University Press 1982 — that more than 80 percent of respondents said that it was.

Why do we make such ‘irrational’ judgments in both these cases? Tversky and Kahneman argued that in making this kind of judgment we seek the closest resemblance between causes and effects (in The Linda Problem, between Linda’s personality and her behaviour) rather than calculating probability, and that this makes alternative B seem preferable. Because we use a heuristic called representativeness, statement B in The Linda Problem seems more ‘representative’ of Linda given the description of her, although from a probabilistic point of view it is clearly less likely.
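
The conjunction rule behind the ‘rational’ answer is easy to check with any numbers you like; the probabilities below are of course purely hypothetical, chosen only to show that the inequality holds regardless.

```python
# Conjunction rule: P(A and B) = P(A) * P(B | A) <= P(A), whatever the numbers.
# The probabilities below are purely hypothetical.
p_teller = 0.05                  # P(A): Linda is a bank teller
p_feminist_given_teller = 0.90   # P(B | A): active feminist, given she is a teller
p_teller_and_feminist = p_teller * p_feminist_given_teller

print(p_teller_and_feminist <= p_teller)   # True for any choice of numbers
print(p_teller, p_teller_and_feminist)     # 0.05 versus 0.045
```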

Microfoundational angels

14 Oct, 2016 at 17:15 | Posted in Economics | Comments Off on Microfoundational angels

Amongst the several problems/disadvantages of this current consensus is that, in order to make a rational expectations, micro-founded model mathematically and analytically tractable it has been necessary in general to impose some (absurdly) simplifying assumptions, notably the existence of representative agents, who never default. This latter (nonsensical) assumption goes under the jargon term as the transversality condition.

This makes all agents perfectly creditworthy. Over any horizon there is only one interest rate facing all agents, i.e. no risk premia. All transactions can be undertaken in capital markets; there is no role for banks. Since all IOUs are perfectly creditworthy, there is no need for money. There are no credit constraints. Everyone is angelic; there is no fraud; and this is supposed to be properly micro-founded!

Charles Goodhart

Swedish schools — a complete breakdown

14 Oct, 2016 at 16:56 | Posted in Education & School | Comments Off on Swedish schools — a complete breakdown

Since 2014 the share of top grades has doubled, from 0.25 per cent to today’s 0.5 per cent (just under 500 pupils). This share is almost 40 (!) times as large as the share awarded the maximum grade (5.0) under the relative grading system of the early 1990s …

There are several problems with this development. When grades rise and more pupils hit the ceiling, selection for further studies becomes more difficult, and every single grade becomes decisive for whether a pupil ends up in the growing group of top-graders. It is also well known that the level of knowledge falls when high grades are easy to obtain. An even bigger problem, however, is the lack of equivalence; the same level of knowledge can yield completely different grades depending on which school a pupil attends and which teacher the pupil has …

To any reasonably well-informed observer it should by now be clear that the Swedish grading system lacks all safeguards and that grades keep rising without restraint. Despite this, no effective measures have been taken to address either grade inflation or the lack of equivalence in grading. Cross-bloc agreements are often an advantage in school policy, but it is regrettable that the consensus never seems to be as strong as when it comes to letting the grading breakdown run its course.

Jonas Vlachos

As Swedish education research has been able to show for quite some time now, grade inflation is a large and serious problem for Swedish schools today. But unfortunately not the only one.

Year after year, alarming reports come out about how bad things are in Swedish schools. PISA and other studies show unambiguously that Swedish pupils are performing worse and worse. And those of us who work in the university world notice that our students increasingly lack the prior knowledge needed to pursue serious studies.

Year after year we see the willingness to become a teacher decline. In the early 1980s there were almost eight applicants per place on the primary school teacher programme. Today there is one applicant per place on the teacher education programme. This is a social catastrophe that we ought to be talking about. In a world where everything depends on knowledge, it is in the long run crucial for the Swedish economy to make the teaching profession attractive again.

Year after year we see teachers’ salaries being eroded. A couple of years ago the OECD presented a report claiming to show that successful school nations tend to prioritise high teacher salaries. Teacher salaries as a share of GDP per capita are substantially lower in Sweden than in the countries at the top of the PISA studies.

Year after year we see inequality increasing in many areas, not least with regard to income and wealth. Differences in living conditions between groups in terms of class, ethnicity, and gender are unacceptably large.

Year after year we can observe that in the world of schooling, family background evidently still has a great bearing on pupils’ performance. Of course this can only be regarded as a complete failure for a school system with compensatory aspirations.


Year after year we can note that, contrary to all the promises of progressive pedagogy, it is above all children from homes without traditions of study who have lost out in the shift in the view of schooling that has taken place over the last half-century. Today – with school vouchers, free school choice, and independent schools – this development has, contrary to all compensatory promises, only further strengthened highly educated parents’ ability to steer their own children’s schooling and future. It is hard to see who, with today’s schools, will be able to make the kind of ‘class journey’ that so many in my generation made.

Higher teacher salaries are not a sufficient condition for once again having a world-class Swedish school system. But they are a necessary condition — not least for attracting the really talented young people to commit to a teaching career. Extensive education research has convincingly shown that municipal control of the schools is one of the most important reasons behind the decline of teachers’ salaries and of Swedish schools over recent decades.

The political parties must drop their ideological blinkers and realise that the odd sacred cow will have to be slaughtered if we are to put Swedish schools right. When the facts about the schools kick back, you have to be prepared to change course – even if that should conflict with your ideology. When will the political parties jointly dare to take that step? Must we really wait until the next PISA survey once again points to the catastrophic downhill slide of Swedish schools?

I have said it before — and I say it again: the municipalisation of the schools is the biggest flop ever in the history of Swedish education policy. But mistakes can be corrected. As the great English economist John Maynard Keynes used to say: “When I’m wrong, I change my mind.”

Return Swedish schools to state control!

Ricardian equivalence — hopelessly unrealistic

14 Oct, 2016 at 13:36 | Posted in Economics | Comments Off on Ricardian equivalence — hopelessly unrealistic

According to the Ricardian equivalence hypothesis the public sector basically finances its expenditures through taxes or by issuing bonds, and bonds must sooner or later be repaid by raising taxes in the future.

If the public sector finances extra spending through deficits, taxpayers will, according to the hypothesis, anticipate that they will have to pay higher taxes in the future — and therefore increase their savings and reduce their current consumption to be able to do so. The consequence is that aggregate demand would be no different from what it would be if taxes were raised today.

Robert Barro attempted to give the proposition a firm theoretical foundation in the 1970s.

So let us get the facts straight from the horse’s mouth.

Describing Ricardian equivalence in 1989, Barro writes (emphasis added):

Suppose now that households’ demands for goods depend on the expected present value of taxes—that is, each household subtracts its share of this present value from the expected present value of income to determine a net wealth position. Then fiscal policy would affect aggregate consumer demand only if it altered the expected present value of taxes. But the preceding argument was that the present value of taxes would not change as long as the present value of spending did not change. Therefore, the substitution of a budget deficit for current taxes (or any other rearrangement of the timing of taxes) has no impact on the aggregate demand for goods. In this sense, budget deficits and taxation have equivalent effects on the economy — hence the term, “Ricardian equivalence theorem.” To put the equivalence result another way, a decrease in the government’s saving (that is, a current budget deficit) leads to an offsetting increase in desired private saving, and hence to no change in desired national saving.

Since desired national saving does not change, the real interest rate does not have to rise in a closed economy to maintain balance between desired national saving and investment demand. Hence, there is no effect on investment, and no burden of the public debt …

Ricardian equivalence basically means that financing government expenditures through taxes or debt is equivalent, since debt financing must be repaid with interest, and agents — equipped with rational expectations — would simply increase their savings in order to be able to pay the higher taxes in the future, thus leaving total expenditures unchanged.
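
A minimal two-period sketch, with my own toy numbers and granting Barro’s assumptions for the sake of argument, shows where the ‘equivalence’ comes from: a deficit-financed tax cut today, repaid with interest tomorrow, leaves the present value of the household’s tax bill, and hence its consumption possibilities, unchanged.

```python
# Toy two-period illustration (made-up numbers, Barro's assumptions granted):
# a deficit-financed tax cut today, repaid with interest tomorrow, leaves the
# present value of the household's taxes unchanged.
r = 0.03                                     # one interest rate for everyone

taxes_now, taxes_later = 100.0, 100.0
pv_baseline = taxes_now + taxes_later / (1 + r)

cut = 20.0                                   # tax cut today, financed by new debt ...
pv_deficit = (taxes_now - cut) + (taxes_later + cut * (1 + r)) / (1 + r)

print(round(pv_baseline, 2), round(pv_deficit, 2))   # identical: 197.09 197.09
```

Everything Buiter lists below is about why these assumptions, one interest rate for everyone, infinite horizons, lump-sum taxes, fail in practice.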

To most people this probably sounds like nothing but witless gibberish, and Willem Buiter was, indeed, in no gracious mood when commenting on it a couple of years ago:

Barro (1974) has shown that, given perfect foresight, debt neutrality will obtain when three conditions are met: (a) private agents can lend and borrow on the same terms as the government, (b) private agents are able and willing to undo any government scheme to redistribute spending power between generations, and (c) all taxes and transfer payments are lump sum, by which we mean that their basis of assessment is independent of private agents’ decisions about production, labour supply, consumption, or asset accumulation. Under these extreme assumptions, any change in government financing (government saving or dissaving) is offset one-for-one by a corresponding change in private saving itself financed by the accompanying tax changes.

All three assumptions are of course hopelessly unrealistic. Condition (a) fails because credit rationing, liquidity constraints, large spreads between lending and borrowing rates of interest, and private borrowing rates well in excess of those enjoyed by the government are an established fact in most industrial countries. These empirical findings are underpinned by the new and burgeoning theoretical literature on asymmetric information and the implications of moral hazard and adverse selection for private financial markets; and by game-theoretic insights of how active competition in financial markets can yield credit rationing as the equilibrium outcome.

Condition (b) fails because it requires either that agents must live for ever or else effectively do so through the account they take of their children and parents in making gifts and bequests. In reality, private decision horizons are finite and frequently quite short …

Condition (c) fails because in practice taxes and subsidies are rarely lump sum …

I conclude that the possible neutrality of public debt and deficits is little more than a theoretical curiosum.

Willem Buiter

It is difficult not to agree with that verdict.

And how about the empirics? Let’s have a look:

In a series of recent papers … I and co-authors measure the impact of the receipt of an economic stimulus payment in the US in 2008 on a household’s spending by comparing the spending patterns of households that receive their payments at different times. The randomisation implies that the spending of a household when it receives a payment relative to the spending of a household that does not receive a payment at the same time reveals the additional spending caused by the receipt of the stimulus payment.

So how do consumers respond to cash flow from stimulus in recessions?

First, we find that the arrival of a payment causes a large spike in spending the week that the payment arrives: 10% of spending on household goods in response to payments averaging $900 in the US in 2008 (Broda and Parker 2014).

Second, this effect decays over time, but remains present, so that cumulative effects are economically quite large – in the order of 2 to 4% of spending on household goods over the three months following arrival.

On broader measures of spending, Parker et al. (2013) find that households spend 25% of payments during the three months in which they arrive on a broad measure of nondurable goods, and roughly three-quarters of the payment in total. Interestingly, the difference between the two measures largely reflects spending on new cars.

Finally, the majority of spending is done by households with insufficient liquid assets to cover two months of expenditures (about 40% of households). These households spend at a rate six times that of households with sufficient liquid wealth.

Jonathan Parker

As one Nobel Prize laureate had it:

Ricardian equivalence is taught in every graduate school in the country. It is also sheer nonsense.

Joseph E. Stiglitz, Twitter

Things sure have changed …

13 Oct, 2016 at 18:12 | Posted in Varia | 1 Comment

 

Sherlock Holmes of the year — ‘Nobel prize’ winner Bengt Holmström

13 Oct, 2016 at 16:15 | Posted in Economics | 1 Comment

Oliver Hart and Bengt Holmström won this year’s ‘Nobel Prize’ in economics for work on applying contract theory to questions ranging from how best to reward executives to whether we should have privately owned schools and prisons or not.

Their work has according to the prize committee been “incredibly important, not just for economics, but also for other social sciences.”

Asked at a news conference about the high levels of executive pay, Holmström said,

It is somehow demand and supply working its magic.

Wooh! Who would have thought anything like that.

What we see happening in the US, the UK, Sweden, and elsewhere is deeply disturbing. The rising inequality is outrageous – not least because it has to a large extent to do with income and wealth increasingly being concentrated in the hands of a very small and privileged elite.

Societies where we allow the inequality of incomes and wealth to increase without bounds sooner or later implode. The cement that keeps us together erodes, and in the end we are left with nothing but people dipped in the ice-cold water of egoism and greed.

And all this ‘Nobel prize’ laureate manages to come up with is demand and supply ‘magic.’

Impressive indeed …

Loanable funds theory is inconsistent with data

10 Oct, 2016 at 18:14 | Posted in Economics | 1 Comment

Loanable funds doctrine dates back to the early nineteenth century and was forcefully restated by the Swedish economist Knut Wicksell around the turn of the twentieth (with implications for inflation not pursued here). It was repudiated in 1936 by John Maynard Keynes in his General Theory. Before that he was merely a leading post-Wicksellian rather than the greatest economist of his and later times.

Like Keynes, Wicksell recognized that saving and investment have different determining factors, and also thought that households provide most saving. Unlike Keynes, he argued that “the” interest rate as opposed to the level of output can adjust to assure macro balance. If potential investment falls short of saving, then the rate will, maybe with some help from inflation and the central bank, decrease. Households will save less and firms seek to invest more. The supply of loanable funds will go down and demand up, until the two flows equalize with the interest rate at its “natural” level …

Wicksell and Keynes planted red herrings for future economists by concentrating on household saving and business investment. They did not observe that trade and government injections and leakages play bigger roles in determining effective demand. Keynes made a major contribution by switching the emphasis from interest rate adjustments to changes in income as the key macroeconomic adjustment mechanism. In so doing, he argued that the interest rate and asset prices must be set in markets for stocks, not flows, of financial claims …

Today’s New “Keynesians” have tremendous intellectual firepower. The puzzle is why they revert to Wicksell on loanable funds and the natural rate while ignoring Keynes’s innovations. Maybe, as he said in the preface to the General Theory, “The difficulty lies, not in the new ideas, but in escaping from the old ones …”

Lance Taylor

De Niro says it all!

9 Oct, 2016 at 23:05 | Posted in Politics & Society | 2 Comments

 

That a country that has given us presidents like George Washington, Thomas Jefferson, Abraham Lincoln, and Franklin D. Roosevelt, should even have to consider the possibility of being run by a witless clown like Donald Trump is an absolute disgrace.

Paul Romer — favourite candidate for ‘Nobel prize’ 2016

9 Oct, 2016 at 15:51 | Posted in Economics | 5 Comments

Among Swedish economists, Paul Romer is the favourite candidate for receiving the ‘Nobel Prize’ in economics 2016. Let’s hope the prediction turns out right this time.

The ‘Nobel prize’ in economics has almost exclusively gone to mainstream economists, and most often to Chicago economists. So how refreshing it would be if for once we had a winner who has been brave enough to openly criticize the ‘post-real’ theories that emanate from the Chicago ivory tower!

Adam Smith once wrote that a really good explanation is “practically seamless.”

Is there any such theory within one of the most important fields of social science – economic growth?

Paul Romer‘s theory presented in Endogenous Technological Change (1990) – where knowledge is made the most important driving force of growth – is probably as close as we get.

Knowledge – or ideas – is, according to Romer, the locomotive of growth. But as Allyn Young, Piero Sraffa, and others had already shown in the 1920s, knowledge also has to do with increasing returns to scale and is therefore not really compatible with neoclassical economics, with its emphasis on decreasing returns to scale.

Increasing returns generated by the nonrivalry of ideas are simply not compatible with pure competition and the simplistic invisible-hand dogma. That is probably also the reason why neoclassical economists have been so reluctant to embrace the theory wholeheartedly.

Neoclassical economics has tried to save itself by more or less substituting human capital for knowledge/ideas. But Romer’s pathbreaking ideas should not be confused with human capital. Although some have problems with the distinction between ideas and human capital in modern endogenous growth theory, this passage from Romer’s article The New Kaldor Facts: Ideas, Institutions, Population, and Human Capital gives a succinct and accessible account of the difference:

Of the three state variables that we endogenize, ideas have been the hardest to bring into the applied general equilibrium structure. The difficulty arises because of the defining characteristic of an idea, that it is a pure nonrival good. A given idea is not scarce in the same way that land or capital or other objects are scarce; instead, an idea can be used by any number of people simultaneously without congestion or depletion.

Because they are nonrival goods, ideas force two distinct changes in our thinking about growth, changes that are sometimes conflated but are logically distinct. Ideas introduce scale effects. They also change the feasible and optimal economic institutions. The institutional implications have attracted more attention but the scale effects are more important for understanding the big sweep of human history.

The distinction between rival and nonrival goods is easy to blur at the aggregate level but inescapable in any microeconomic setting. Picture, for example, a house that is under construction. The land on which it sits, capital in the form of a measuring tape, and the human capital of the carpenter are all rival goods. They can be used to build this house but not simultaneously any other. Contrast this with the Pythagorean Theorem, which the carpenter uses implicitly by constructing a triangle with sides in the proportions of 3, 4 and 5. This idea is nonrival. Every carpenter in the world can use it at the same time to create a right angle.

Of course, human capital and ideas are tightly linked in production and use. Just as capital produces output and forgone output can be used to produce capital, human capital produces ideas and ideas are used in the educational process to produce human capital. Yet ideas and human capital are fundamentally distinct. At the micro level, human capital in our triangle example literally consists of new connections between neurons in a carpenter’s head, a rival good. The 3-4-5 triangle is the nonrival idea. At the macro level, one cannot state the assertion that skill-biased technical change is increasing the demand for education without distinguishing between ideas and human capital.

Romer’s idea about ideas is well worth a ‘Nobel Prize.’

Crash and learn?

9 Oct, 2016 at 14:07 | Posted in Economics | 22 Comments

The case for changing the way we teach economics is—or should be—obvious …

But as anyone who teaches or studies economics these days knows full well, the mainstream that has long dominated economics … is not even beginning to let go of their almost-total control over the curriculum of undergraduate and graduate programs.


That’s clear from a recent article in the Financial Times, in which David Pilling asks the question, “should we change the way we teach economics?”

Me, I’ve heard the excuses not to change economics for decades now …

Here’s one—the idea that heterodox economics is like creationism, in disputing the “immutable laws” captured by mainstream theory:

Pontus Rendahl teaches macroeconomic theory at Cambridge. He doesn’t disagree that students should be exposed to economic history and to ideas that challenge neoclassical thinking … He is wary, however, of moving to a pluralist curriculum in which different schools of thought are given similar weight.

“Pluralism is a nicely chosen word,” he says. “But it’s the same argument as the creationists in the US who say that natural selection is just a theory.” Since mainstream economics has “immutable laws”, he argues, it would be wrong to teach heterodox theories as though they had equal validity. “In the same way, I don’t think heterodox engineering or alternative medicine should be taught.”

Rendahl also argues that students are too critical of the models they encounter as undergraduates:

“When we start teaching economics, we have to teach the nuts and bolts.” He introduces first-year students to the Robinson Crusoe model, in which there is only one “representative agent”. Later on, Friday is brought on the scene so the two can start trading, although no money changes hands since transactions are solely by barter …

The assumptions built into each and every one of these defenses of mainstream economics and attacks on heterodox economic theories as well as any hint of pluralism in the teaching of economics are, at best, outdated—the leftovers from positivism and other forms of post-Enlightenment scientism. They comprise the “spontaneous philosophy” of mainstream economists who have exercised hegemony in the practice and teaching of economics throughout the postwar period.

And, yes, Pilling is right, when that hegemony is challenged, as it has been by economics students and many economists in recent years, “the clash of ideas gets nasty.”

David Ruccio

Once Cambridge was known for its famous economists. People like John Maynard Keynes. Nowadays it’s rather infamous for its economists inhabiting a neoclassical world of delusion. People like Pontus Rendahl.
