## On the proper use of mathematics in economics

21 October, 2016 at 10:01 | Posted in Economics | 1 Comment

One must, of course, beware of expecting from this method more than it can give. Out of the crucible of calculation comes not an atom more truth than was put in. The assumptions being hypothetical, the results obviously cannot claim more than a very limited validity. The mathematical expression ought to facilitate the argument, clarify the results, and so guard against possible faults of reasoning — that is all.

It is, by the way, evident that the economic aspects must be the determining ones everywhere: economic truth must never be sacrificed to the desire for mathematical elegance.

## Econometrics and the axiom of correct specification

20 October, 2016 at 17:22 | Posted in Statistics & Econometrics | 2 Comments

Most work in econometrics and regression analysis is — still — done on the assumption that the researcher has a theoretical model that is ‘true.’ Based on this belief of having a correct specification for an econometric model or running a regression, one proceeds as if the only problem remaining to solve has to do with measurement and observation.

When things sound too good to be true, they usually aren’t. And that goes for econometric wet dreams too. The snag is, of course, that there is precious little to support the perfect-specification assumption. Looking around in social science and economics we don’t find a single regression or econometric model that lives up to the standards set by the ‘true’ theoretical model — and there is precious little that gives us reason to believe things will be different in the future.

To think that we are able to construct a model in which all relevant variables are included and the functional relationships between them correctly specified is not only a belief without support, but a belief *impossible* to support.

The theories we work with when building our econometric regression models are insufficient. No matter what we study, there are always some variables missing, and we don’t know the correct way to functionally specify the relationships between the variables.

*Every* regression model constructed is misspecified. There is always an endless list of possible variables to include, and endless ways to specify the relationships between them. So every applied econometrician comes up with his own specification and ‘parameter’ estimates. The econometric Holy Grail of consistent and stable parameter values is nothing but a dream.
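The omitted-variable problem can be made concrete with a small simulation. This is only a toy sketch with an assumed data-generating process (the numbers are invented for illustration): leaving one relevant, correlated regressor out of the specification shifts the estimated ‘parameter’ well away from its true value.

```python
import numpy as np

# Assumed toy data-generating process: two correlated regressors,
# both with true coefficient 1.0.
rng = np.random.default_rng(0)
n = 10_000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)             # x2 is correlated with x1
y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)

# Correctly specified regression: y on both x1 and x2.
X_full = np.column_stack([x1, x2])
beta_full, *_ = np.linalg.lstsq(X_full, y, rcond=None)

# Misspecified regression: x2 omitted.
beta_omit, *_ = np.linalg.lstsq(x1.reshape(-1, 1), y, rcond=None)

print(beta_full)   # close to [1.0, 1.0]
print(beta_omit)   # close to 1.8: the estimate absorbs the omitted variable
```

With endless candidate variables, every choice of what to leave out produces a different, equally confident-looking ‘parameter’ estimate.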

In order to draw inferences from data as described by econometric texts, it is necessary to make whimsical assumptions. The professional audience consequently and properly withholds belief until an inference is shown to be adequately insensitive to the choice of assumptions. The haphazard way we individually and collectively study the fragility of inferences leaves most of us unconvinced that any inference is believable. If we are to make effective use of our scarce data resource, it is therefore important that we study fragility in a much more systematic way. If it turns out that almost all inferences from economic data are fragile, I suppose we shall have to revert to our old methods …

A rigorous application of econometric methods in economics really presupposes that the phenomena of our real world economies are ruled by stable causal relations between variables. Parameter-values estimated in specific spatio-temporal contexts are *presupposed* to be exportable to totally different contexts. To warrant this assumption one, however, has to convincingly establish that the targeted acting causes are stable and invariant so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.
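One way to see why ‘exporting’ parameter values across contexts fails: if the underlying causal strength drifts, each sub-sample delivers its own tidy-looking estimate, and neither travels to the other context. A purely illustrative simulation under an assumed drifting coefficient:

```python
import numpy as np

# Assumed data-generating process: the 'parameter' linking x and y
# drifts smoothly over the sample, so no fixed parameter exists.
rng = np.random.default_rng(3)
n = 2000
x = rng.normal(size=n)
beta_t = np.linspace(0.5, 1.5, n)              # true effect is not constant
y = beta_t * x + rng.normal(scale=0.5, size=n)

def slope(x, y):
    """OLS slope of y on x (no intercept needed; both are mean zero)."""
    return float(np.cov(x, y, bias=True)[0, 1] / np.var(x))

first = slope(x[:1000], y[:1000])
second = slope(x[1000:], y[1000:])
print(first, second)   # roughly 0.75 vs 1.25: two 'contexts', two 'parameters'
```

Each half-sample estimate is precise and stable-looking on its own; the instability only shows up when one tries to carry the estimate across the bridge.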

That models should correspond to reality is, after all, a useful but not totally straightforward idea – with some history to it. Developing appropriate models is a serious problem in statistics; testing the connection to the phenomena is even more serious …

In our days, serious arguments have been made from data. Beautiful, delicate theorems have been proved, although the connection with data analysis often remains to be established. And an enormous amount of fiction has been produced, masquerading as rigorous science.

The theoretical conditions that have to be fulfilled for regression analysis and econometrics to really work are nowhere even closely met in reality. Making outlandish statistical assumptions does not provide a solid ground for doing relevant social science and economics. Although regression analysis and econometrics have become the most used quantitative methods in social sciences and economics today, it’s still a fact that the inferences made from them are invalid.

Regression models have some serious weaknesses. Their ease of estimation tends to suppress attention to features of the data that matching techniques force researchers to consider, such as the potential heterogeneity of the causal effect and the alternative distributions of covariates across those exposed to different levels of the cause. Moreover, the traditional exogeneity assumption of regression … often befuddles applied researchers … As a result, regression practitioners can too easily accept their hope that the specification of plausible control variables generates an as-if randomized experiment.

Econometrics — and regression analysis — is basically a deductive method. Given the assumptions (such as manipulability, transitivity, separability, additivity, linearity, etc) it delivers deductive inferences. The problem, of course, is that we will never completely know when the assumptions are right. Conclusions can only be as certain as their premises — and that also applies to econometrics and regression analysis.

## DSGE modeling – a statistical critique

19 October, 2016 at 16:36 | Posted in Statistics & Econometrics | Leave a comment

As Paul Romer’s recent assault on ‘post-real’ macroeconomics showed, yours truly is not the only one who questions the validity and relevance of DSGE modeling. After having read one of my posts on the issue, eminent statistician Aris Spanos kindly sent me a working paper in which he discusses the validity of DSGE models and shows that the calibrated structural models are often at odds with observed data, and that many of the ‘deep parameters’ used are not even identifiable.

Interesting reading. And confirming, once again, that DSGE models do not marry particularly well with real-world data. This should come as no surprise — microfounded general equilibrium models with intertemporally optimizing representative agents seldom do.

This paper brings out several weaknesses of the traditional DSGE modeling, including statistical misspecification, non-identification of deep parameters, substantive inadequacy, weak forecasting performance and potentially misleading policy analysis. It is argued that most of these weaknesses stem from failing to distinguish between statistical and substantive adequacy and secure the former before assessing the latter. The paper untangles the statistical from the substantive premises of inference with a view to delineate the above mentioned problems and suggest solutions. The critical appraisal is based on the Smets and Wouters (2007) DSGE model using US quarterly data. It is shown that this model is statistically misspecified …

Lucas’s (1980) argument: “Any model that is well enough articulated to give clear answers to the questions we put to it will necessarily be artificial, abstract, patently ‘unreal’” (p. 696), is misleading because it blurs the distinction between substantive and statistical adequacy. There is nothing wrong with constructing a simple, abstract and idealised theory model aiming to capture key features of the phenomenon of interest, with a view to shed light on (understand, explain, forecast) economic phenomena of interest, as well as gain insight concerning alternative policies. Unreliability of inference problems arise when the statistical model implicitly specified by the theory model is statistically misspecified, and no attempt is made to reliably assess whether the theory model does, indeed, capture the key features of the phenomenon of interest; see Spanos (2009a). As argued by Hendry (2009):

“This implication is not a tract for mindless modeling of data in the absence of economic analysis, but instead suggests formulating more general initial models that embed the available economic theory as a special case, consistent with our knowledge of the institutional framework, historical record, and the data properties … Applied econometrics cannot be conducted without an economic theoretical framework to guide its endeavours and help interpret its findings. Nevertheless, since economic theory is not complete, correct, and immutable, and never will be, one also cannot justify an insistence on deriving empirical models from theory alone.” (p. 56-7)

Statistical misspecification is not the inevitable result of abstraction and simplification, but it stems from imposing invalid probabilistic assumptions on the data.

## Rational choice theory …

19 October, 2016 at 08:59 | Posted in Economics | 3 Comments

In economics it is assumed that people make rational choices

## Econometric objectivity …

18 October, 2016 at 10:16 | Posted in Statistics & Econometrics | 2 Comments

It is clearly the case that experienced modellers could easily come up with significantly different models based on the same set of data, thus undermining claims to researcher-independent objectivity. This has been demonstrated empirically by Magnus and Morgan (1999), who conducted an experiment in which an apprentice had to try to replicate the analysis of a dataset that might have been carried out by three different experts (Leamer, Sims, and Hendry) following their published guidance. In all cases the results were different from each other, and different from that which would have been produced by the expert, thus demonstrating the importance of tacit knowledge in statistical analysis.

Magnus and Morgan conducted a further experiment which involved eight expert teams, from different universities, analysing the same sets of data each using their own particular methodology. The data concerned the demand for food in the US and in the Netherlands and was based on a classic study by Tobin (1950) augmented with more recent data. The teams were asked to estimate the income elasticity of food demand and to forecast per capita food consumption. In terms of elasticities, the lowest estimates were around 0.38 whilst the highest were around 0.74 – clearly vastly different especially when remembering that these were based on the same sets of data. The forecasts were perhaps even more extreme – from a base of around 4000 in 1989 the lowest forecast for the year 2000 was 4130 while the highest was nearly 18000!
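The size of that forecast gap is easier to appreciate when converted into implied growth rates. A quick back-of-the-envelope computation from the figures quoted above (a base of about 4000 in 1989 and an 11-year horizon to 2000):

```python
# What annual growth rates do the two extreme forecasts imply?
base, horizon = 4000, 11        # ~4000 in 1989, forecasting the year 2000
implied = {}
for forecast in (4130, 18000):
    implied[forecast] = (forecast / base) ** (1 / horizon) - 1
    print(f"forecast {forecast}: implied annual growth {implied[forecast]:.1%}")
```

The lowest forecast corresponds to roughly 0.3% annual growth in per capita food consumption, the highest to roughly 15% — from the very same data.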

## Sweden’s growing housing bubble

16 October, 2016 at 16:18 | Posted in Economics | 5 Comments

House prices are increasing fast in the EU, and more so in Sweden than in any other member state, as shown in the Eurostat graph below (percentage change in the annually deflated house price index by member state, 2015):

Sweden’s house price boom started in the mid-1990s, and looking at the development of real house prices during the last three decades there are reasons to be deeply worried. The indebtedness of the Swedish household sector has also risen to alarmingly high levels:

Yours truly has been trying to argue with ‘very serious people’ that it’s really high time to ‘take away the punch bowl.’ Mostly I have felt like the voice of one calling in the desert.

Where do housing bubbles come from? There are of course many different explanations, but one of the fundamental mechanisms at work is that people expect house prices to increase, which makes people willing to keep on buying houses at steadily increasing prices. It’s this kind of self-generating cumulative process à la Wicksell-Myrdal that is the core of the housing bubble. Unlike the usual commodities markets where demand curves usually point downwards, on asset markets they often point upwards, and therefore give rise to this kind of instability. And the greater the leverage, the greater the increase in prices.
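A deliberately stylised sketch of such a cumulative process (an assumed adaptive-expectations toy model, not a calibrated one): buyers extrapolate realised price growth, and leverage scales the demand response. When leverage times feedback exceeds one the process is explosive; below one it damps out.

```python
def bubble_path(p0, expected_growth, feedback, leverage, periods):
    """Toy price path where demand responds to expected appreciation
    and expectations adapt to realised growth, closing the loop."""
    prices = [p0]
    g = expected_growth
    for _ in range(periods):
        prices.append(prices[-1] * (1 + leverage * feedback * g))
        g = prices[-1] / prices[-2] - 1     # adaptive expectations
    return prices

# Hypothetical parameter values, chosen only to show the mechanism:
path = bubble_path(p0=100.0, expected_growth=0.02,
                   feedback=1.2, leverage=1.5, periods=10)
growth = [b / a - 1 for a, b in zip(path, path[1:])]
print(growth)   # each period's price growth exceeds the last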

What is especially worrying is that although the aggregate net asset position of the Swedish households is still on the solid side, an increasing proportion of those assets is illiquid. When the inevitable drop in house prices hits the banking sector and the rest of the economy, the consequences will be enormous. It hurts when bubbles burst …

## Probability and rationality — trickier than you may think

15 October, 2016 at 23:05 | Posted in Statistics & Econometrics | 37 Comments

**The Coin-tossing Problem**

My friend Ben says that on the first day he got the following sequence of Heads and Tails when tossing a coin:

H H H H H H H H H H

And on the second day he says that he got the following sequence:

H T T H H T T H T H

Which day-report makes you suspicious?

Most people I ask this question say the first day-report looks suspicious.

But actually both days are equally probable! Every time you toss a (fair) coin there is the same probability (50%) of getting H or T. On both days Ben makes the same number of tosses, and every particular sequence is equally probable!
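In code, the point is one line: any specific length-10 sequence has probability (1/2)^10. The second computation below is an explanatory aside on why intuition balks, namely that intuition tracks the *number* of heads rather than the exact order:

```python
from math import comb

# Any specific length-10 sequence from a fair coin is equally probable:
p_sequence = 0.5 ** 10
print(p_sequence)          # 0.0009765625 for BOTH of Ben's reported sequences

# There is only one sequence with ten heads, but 252 with exactly five,
# which is why an all-heads day *feels* more suspicious:
p_ten_heads = comb(10, 10) * 0.5 ** 10
p_five_heads = comb(10, 5) * 0.5 ** 10
print(p_ten_heads, p_five_heads)
```

Ten heads is just as probable as any one particular mixed sequence; it is only as a *class* (ten heads vs five heads in some order) that the outcomes differ.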

**The Linda Problem**

Linda is 40 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which of the following two alternatives is more probable?

A. Linda is a bank teller.

B. Linda is a bank teller and active in the feminist movement.

‘Rationally,’ alternative B cannot be more likely than alternative A. Nonetheless Amos Tversky and Daniel Kahneman reported — ‘Judgments of and by representativeness.’ In D. Kahneman, P. Slovic & A. Tversky (Eds.), *Judgment under uncertainty: Heuristics and biases.* Cambridge, UK: Cambridge University Press 1982 — that more than 80 percent of respondents said that it was.

Why do we make such ‘irrational’ judgments in both these cases? Tversky and Kahneman argued that in making this kind of judgment we seek the closest resemblance between causes and effects (in The Linda Problem, between Linda’s personality and her behaviour), rather than calculating probability, and that this makes alternative B seem preferable. By using a heuristic called *representativeness*, statement B in The Linda Problem seems more ‘representative’ of Linda based on the description of her, although from a probabilistic point of view it is clearly less likely.
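The ‘rational’ benchmark here is just the conjunction rule: for any events A and B, P(A and B) = P(A)·P(B|A) ≤ P(A). A tiny numeric check, with probabilities invented purely for illustration (only the inequality matters):

```python
# Hypothetical numbers for the Linda story; the inequality holds for ANY:
p_teller = 0.05                    # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.95     # even near-certainty about the feminist part
p_both = p_teller * p_feminist_given_teller

# The conjunction can never be more probable than either conjunct:
print(p_both <= p_teller)          # True
print(p_teller, p_both)
```

However representative alternative B feels, multiplying by a probability of at most one can only shrink the number.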

## Microfoundational angels

14 October, 2016 at 17:15 | Posted in Economics | Leave a comment

Amongst the several problems/disadvantages of this current consensus is that, in order to make a rational expectations, micro-founded model mathematically and analytically tractable, it has been necessary in general to impose some (absurdly) simplifying assumptions, notably the existence of representative agents, who never default. This latter (nonsensical) assumption goes under the jargon term of the transversality condition.

This makes all agents perfectly creditworthy. Over any horizon there is only one interest rate facing all agents, i.e. no risk premia. All transactions can be undertaken in capital markets; there is no role for banks. Since all IOUs are perfectly creditworthy, there is no need for money. There are no credit constraints. Everyone is angelic; there is no fraud; and this is supposed to be properly micro-founded!

## Swedish schools — a complete breakdown

14 October, 2016 at 16:56 | Posted in Education & School | Leave a comment

Since 2014 the share of top grades has doubled, from 0.25 per cent to today’s 0.5 per cent (just under 500 pupils). This share is almost 40 (!) times as large as the share awarded the maximum grade (5.0) under the relative grading system of the early 1990s …

There are several problems with this development. When grades rise and more pupils hit the ceiling, selection for further studies becomes harder, and every single grade becomes decisive for whether a pupil ends up in the growing group of top-graders. It is also well known that knowledge levels fall when high grades are easy to obtain. An even bigger problem, however, is the lack of equivalence: the same level of knowledge can yield completely different grades depending on which school a pupil attends and which teacher the pupil has …

To any reasonably well-informed observer it should by now be clear that the Swedish grading system lacks all safeguards and that grades keep rising without restraint. Despite this, no effective measures have been taken against either grade inflation or the lack of equivalence in grading. Cross-party agreement is often an advantage in school policy, but it is regrettable that the consensus never seems as strong as when it comes to letting the grading breakdown run its course.

As Swedish school research has been able to show for quite some time now, grade inflation is a large and serious problem in Swedish schools today. Unfortunately, it is not the only one.

Year after year, alarming reports appear on the state of Swedish schools. PISA and other studies show unequivocally that Swedish pupils are performing worse and worse. And those of us who work in the universities notice that our students increasingly lack the prior knowledge needed to pursue serious studies.

Year after year we see the willingness to become a teacher decline. In the early 1980s there were almost eight applicants per place in primary-school teacher training. Today there is one applicant per place. This is a societal catastrophe that we ought to be talking about. In a world where everything depends on knowledge, making the teaching profession attractive again is, in the long run, decisive for the Swedish economy.

Year after year we see teachers’ salaries being eroded. A couple of years ago the OECD presented a report claiming to show that successful school nations tend to prioritise high teacher salaries. Teacher salaries as a share of GDP per capita are substantially lower in Sweden than in the countries at the top of the PISA studies.

Year after year we see inequality growing in many areas, not least in incomes and wealth. Differences in living conditions between groups in terms of class, ethnicity and gender are unacceptably large.

Year after year we can note that in the world of schooling, family background evidently still has a major influence on pupils’ performance. Obviously, this can only be regarded as a complete failure for a school system with compensatory aspirations.

Year after year we can note that, contrary to all the promises of progressive pedagogy, it is above all children from homes without a tradition of study who have lost out in the change in how schooling has been viewed over the past half-century. Today, with school vouchers, free school choice and independent schools, this development has, contrary to all compensatory promises, only further strengthened highly educated parents’ ability to steer their own children’s schooling and future. It is hard to see who, in today’s schools, will be able to make the kind of ‘class journey’ that so many of my generation made.

Higher teacher salaries are not a sufficient condition for once again giving Sweden world-class schools. But they are a necessary one, not least for attracting really talented young people to a teaching career. Extensive school research has convincingly shown that municipal control of the schools is one of the most important causes behind the decline of teachers’ salaries and of Swedish schools over recent decades.

The political parties must drop their ideological blinkers and realise that the odd sacred cow will have to be slaughtered if we are to put Swedish schools right. When the facts about the schools kick, one had better change course, even if that should conflict with ideology. When will the political parties jointly dare to take that step? Must we really wait until the next PISA survey once again points to the catastrophic downhill slide of Swedish schools?

I have said it before, and I say it again: the municipalisation of the schools is the biggest flop ever in the history of Swedish education policy. But mistakes can be corrected. As the great English economist John Maynard Keynes used to say: ‘When I’m wrong, I change my mind.’

Renationalise the Swedish school system!

## Ricardian equivalence — hopelessly unrealistic

14 October, 2016 at 13:36 | Posted in Economics | Leave a comment

According to the Ricardian equivalence hypothesis the public sector basically finances its expenditures through taxes or by issuing bonds, and bonds must sooner or later be repaid by raising taxes in the future.

If the public sector runs extra spending through deficits, taxpayers will, according to the hypothesis, anticipate that they will have to pay higher taxes in the future — and therefore increase their savings and reduce their current consumption to be able to do so, the consequence being that aggregate demand would be no different from what would happen if taxes were raised today.

Robert Barro attempted to give the proposition a firm theoretical foundation in the 1970s.

So let us get the facts straight from the horse’s mouth.

Describing Ricardian equivalence in 1989, Barro writes (emphasis added):

Suppose now that households’ demands for goods depend on the expected present value of taxes—that is, each household subtracts its share of this present value from the expected present value of income to determine a net wealth position. Then fiscal policy would affect aggregate consumer demand only if it altered the expected present value of taxes. But the preceding argument was that the present value of taxes would not change as long as the present value of spending did not change. Therefore, the substitution of a budget deficit for current taxes (or any other rearrangement of the timing of taxes) has no impact on the aggregate demand for goods. In this sense, budget deficits and taxation have equivalent effects on the economy — hence the term, “Ricardian equivalence theorem.”

To put the equivalence result another way, a decrease in the government’s saving (that is, a current budget deficit) leads to an offsetting increase in desired private saving, and hence to no change in desired national saving. Since desired national saving does not change, the real interest rate does not have to rise in a closed economy to maintain balance between desired national saving and investment demand.

Hence, there is no effect on investment, and no burden of the public debt …

Ricardian equivalence basically means that financing government expenditures through taxes or debts is equivalent, since debt financing must be repaid with interest, and agents — equipped with rational expectations — would only increase savings in order to be able to pay the higher taxes in the future, thus leaving total expenditures unchanged.
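The arithmetic behind the equivalence claim is a simple present-value identity. A minimal numeric sketch with assumed figures (the interest rate and spending level are invented for illustration): under the hypothesis, taxing today and borrowing today with a tax-plus-interest repayment tomorrow burden the household identically in present-value terms.

```python
# Illustrative numbers only (assumed): the point is the identity, not the values.
r = 0.05           # interest rate, also used for discounting
spending = 100.0   # government spending to be financed

# Option 1: tax 100 today.
pv_tax_now = spending

# Option 2: issue debt today, repay principal plus interest via a tax next year.
tax_later = spending * (1 + r)
pv_tax_later = tax_later / (1 + r)   # discount back to today

print(pv_tax_now, pv_tax_later)      # both 100 in present value
```

A far-sighted household with rational expectations therefore sees no change in net wealth from the switch to deficit financing — which is precisely why the result stands or falls with Buiter’s three conditions below.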

To most people this probably sounds like nothing but witless gibberish, and Willem Buiter was, indeed, in no gracious mood when commenting on it a couple of years ago:

Barro (1974) has shown that, given perfect foresight, debt neutrality will obtain when three conditions are met: (a) private agents can lend and borrow on the same terms as the government, (b) private agents are able and willing to undo any government scheme to redistribute spending power between generations, and (c) all taxes and transfer payments are lump sum, by which we mean that their basis of assessment is independent of private agents’ decisions about production, labour supply, consumption, or asset accumulation. Under these extreme assumptions, any change in government financing (government saving or dissaving) is offset one-for-one by a corresponding change in private saving itself financed by the accompanying tax changes.

All three assumptions are of course hopelessly unrealistic. Condition (a) fails because credit rationing, liquidity constraints, large spreads between lending and borrowing rates of interest, and private borrowing rates well in excess of those enjoyed by the government are an established fact in most industrial countries. These empirical findings are underpinned by the new and burgeoning theoretical literature on asymmetric information and the implications of moral hazard and adverse selection for private financial markets; and by game-theoretic insights of how active competition in financial markets can yield credit rationing as the equilibrium outcome.

Condition (b) fails because it requires either that agents must live for ever or else effectively do so through the account they take of their children and parents in making gifts and bequests. In reality, private decision horizons are finite and frequently quite short …

Condition (c) fails because in practice taxes and subsidies are rarely lump sum …

I conclude that the possible neutrality of public debt and deficits is little more than a theoretical curiosum.

It is difficult not to agree in that verdict.

And how about the empirics? Let’s have a look:

In a series of recent papers … I and co-authors measure the impact of the receipt of an economic stimulus payment in the US in 2008 on a household’s spending by comparing the spending patterns of households that receive their payments at different times. The randomisation implies that the spending of a household when it receives a payment relative to the spending of a household that does not receive a payment at the same time reveals the additional spending caused by the receipt of the stimulus payment.

So how do consumers respond to cash flow from stimulus in recessions?

First, we find that the arrival of a payment causes a large spike in spending the week that the payment arrives: 10% of spending on household goods in response to payments averaging $900 in the US in 2008 (Broda and Parker 2014).

Second, this effect decays over time, but remains present, so that cumulative effects are economically quite large – in the order of 2 to 4% of spending on household goods over the three months following arrival.

On broader measures of spending, Parker et al. (2013) find that households spend 25% of payments during the three months in which they arrive on a broad measure of nondurable goods, and roughly three-quarters of the payment in total. Interestingly, the difference between the two measures largely reflects spending on new cars.

Finally, the majority of spending is done by households with insufficient liquid assets to cover two months of expenditures (about 40% of households). These households spend at a rate six times that of households with sufficient liquid wealth.

As one Nobel Prize laureate had it:

Ricardian equivalence is taught in every graduate school in the country. It is also sheer nonsense.

Joseph E. Stiglitz, Twitter
