Go Canada Go!

29 October, 2014 at 22:22 | Posted in Varia | Comments Off on Go Canada Go!

 


Macroeconomic aspirations

29 October, 2014 at 17:00 | Posted in Economics | 5 Comments

Oxford macroeconomist Simon Wren-Lewis has a post up on his blog on the use of labels in macroeconomics:

Labels are fun, and get attention. They can be a useful shorthand to capture an idea, or related set of ideas … Here are a couple of bold assertions, which I think I believe, and which I will try to justify. First, in academic research terms there is only one meaningful division, between mainstream and heterodox … Second, in macroeconomic policy terms I think there is only one meaningful division, between mainstream and anti-Keynesians …

So what do I mean by a meaningful division in academic research terms? I mean speaking a different language. Thanks to the microfoundations revolution in macro, mainstream macroeconomists speak the same language. I can go to a seminar that involves an RBC model with flexible prices and no involuntary unemployment and still contribute and possibly learn something.

Wren-Lewis seems to be überjoyed by the fact that, using the same language as real business cycle macroeconomists, he can “possibly learn something” from them.

Hmm …

Wonder what …

I’m not sure Wren-Lewis uses the same “language” as James Tobin, but he’s definitely worth listening to:

They try to explain business cycles solely as problems of information, such as asymmetries and imperfections in the information agents have. Those assumptions are just as arbitrary as the institutional rigidities and inertia they find objectionable in other theories of business fluctuations … I try to point out how incapable the new equilibrium business cycles models are of explaining the most obvious observed facts of cyclical fluctuations … I don’t think that models so far from realistic description should be taken seriously as a guide to policy … I don’t think that there is a way to write down any model which at one hand respects the possible diversity of agents in taste, circumstances, and so on, and at the other hand also grounds behavior rigorously in utility maximization and which has any substantive content to it.

Arjo Klamer, The New Classical Macroeconomics: Conversations with the New Classical Economists and their Opponents, Wheatsheaf Books, 1984

And contrary to Wren-Lewis I don’t think the fact that “thanks to the microfoundations revolution in macro, mainstream macroeconomists speak the same language,” takes us very far. Far better than having a common “language” is to have a well-founded, realist and relevant theory:

Microfoundations for macroeconomics are fine in principle—not indispensable, but useful. The problem is that what passes for microfoundations in the universe of orthodox macro is crap …

It’s nothing more than robotic imitation of teaching exercises to improve math skills, without any consideration for such mundane matters as empirical verisimilitude. I will mention three crushing faults, each sufficient by itself to blow a wide hole in a supposedly useful model …

It is rife with anomalies (see “behavioral economics”), and, most important, it is oblivious to the last several decades of work in psychology, evolutionary biology, neuropsychology, organization theory—all the disciplines where people study behavior in a scientific way …

There are no interaction effects to generate multiple equilibria in the microfoundations macro theorists use. Every individual, firm and product is an isolated atom, floating uninterrupted through space until it bumps into another such atom in the marketplace. Social psychology, ecology, nonconvex production and consumption spaces?  Forget about it …

Microfoundations means general equilibrium theory, but the flavor it uses is from the mid-1950s. The Sonnenschein-Debreu-Mantel demonstration (updated to the 1970s) that initial conditions and out-of-equilibrium trades alter the equilibrium itself has turned GET upside down.

Notice that I haven’t mentioned the standard heterodox criticisms of representative agents and ergodicity. You can add those if you want …

Like I said, their microfoundations are crap.

Peter Dorman

Macroeconomists have to have bigger aspirations than speaking the same “language.” Rigorous models lacking relevance are not to be taken seriously. Truly great macroeconomists aspire to explain and understand the fundamentals of modern economies, as did, e.g., John Maynard Keynes and Michal Kalecki.

Dawit Isaak — Sweden’s only prisoner of conscience

27 October, 2014 at 07:53 | Posted in Economics | Comments Off on Dawit Isaak — Sweden’s only prisoner of conscience

Swedish-Eritrean journalist and writer Dawit Isaak has been held in an Eritrean prison for 13 years without trial. He is the only Swedish citizen held as a prisoner of conscience.

Today is his 50th birthday. Let us hope it is the last spent in prison.

Free Dawit Isaak!

Keynes’s fundamental insight

26 October, 2014 at 22:48 | Posted in Economics | 3 Comments

The difficulty lies, not in the new ideas, but in the escaping from the old ones, which ramify, for those brought up as most of us have been, into every corner of our minds.

John Maynard Keynes


Mark Blaug (1927-2011) did more than any other single person to establish the philosophy and methodology of economics as a respected subfield within economics. His path-breaking The Methodology of Economics (1980) is still a landmark — and the first textbook on economic methodology yours truly had to read as a student.

At last — a worthy Nobel Prize winner

26 October, 2014 at 09:41 | Posted in Varia | Comments Off on At last — a worthy Nobel Prize winner

 

In her breathtakingly simple, moving and beautiful speech at the United Nations last year, Malala Yousafzai wrote herself into history. A more forthright plaidoyer for what really can change the world – empowering knowledge and education for all – has seldom been heard. Malala is living proof that not even the most heinous totalitarianism can defeat young people’s call for education and justice.

Fred Lee

25 October, 2014 at 16:38 | Posted in Varia | 1 Comment

Last night (Oct. 23) at 11:20 PM, CDT, prominent heterodox economist, Fred Lee of the University of Missouri-Kansas City, died of cancer.  He had stopped teaching during the last spring semester and was honored at the 12th International Post Keynesian Conference held at UMKC a month ago …

Whatever one thinks of heterodox economics in general, or of the views of Fred Lee in particular, he should be respected as the person more than any other who was behind the founding of the International Confederation of Associations for Pluralism in Economics (ICAPE), and also the Heterodox Economics Newsletter.  While many talked about the need for there to be an organized group pushing heterodox economics in all its varieties, Fred did more than talk and went and organized the group and its main communications outlet.  He also regularly and strongly spoke in favor of heterodox economics, the unity of which he may have exaggerated.  But his voice in advocating the superiority of heterodox economics over mainstream neoclassical economics was as strong as that of anybody that I have known.  I also note that he was the incoming President for the Association for Evolutionary Economics (AFEE), and they will now have to find a replacement.  He had earlier stepped down from his positions with ICAPE and the Heterodox Economics Newsletter.

It was both sad and moving to see Fred at the PK conference last month in Kansas City … Although he was having trouble even breathing and could barely even speak, he rose and made his comments, at the end becoming impassioned and speaking up forcefully to proclaim his most firmly held positions.  He declared that his entire career had been devoted to battling for the downtrodden, poor, and suffering around the world, “against the 1% percent!” and I know that there was not a single person in that standing room only audience who doubted him.  He openly wept after he finished with those stirring words, as those who were not already standing rose to applaud him with a standing ovation.

J. Barkley Rosser

Fred was, together with Nai Pew Ong and Bob Pollin, one of those who made a visit to the University of California such a great experience back in the beginning of the 1980s for a young Swedish economics student. I especially remember our long and intense discussions on Sraffa and neoricardianism. I truly miss this open-minded and good-hearted heterodox economist. Rest in peace, my dear old friend.

A Post Keynesian response to Piketty

25 October, 2014 at 12:55 | Posted in Economics | Comments Off on A Post Keynesian response to Piketty

The rejection of specific theoretical arguments does not diminish the achievements of Piketty’s work. Capital is an outstanding work: it has brought issues of wealth and income distribution to the spotlight, where heterodox economists have failed to do so. It has also put together, and made readily available, an invaluable data set, and it allows future researchers to analyse macroeconomics with a much broader time horizon, covering much of the history of capitalism rather than the last few decades. But we do suggest that the analysis of the book would have been strengthened if Piketty had also considered a post-Keynesian instead of a neoclassical framework.

Post Keynesian Economics Study Group

How mainstream economics imperils our economies

24 October, 2014 at 09:31 | Posted in Economics | Comments Off on How mainstream economics imperils our economies


[h/t Mark Thoma]

Piketty and the elasticity of substitution

23 October, 2014 at 22:39 | Posted in Economics | 4 Comments

When “Capital in the 21st Century” was published in English earlier this year, Thomas Piketty’s book was met with rapt attention and constant conversation. The book was lauded but also faced criticism, particularly from other economists who wanted to fit Piketty’s work into the models they knew well …

A particularly technical and effective critique of Piketty is from Matt Rognlie, a graduate student in economics at the Massachusetts Institute of Technology. Rognlie points out that for capital returns to be consistently higher than the overall growth of the economy—or “r > g” as framed by Piketty—an economy needs to be able to easily substitute capital such as machinery or robots for labor. In the terminology of economics this is called the elasticity of substitution between capital and labor, which needs to be greater than 1 for r to be consistently higher than g. Rognlie argues that most studies looking at this particular elasticity find that it is below 1, meaning a drop in economic growth would result in a larger drop in the rate of return and then g being larger than r. In turn, this means capital won’t earn an increasing share of income and the dynamics laid out by Piketty won’t arise …

Enter the new paper by economists Loukas Karabarbounis and Brent Neiman … Their new paper investigates how depreciation affects the measurement of labor share and the elasticity between capital and labor. Using their data set of labor income shares and a model, Karabarbounis and Neiman show that the gross labor share and the net labor share move in the same direction when the shift is caused by a technological shock—as has been the case, they argue, in recent decades. More importantly for this conversation, they point out that the gross and net elasticities are on the same side of 1 if that shock is technological. In the case of a declining labor share, this means they would both be above 1.

This means Rognlie’s point about these two elasticities being lower than 1 doesn’t hold up if capital is gaining due to a new technology that makes capital cheaper …

In short, this new paper gives credence to one of the key dynamics in Piketty’s “Capital in the 21st Century”—that the returns on capital can be higher than growth in the economy, or r > g.

Nick Bunker
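Rognlie’s point about the elasticity of substitution can be made concrete with a toy CES calculation. This is only an illustrative sketch: α = 0.3 and the capital-output ratios 3 and 6 are made-up round numbers, not estimates from either paper. Under CES production the capital share equals α·β^((σ−1)/σ), where β is the capital-output ratio, so capital deepening raises the capital share only when σ > 1:

```python
def capital_share(beta, sigma, alpha=0.3):
    """Capital share of income under CES production, given the
    capital-output ratio beta and the elasticity of substitution
    sigma (alpha is the CES distribution parameter)."""
    rho = (sigma - 1) / sigma
    return alpha * beta ** rho

# Piketty-style capital deepening: beta rising from 3 to 6
for sigma in (0.7, 1.5):
    s3, s6 = capital_share(3, sigma), capital_share(6, sigma)
    print(f"sigma = {sigma}: capital share goes from {s3:.2f} to {s6:.2f}")
```

With σ = 1.5 the share rises from about 0.43 to 0.55 as β doubles; with σ = 0.7 it falls from about 0.19 to 0.14. That reversal is why estimates placing σ below 1 undercut the r > g dynamic, and why the gross-versus-net elasticity question matters.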

To me this is only a confirmation of what I wrote earlier this autumn on the issue:

Being able to show that you can get the Piketty results using one or another of the available standard neoclassical growth models is of course — from a realist point of view — of limited value. As usual — the really interesting thing is how in accord with reality are the assumptions you make and the numerical values you put into the model specification.

Sherlock Holmes inference and econometric testing

23 October, 2014 at 15:10 | Posted in Statistics & Econometrics | Comments Off on Sherlock Holmes inference and econometric testing

Sherlock Holmes stated that ‘It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.’ True as this may be in the circumstances of crime investigation, the principle does not apply to testing. In a crime investigation one wants to know what actually happened: who did what, when and how. Testing is somewhat different.

With testing, not only what happened is interesting, but what could have happened, and what would have happened were the circumstances to repeat themselves. The particular events under study are considered draws from a larger population. It is the distribution of this population one is primarily interested in, and not so much the particular realizations of that distribution. So not the particular sequence of heads and tails in coin flipping is of interest, but whether that says something about a coin being biased or not. Not (only) whether inflation and unemployment went together in the sixties is interesting, but what that tells about the true trade-off between these two economic variables. In short, one wants to test.

The tested hypothesis has to come from somewhere, and to base it, like Holmes, on data is a valid procedure … The theory should, however, not be tested on the same data it was derived from. To use significance as a selection criterion in a regression equation constitutes a violation of this principle …

Consider for example time series econometrics … It may not be clear a priori which lags matter, while it is clear that some definitely do … The Box-Jenkins framework models the auto-correlation structure of a series as well as possible first, postponing inference to the next stage. In this next stage other variables or their lagged values may be related to the time series under study. While this justifies the use of data mining in time series analysis, it leaves unaddressed the issue of the true level of significance …

This is sometimes recommended in a general-to-specific approach where the most general model is estimated and insignificant variables are subsequently discarded. As superfluous variables increase the variance of estimators, omitting irrelevant variables this way may increase efficiency. The problem is that variables were included in the first place because they were thought to be (potentially) relevant. If, for example, twenty variables believed to be potentially relevant a priori are included, then one or more are bound to be insignificant (depending on the power, which cannot be trusted to be high). Omitting relevant variables, whether they are insignificant or not, generally biases all other estimates as well due to the well-known omitted variable bias. The data are thus used both to specify the model and to test the model; this is the problem of estimation. Without further notice this double use of the data is bound to be misleading if not incorrect. The tautological nature of this procedure is apparent; as significance is the selection criterion it is not very surprising that selected variables are significant.

D. A. Hollanders, Five Methodological Fallacies in Applied Econometrics
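The “twenty potentially relevant variables” problem Hollanders describes is easy to simulate. The sketch below (pure noise and illustrative sample sizes of my own choosing, not anything from the paper) regresses an outcome on twenty truly irrelevant regressors, one at a time, and records how often at least one of them clears the conventional |t| > 1.96 bar. With near-independent tests this happens in roughly 1 − 0.95²⁰ ≈ 64% of samples:

```python
import random, math

def t_stat(x, y):
    """t-statistic of the slope in a simple OLS regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    syy = sum((b - my) ** 2 for b in y)
    slope = sxy / sxx
    se = math.sqrt((syy - slope * sxy) / (n - 2) / sxx)
    return slope / se

random.seed(1)
n_obs, n_vars, reps = 100, 20, 500
samples_with_hit = 0
for _ in range(reps):
    y = [random.gauss(0, 1) for _ in range(n_obs)]
    # twenty candidate regressors, none of them actually related to y;
    # 1.96 is the approximate 5% two-sided critical value
    hits = sum(
        abs(t_stat([random.gauss(0, 1) for _ in range(n_obs)], y)) > 1.96
        for _ in range(n_vars)
    )
    samples_with_hit += hits > 0
print(f"share of samples with at least one 'significant' regressor: "
      f"{samples_with_hit / reps:.2f}")
```

So a researcher who includes twenty a priori plausible variables and then prunes by significance will, in most samples, “find” something even when nothing is there — exactly the tautology the quote points at.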

Econometric disillusionment

22 October, 2014 at 11:16 | Posted in Statistics & Econometrics | 1 Comment


Because I was there when the economics department of my university got an IBM 360, I was very much caught up in the excitement of combining powerful computers with economic research. Unfortunately, I lost interest in econometrics almost as soon as I understood how it was done. My thinking went through four stages:

1. Holy shit! Do you see what you can do with a computer’s help?
2. Learning computer modeling puts you in a small class where only other members of the caste can truly understand you. This opens up huge avenues for fraud.
3. The main reason to learn stats is to prevent someone else from committing fraud against you.
4. More and more people will gain access to the power of statistical analysis. When that happens, the stratification of importance within the profession should be a matter of who asks the best questions.

Disillusionment began to set in. I began to suspect that all the really interesting economic questions were FAR beyond the ability to reduce them to mathematical formulas. Watching computers being applied to other pursuits than academic economic investigations over time only confirmed those suspicions.

1. Precision manufacture is an obvious application for computing. And for many applications, this worked magnificently. Any design that combined straight lines and circles could be easily described for computerized manufacture. Unfortunately, the really interesting design problems can NOT be reduced to formulas. A car’s fender, for example, cannot be described using formulas — it can only be described by specifying an assemblage of multiple points. If math formulas cannot describe something as common and uncomplicated as a car fender, how can they hope to describe human behavior?
2. When people started using computers for animation, it soon became apparent that human motion was almost impossible to model correctly. After a great deal of effort, the animators eventually put tracing balls on real humans and recorded that motion before transferring it to the animated character. Formulas failed to describe simple human behavior — like a toddler trying to walk.

Lately, I have discovered a Swedish economist who did NOT give up econometrics merely because it sounded so impossible. In fact, he still teaches the stuff. But for the rest of us, he systematically destroys the pretensions of those who think they can describe human behavior with some basic formulas.

Jonathan Larson

Wonder who that Swedish guy is …

Post-Keynesian economics — an introduction

22 October, 2014 at 00:02 | Posted in Economics | 1 Comment

 

[h/t Jan Milch]

Data mining and the meaning of the Econometric Scripture

20 October, 2014 at 21:19 | Posted in Statistics & Econometrics | 1 Comment

Some variants of ‘data mining’ can be classified as the greatest of the basement sins, but other variants of ‘data mining’ can be viewed as important ingredients in data analysis. Unfortunately, these two variants usually are not mutually exclusive and so frequently conflict in the sense that to gain the benefits of the latter, one runs the risk of incurring the costs of the former.

Hoover and Perez (2000, p. 196) offer a general definition of data mining as referring to “a broad class of activities that have in common a search over different ways to process or package data statistically or econometrically with the purpose of making the final presentation meet certain design criteria.” Two markedly different views of data mining lie within the scope of this general definition. One view of ‘data mining’ is that it refers to experimenting with (or ‘fishing through’) the data to produce a specification … The problem with this, and why it is viewed as a sin, is that such a procedure is almost guaranteed to produce a specification tailored to the peculiarities of that particular data set, and consequently will be misleading in terms of what it says about the underlying process generating the data. Furthermore, traditional testing procedures used to ‘sanctify’ the specification are no longer legitimate, because these data, since they have been used to generate the specification, cannot be judged impartial if used to test that specification …

An alternative view of ‘data mining’ is that it refers to experimenting with (or ‘fishing through’) the data to discover empirical regularities that can inform economic theory … Hand et al (2000) describe data mining as the process of seeking interesting or valuable information in large data sets. Its greatest virtue is that it can uncover empirical regularities that point to errors/omissions in theoretical specifications …

In summary, this second type of ‘data mining’ identifies regularities in or characteristics of the data that should be accounted for and understood in the context of the underlying theory. This may suggest the need to rethink the theory behind one’s model, resulting in a new specification founded on a more broad-based understanding. This is to be distinguished from a new specification created by mechanically remolding the old specification to fit the data; this would risk incurring the costs described earlier when discussing the first variant of ‘data mining.’

The issue here is how should the model specification be chosen? As usual, Leamer (1996, p. 189) has an amusing view: “As you wander through the thicket of models, you may come to question the meaning of the Econometric Scripture that presumes the model is given to you at birth by a wise and beneficent Holy Spirit.”

In practice, model specifications come from both theory and data, and given the absence of Leamer’s Holy Spirit, properly so.

Peter Kennedy
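Kennedy’s first, sinful variant of data mining, a specification tailored to the peculiarities of one data set, can be demonstrated with a held-out sample. In this sketch (all series are pure noise; the sample sizes and the thirty-candidate search are arbitrary choices of mine) we keep whichever regressor fits the training data best, then check that same regressor on fresh data:

```python
import random

def corr(x, y):
    """Sample correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

random.seed(7)
n, k, reps = 60, 30, 200
avg_in, avg_out = 0.0, 0.0
for _ in range(reps):
    # outcome and 30 candidate regressors, all pure noise, each observed
    # in a 'training' sample and in a fresh 'test' sample
    y_tr = [random.gauss(0, 1) for _ in range(n)]
    y_te = [random.gauss(0, 1) for _ in range(n)]
    xs = [([random.gauss(0, 1) for _ in range(n)],
           [random.gauss(0, 1) for _ in range(n)]) for _ in range(k)]
    # 'mine' the training data: keep whichever regressor fits best in-sample
    x_tr, x_te = max(xs, key=lambda p: abs(corr(p[0], y_tr)))
    avg_in += abs(corr(x_tr, y_tr)) / reps
    avg_out += abs(corr(x_te, y_te)) / reps
print(f"mined regressor, in-sample   |corr|: {avg_in:.2f}")
print(f"same regressor, out-of-sample |corr|: {avg_out:.2f}")
```

The mined regressor looks respectable in-sample (an average |corr| around 0.3) but collapses toward noise out of sample, which is exactly why data used to generate a specification “cannot be judged impartial if used to test that specification.”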

Microfounded DSGE models — a total waste of time!

20 October, 2014 at 15:21 | Posted in Economics | Comments Off on Microfounded DSGE models — a total waste of time!

In conclusion, one can say that the sympathy that some of the traditional and Post-Keynesian authors show towards DSGE models is rather hard to understand. Even before the recent financial and economic crisis put some weaknesses of the model – such as the impossibility of generating asset price bubbles or the lack of inclusion of financial sector issues – into the spotlight and brought them even to the attention of mainstream media, the models’ inner workings were highly questionable from the very beginning. While one can understand that some of the elements in DSGE models seem to appeal to Keynesians at first sight, after closer examination, these models are in fundamental contradiction to Post-Keynesian and even traditional Keynesian thinking. The DSGE model is a model in which output is determined in the labour market as in New Classical models and in which aggregate demand plays only a very secondary role, even in the short run.

In addition, given the fundamental philosophical problems presented for the use of DSGE models for policy simulation, namely the fact that a number of parameters used have completely implausible magnitudes and that the degrees of freedom for different parameters are so large that DSGE models with fundamentally different parametrization (and therefore different policy conclusions) equally well produce time series which fit the real-world data, it is also very hard to understand why DSGE models have reached such prominence in economic science in general.

Sebastian Dullien

Neither New Classical nor “New Keynesian” microfounded DSGE macro models have helped us foresee, understand or craft solutions to the problems of today’s economies. But still most young academic macroeconomists want to work with DSGE models. After reading Dullien’s article, that should be a very worrying sign that economics — at least from the point of view of realism and relevance — is becoming more and more a waste of time. Why do these bright young guys waste their time and effort? Besides aspirations of being published, I think maybe Frank Hahn gave the truest answer back in 2005 when, interviewed on the occasion of his 80th birthday, he confessed that some economic assumptions didn’t really say anything about “what happens in the world,” but still had to be considered very good “because it allows us to get on this job.”

Watch out for econometric sinning in the basement!

19 October, 2014 at 17:00 | Posted in Statistics & Econometrics | 2 Comments

Brad DeLong wonders why Cliff Asness is clinging to a theoretical model that has clearly been rejected by the data …

 
There’s a version of this in econometrics, i.e. you know the model is correct, you are just having trouble finding evidence for it. It goes as follows. You are testing a theory you came up with, but the data are uncooperative and say you are wrong. But instead of accepting that, you tell yourself “My theory is right, I just haven’t found the right econometric specification yet. I need to add variables, remove variables, take a log, add an interaction, square a term, do a different correction for misspecification, try a different sample period, etc., etc., etc.” Then, after finally digging out that one specification of the econometric model that confirms your hypothesis, you declare victory, write it up, and send it off (somehow never mentioning the intense specification mining that produced the result).

Too much econometric work proceeds along these lines. Not quite this blatantly, but that is, in effect, what happens in too many cases. I think it is often best to think of econometric results as the best case the researcher could make for a particular theory rather than a true test of the model.

Mark Thoma
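The specification search Thoma describes can be mimicked in miniature. The sketch below (made-up data with a null effect by construction; the window scheme is my own illustrative choice) keeps “trying a different sample period”, here eleven overlapping 100-observation windows, until a t-test comes out significant. Even with nothing but noise, the nominal 5% error rate is multiplied several times over:

```python
import random, math

def t_stat(x, y):
    """t-statistic of the slope in a simple OLS regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    syy = sum((b - my) ** 2 for b in y)
    slope = sxy / sxx
    se = math.sqrt((syy - slope * sxy) / (n - 2) / sxx)
    return slope / se

random.seed(11)
n, reps = 200, 300
found = 0
for _ in range(reps):
    x = [random.gauss(0, 1) for _ in range(n)]
    y = [random.gauss(0, 1) for _ in range(n)]  # the true effect is zero
    # 'specification search': try 100-observation sample windows, shifted
    # by 10 observations each time, until one yields |t| > 1.96
    windows = [(s, s + 100) for s in range(0, n - 99, 10)]
    if any(abs(t_stat(x[a:b], y[a:b])) > 1.96 for a, b in windows):
        found += 1
print(f"share of pure-noise data sets yielding a 'significant' result: "
      f"{found / reps:.2f}")
```

Reporting only the winning window, and never mentioning the search, is precisely the “best case the researcher could make” rather than a true test.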

Mark touches the spot — and for the sake of balancing the overly rosy picture of econometric achievements given in the usual econometrics textbooks today, it may also be interesting to see how Trygve Haavelmo, with the completion (in 1958) of the twenty-fifth volume of Econometrica, assessed the role of econometrics in the advancement of economics. Although mainly positive about the “repair work” and “clearing-up work” done, Haavelmo also found some grounds for despair:

We have found certain general principles which would seem to make good sense. Essentially, these principles are based on the reasonable idea that, if an economic model is in fact “correct” or “true,” we can say something a priori about the way in which the data emerging from it must behave. We can say something, a priori, about whether it is theoretically possible to estimate the parameters involved. And we can decide, a priori, what the proper estimation procedure should be … But the concrete results of these efforts have often been a seemingly lower degree of accuracy of the would-be economic laws (i.e., larger residuals), or coefficients that seem a priori less reasonable than those obtained by using cruder or clearly inconsistent methods.

There is the possibility that the more stringent methods we have been striving to develop have actually opened our eyes to recognize a plain fact: viz., that the “laws” of economics are not very accurate in the sense of a close fit, and that we have been living in a dream-world of large but somewhat superficial or spurious correlations.

And as the quote below shows, Frisch also shared some of Haavelmo’s — and Keynes’s — doubts on the applicability of econometrics:

I have personally always been skeptical of the possibility of making macroeconomic predictions about the development that will follow on the basis of given initial conditions … I have believed that the analytical work will give higher yields – now and in the near future – if it is applied in macroeconomic decision models where the line of thought is the following: “If this or that policy is made, and these conditions are met in the period under consideration, probably a tendency to go in this or that direction is created”.

Ragnar Frisch

Slippery slope arguments

19 October, 2014 at 16:10 | Posted in Theory of Science & Methodology | Comments Off on Slippery slope arguments

 

Germany is turning EU recovery into recession

19 October, 2014 at 14:25 | Posted in Economics, Politics & Society | Comments Off on Germany is turning EU recovery into recession

Beppe Grillo, the comedian-turned-rebel leader of Italian politics, must have laughed heartily. No sooner had he announced to supporters that the euro was “a total disaster” than the currency union was driven to the brink of catastrophe once again.

Grillo launched a campaign in Rome last weekend for a 1 million-strong petition against the euro, saying: “We have to leave the euro as soon as possible and defend the sovereignty of the Italian people from the European Central Bank.”

Hours later markets slumped on news that the 18-member eurozone was probably heading for recession. And there was worse to come. Greece, the trigger for the 2010 euro crisis, saw its borrowing rates soar, putting it back on the “at-risk register”. Investors, already digesting reports of slowing global growth, were also spooked by reports that a row in Brussels over spending caps on France and Italy had turned nasty …

In the wake of the 2008 global financial crisis, voters backed austerity and the euro in expectation of a debt-reducing recovery. But as many Keynesian economists warned, this has proved impossible. More than five years later, there are now plenty of voters willing to call time on the experiment, Grillo among them. And there seems to be no end to austerity-driven low growth in sight. The increasingly hard line taken by Berlin over the need for further reforms in debtor nations such as Greece and Italy – by which it means wage cuts – has worked to turn a recovery into a near recession.

Angela Merkel and her finance minister Wolfgang Schäuble are shaping up to fight all comers over maintaining the 3% budget deficit limit and already-agreed austerity measures.

Even if France and Italy find a fudge to bypass the deficit rule, they will be prevented from embarking on the Marshall Plan each believes is needed to turn their economies around. Hollande wants an EU-wide €300bn stimulus to boost investment and jobs – something that is unlikely to ever get off the ground …

So a rally is likely to be short-lived. Volatility is here to stay. The only answer comes from central bankers, who propose pumping more funds into the financial system to bring down the cost of credit and encourage lending and, hopefully, sustainable growth …

Andy Haldane, the chief economist at the Bank of England, said he was gloomier now than at any time this year. He expects interest rates to stay low until at least next summer.

It’s not a plan with much oomph. Most economists believe the impact of central bank money is waning. Yet without growth and the hope of well-paid jobs for young people, parents across the EU who previously feared for their savings following a euro exit appear ready to consider the potential benefits of a break-up. There is a Grillo in almost every eurozone nation. Now that would bring real volatility.

The Observer

What’s behind rising wealth inequality?

19 October, 2014 at 14:00 | Posted in Economics | 1 Comment

The Initiative on Global Markets at the University of Chicago yesterday released a survey of a panel of highly regarded economists asking about rising wealth inequality. Specifically, IGM asked if the difference between the after-tax rate of return on capital and the growth rate of the overall economy was the “most powerful force pushing towards greater wealth inequality in the United States since the 1970s.”

The vast majority of the economists disagreed with the statement. As would economist Thomas Piketty, the originator of the now famous r > g inequality. He explicitly states that rising inequality in the United States is about rising labor income at the very top of the income distribution. As Emmanuel Saez, an economist at the University of California, Berkeley and a frequent Piketty collaborator, points out r > g is a prediction about the future.

But if wealth inequality has risen in the United States over the past four decades, what has been behind the rise? A new paper by Saez and the London School of Economics’ Gabriel Zucman provides an answer: the calcification of income inequality into wealth inequality …

[Chart: new Saez-Zucman wealth-share data]

 

Nick Bunker

Lies that economics is built on

18 October, 2014 at 10:38 | Posted in Statistics & Econometrics | 2 Comments

Peter Dorman is one of those rare economists that it is always a pleasure to read. Here his critical eye is focused on economists’ infatuation with homogeneity and averages:

You may feel a gnawing discomfort with the way economists use statistical techniques. Ostensibly they focus on the difference between people, countries or whatever the units of observation happen to be, but they nevertheless seem to treat the population of cases as interchangeable—as homogenous on some fundamental level. As if people were replicants.

You are right, and this brief talk is about why and how you’re right, and what this implies for the questions people bring to statistical analysis and the methods they use.

Our point of departure will be a simple multiple regression model of the form

y = β0 + β1x1 + β2x2 + … + ε

where y is an outcome variable, x1 is an explanatory variable of interest, the other x’s are control variables, the β’s are coefficients on these variables (or a constant term, in the case of β0), and ε is a vector of residuals. We could apply the same analysis to more complex functional forms, and we would see the same things, so let’s stay simple.

What question does this model answer? It tells us the average effect that variations in x1 have on the outcome y, controlling for the effects of other explanatory variables. Repeat: it’s the average effect of x1 on y.

This model is applied to a sample of observations. What is assumed to be the same for these observations? (1) The outcome variable y is meaningful for all of them. (2) The list of potential explanatory factors, the x’s, is the same for all. (3) The effects these factors have on the outcome, the β’s, are the same for all. (4) The proper functional form that best explains the outcome is the same for all. In these four respects all units of observation are regarded as essentially the same.

Now what is permitted to differ across these observations? Simply the values of the x’s and therefore the values of y and ε. That’s it.

Thus measures of the difference between individual people or other objects of study are purchased at the cost of immense assumptions of sameness. It is these assumptions that both reflect and justify the search for average effects …

In the end, statistical analysis is about imposing a common structure on observations in order to understand differentiation. Any structure requires assuming some kinds of sameness, but some approaches make much more sweeping assumptions than others. An unfortunate symbiosis has arisen in economics between statistical methods that excessively rule out diversity and statistical questions that center on average (non-diverse) effects. This is damaging in many contexts, including hypothesis testing, program evaluation, forecasting—you name it …

The first step toward recovery is admitting you have a problem. Every statistical analyst should come clean about what assumptions of homogeneity are being made, in light of their plausibility and the opportunities that exist for relaxing them.

Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that other variables – of vital importance and, although perhaps unobservable and non-additive, not necessarily epistemologically inaccessible – were not considered in the model.

Real world social systems are not governed by stable causal mechanisms or capacities. If economic regularities obtain, they — as a rule — do so only because we engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately that also makes them rather useless.

Remember that a model is not the truth. It is a lie to help you get your point across. And in the case of modeling economic risk, your model is a lie about others, who are probably lying themselves. And what’s worse than a simple lie? A complicated lie.

Sam L. Savage The Flaw of Averages
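
Savage's "flaw of averages" is Jensen's inequality in disguise: running a model on the average input is not the same as averaging the model's output. A minimal sketch with invented numbers, in the spirit of Savage's capped-sales examples:

```python
import statistics

def profit(demand, capacity=100):
    """Sales are capped by capacity, so profit is nonlinear in demand."""
    return min(demand, capacity)

demands = [50, 100, 150]  # three equally likely demand scenarios (hypothetical)

profit_at_average = profit(statistics.mean(demands))          # model of the average
average_profit = statistics.mean(profit(d) for d in demands)  # average of the model

print(profit_at_average, round(average_profit, 1))
```

Planning on the average demand promises a profit of 100, but the average over the actual scenarios is only about 83: the upside is capped while the downside is not.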

Does Janet Yellen have a liberal bias?

18 October, 2014 at 08:34 | Posted in Politics & Society | Comments Off on Does Janet Yellen have a liberal bias?

 
