Go Canada Go!

29 October, 2014 at 22:22 | Posted in Varia | Leave a comment


Macroeconomic aspirations

29 October, 2014 at 17:00 | Posted in Economics | 5 Comments

Oxford macroeconomist Simon Wren-Lewis has a post up on his blog on the use of labels in macroeconomics:

Labels are fun, and get attention. They can be a useful shorthand to capture an idea, or related set of ideas … Here are a couple of bold assertions, which I think I believe, and which I will try to justify. First, in academic research terms there is only one meaningful division, between mainstream and heterodox … Second, in macroeconomic policy terms I think there is only one meaningful significant division, between mainstream and anti-Keynesians …

So what do I mean by a meaningful division in academic research terms? I mean speaking a different language. Thanks to the microfoundations revolution in macro, mainstream macroeconomists speak the same language. I can go to a seminar that involves an RBC model with flexible prices and no involuntary unemployment and still contribute and possibly learn something.

Wren-Lewis seems to be overjoyed by the fact that, using the same language as real business cycle macroeconomists, he can “possibly learn something” from them.

Hmm …

Wonder what …

I’m not sure Wren-Lewis uses the same “language” as James Tobin, but he’s definitely worth listening to:

They try to explain business cycles solely as problems of information, such as asymmetries and imperfections in the information agents have. Those assumptions are just as arbitrary as the institutional rigidities and inertia they find objectionable in other theories of business fluctuations … I try to point out how incapable the new equilibrium business cycles models are of explaining the most obvious observed facts of cyclical fluctuations … I don’t think that models so far from realistic description should be taken seriously as a guide to policy … I don’t think that there is a way to write down any model which on the one hand respects the possible diversity of agents in taste, circumstances, and so on, and on the other hand also grounds behavior rigorously in utility maximization and which has any substantive content to it.

Arjo Klamer, The New Classical Macroeconomics: Conversations with the New Classical Economists and their Opponents, Wheatsheaf Books, 1984

And contrary to Wren-Lewis I don’t think the fact that “thanks to the microfoundations revolution in macro, mainstream macroeconomists speak the same language,” takes us very far. Far better than having a common “language” is to have a well-founded, realist and relevant theory:

Microfoundations for macroeconomics are fine in principle—not indispensable, but useful. The problem is that what passes for microfoundations in the universe of orthodox macro is crap …

It’s nothing more than robotic imitation of teaching exercises to improve math skills, without any consideration for such mundane matters as empirical verisimilitude. I will mention three crushing faults, each sufficient by itself to blow a wide hole in a supposedly useful model …

It is rife with anomalies (see “behavioral economics”), and, most important, it is oblivious to the last several decades of work in psychology, evolutionary biology, neuropsychology, organization theory—all the disciplines where people study behavior in a scientific way …

There are no interaction effects to generate multiple equilibria in the microfoundations macro theorists use. Every individual, firm and product is an isolated atom, floating uninterrupted through space until it bumps into another such atom in the marketplace. Social psychology, ecology, nonconvex production and consumption spaces?  Forget about it …

Microfoundations means general equilibrium theory, but the flavor it uses is from the mid-1950s. The Sonnenschein-Debreu-Mantel demonstration (update to the 1970s) that initial conditions and out-of-equilibrium trades alter the equilibrium itself has turned GET upside down.

Notice that I haven’t mentioned the standard heterodox criticisms of representative agents and ergodicity. You can add those if you want …

Like I said, their microfoundations are crap.

Peter Dorman

Macroeconomists have to have bigger aspirations than speaking the same “language.” Rigorous models lacking relevance are not to be taken seriously. Truly great macroeconomists aspire to explain and understand the fundamentals of modern economies, as e.g. John Maynard Keynes and Michal Kalecki did.

Dawit Isaak — Sweden’s only prisoner of conscience

27 October, 2014 at 07:53 | Posted in Economics | Leave a comment

Swedish-Eritrean journalist and writer Dawit Isaak has been held in an Eritrean prison for 13 years without trial. He is the only Swedish citizen held as a prisoner of conscience.

Today is his 50th birthday. Let us hope it is the last spent in prison.

Free Dawit Isaak!

Keynes’s fundamental insight

26 October, 2014 at 22:48 | Posted in Economics | 3 Comments

The difficulty lies, not in the new ideas, but in the escaping from the old ones, which ramify, for those brought up as most of us have been, into every corner of our minds.

John Maynard Keynes

Mark Blaug (1927-2011) did more than any other single person to establish the philosophy and methodology of economics as a respected subfield within economics. His path-breaking The Methodology of Economics (1980) is still a landmark — and the first textbook on economic methodology yours truly had to read as a student.

At last — a worthy Nobel Prize winner

26 October, 2014 at 09:41 | Posted in Varia | Leave a comment


In her breathtakingly simple, moving and beautiful speech at the United Nations last year, Malala Yousafzai wrote herself into history. A more forthright plea for what really can change the world – empowering knowledge and education for all – has seldom been heard. Malala is living proof that not even the most heinous totalitarianism can defeat young people’s call for education and justice.

Fred Lee

25 October, 2014 at 16:38 | Posted in Varia | 1 Comment

Last night (Oct. 23) at 11:20 PM, CDT, prominent heterodox economist Fred Lee of the University of Missouri-Kansas City died of cancer. He had stopped teaching during the last spring semester and was honored at the 12th International Post Keynesian Conference held at UMKC a month ago …

Whatever one thinks of heterodox economics in general, or of the views of Fred Lee in particular, he should be respected as the person who, more than any other, was behind the founding of the International Confederation of Associations for Pluralism in Economics (ICAPE), and also the Heterodox Economics Newsletter. While many talked about the need for an organized group pushing heterodox economics in all its varieties, Fred did more than talk: he went and organized the group and its main communications outlet. He also regularly and strongly spoke in favor of heterodox economics, the unity of which he may have exaggerated. But his voice in advocating the superiority of heterodox economics over mainstream neoclassical economics was as strong as that of anybody I have known. I also note that he was the incoming President of the Association for Evolutionary Economics (AFEE), and they will now have to find a replacement. He had earlier stepped down from his positions with ICAPE and the Heterodox Economics Newsletter.

It was both sad and moving to see Fred at the PK conference last month in Kansas City … Although he was having trouble even breathing and could barely speak, he rose and made his comments, at the end becoming impassioned and speaking up forcefully to proclaim his most firmly held positions. He declared that his entire career had been devoted to battling for the downtrodden, poor, and suffering around the world, “against the 1%!”, and I know that there was not a single person in that standing-room-only audience who doubted him. He openly wept after he finished with those stirring words, as those who were not already standing rose to applaud him with a standing ovation.

J. Barkley Rosser

Fred was, together with Nai Pew Ong and Bob Pollin, one of those who made a visit to the University of California such a great experience for a young Swedish economics student back in the beginning of the 1980s. I especially remember our long and intense discussions on Sraffa and neo-Ricardianism. I truly miss this open-minded and good-hearted heterodox economist. Rest in peace, my dear old friend.

A Post Keynesian response to Piketty

25 October, 2014 at 12:55 | Posted in Economics | Leave a comment

The rejection of specific theoretical arguments does not diminish the achievements of Piketty’s work. Capital is an outstanding work; it has brought issues of wealth and income distribution to the spotlight, where heterodox economists have failed to do so. It has also put together, and made readily available, an invaluable data set, and it allows future researchers to analyse macroeconomics with a much broader time horizon, covering much of the history of capitalism rather than the last few decades. But we do suggest that the analysis of the book would have been strengthened if Piketty had also considered a post-Keynesian instead of a neoclassical framework.

Post Keynesian Economics Study Group

How mainstream economics imperils our economies

24 October, 2014 at 09:31 | Posted in Economics | Leave a comment

[h/t Mark Thoma]

Piketty and the elasticity of substitution

23 October, 2014 at 22:39 | Posted in Economics | 4 Comments

When “Capital in the 21st Century” was published in English earlier this year, Thomas Piketty’s book was met with rapt attention and constant conversation. The book was lauded but also faced criticism, particularly from other economists who wanted to fit Piketty’s work into the models they knew well …

A particularly technical and effective critique of Piketty is from Matt Rognlie, a graduate student in economics at the Massachusetts Institute of Technology. Rognlie points out that for capital returns to be consistently higher than the overall growth of the economy—or “r > g” as framed by Piketty—an economy needs to be able to easily substitute capital such as machinery or robots for labor. In the terminology of economics this is called the elasticity of substitution between capital and labor, which needs to be greater than 1 for r to be consistently higher than g. Rognlie argues that most studies looking at this particular elasticity find that it is below 1, meaning a drop in economic growth would result in a larger drop in the rate of return and then g being larger than r. In turn, this means capital won’t earn an increasing share of income and the dynamics laid out by Piketty won’t arise …

Enter the new paper by economists Loukas Karabarbounis and Brent Neiman … Their new paper investigates how depreciation affects the measurement of the labor share and the elasticity between capital and labor. Using their data set of labor share income and a model, Karabarbounis and Neiman show that the gross labor share and the net labor share move in the same direction when the shift is caused by a technological shock—as has been the case, they argue, in recent decades. More importantly for this conversation, they point out that the gross and net elasticities are on the same side of 1 if that shock is technological. In the case of a declining labor share, this means they would both be above 1.

This means Rognlie’s point about these two elasticities being lower than 1 doesn’t hold up if capital is gaining due to a new technology that makes capital cheaper …

In short, this new paper gives credence to one of the key dynamics in Piketty’s “Capital in the 21st Century”—that the returns on capital can be higher than growth in the economy, or r > g.

Nick Bunker

To me this is only a confirmation of what I wrote earlier this autumn on the issue:

Being able to show that you can get the Piketty results using one or another of the available standard neoclassical growth models is of course — from a realist point of view — of limited value. As usual — the really interesting thing is how in accord with reality are the assumptions you make and the numerical values you put into the model specification.
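The mechanics at stake can be made concrete with a small numerical sketch (my own illustration with hypothetical parameter values, not anything from Piketty, Rognlie, or Karabarbounis-Neiman): under a CES production function the capital share equals a·(K/Y)^((σ−1)/σ), so capital deepening raises the capital share when the elasticity of substitution σ exceeds 1 and lowers it when σ is below 1 — which is exactly why the debate over whether σ is above or below 1 matters for r > g.

```python
# Illustrative sketch: how the capital share responds to capital deepening
# under a CES production function, for elasticities of substitution above
# and below 1. All parameter values (a = 0.3, the K and L levels) are
# hypothetical, chosen only to make the direction of the effect visible.

def capital_share(K, L, sigma, a=0.3):
    """Capital share a*(K/Y)^rho for CES output Y = (a*K^rho + (1-a)*L^rho)^(1/rho),
    where rho = (sigma - 1) / sigma."""
    rho = (sigma - 1.0) / sigma
    Y = (a * K**rho + (1 - a) * L**rho) ** (1.0 / rho)
    return a * (K / Y) ** rho

L = 1.0
for sigma in (1.5, 0.7):  # elasticity above 1 vs below 1
    low = capital_share(2.0, L, sigma)
    high = capital_share(4.0, L, sigma)  # capital deepening: K doubles
    direction = "rises" if high > low else "falls"
    print(f"sigma = {sigma}: capital share {direction} "
          f"from {low:.3f} to {high:.3f}")
```

With σ = 1.5 the share rises as K grows relative to L, sustaining r > g; with σ = 0.7 it falls, which is Rognlie's point.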

Sherlock Holmes inference and econometric testing

23 October, 2014 at 15:10 | Posted in Statistics & Econometrics | Leave a comment

Sherlock Holmes stated that ‘It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.’ True as this may be in the circumstances of a crime investigation, the principle does not apply to testing. In a crime investigation one wants to know what actually happened: who did what, when and how. Testing is somewhat different.

With testing, not only what happened is interesting, but also what could have happened, and what would have happened were the circumstances to repeat themselves. The particular events under study are considered draws from a larger population. It is the distribution of this population one is primarily interested in, and not so much the particular realizations of that distribution. So it is not the particular sequence of heads and tails in coin flipping that is of interest, but whether it says something about the coin being biased or not. Not (only) whether inflation and unemployment went together in the sixties is interesting, but what that tells us about the true trade-off between these two economic variables. In short, one wants to test.

The tested hypothesis has to come from somewhere, and to base it, like Holmes, on data is a valid procedure … The theory should, however, not be tested on the same data it was derived from. To use significance as a selection criterion in a regression equation constitutes a violation of this principle …

Consider for example time series econometrics … It may not be clear a priori which lags matter, while it is clear that some definitely do … The Box-Jenkins framework models the auto-correlation structure of a series as good as possible first, postponing inference to the next stage. In this next stage other variables or their lagged values may be related to the time series under study. While this justifies why time series uses data mining, it leaves unaddressed the issue of the true level of significance …

This is sometimes recommended in a general-to-specific approach, where the most general model is estimated and insignificant variables are subsequently discarded. As superfluous variables increase the variance of estimators, omitting irrelevant variables this way may increase efficiency. The problem is that the variables were included in the first place because they were thought to be (potentially) relevant. If, for example, twenty variables believed to be potentially relevant a priori are included, then one or more is bound to be insignificant (depending on the power, which cannot be trusted to be high). Omitting relevant variables, whether they are insignificant or not, generally biases all other estimates as well, due to the well-known omitted-variable bias. The data are thus used both to specify the model and to test the model; this is the problem of estimation. Without further notice this double use of the data is bound to be misleading if not incorrect. The tautological nature of this procedure is apparent; as significance is the selection criterion, it is not very surprising that selected variables are significant.

D. A. Hollanders, ‘Five Methodological Fallacies in Applied Econometrics’
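The selection problem Hollanders describes is easy to demonstrate with a small Monte Carlo sketch (my own illustration, not from his paper): regress pure-noise outcomes on twenty pure-noise candidate regressors and keep anything with |t| > 1.96. Every variable is irrelevant by construction, so any variable that survives the screen is a false positive — yet with twenty candidates at the 5% level, roughly 1 − 0.95²⁰ ≈ 64% of samples will produce at least one.

```python
# Monte Carlo sketch of significance-based variable selection applied to
# pure noise. All sample sizes and counts are arbitrary choices for
# illustration. Requires numpy.

import numpy as np

def count_selected(n_obs=100, n_vars=20, n_sims=500, seed=0):
    """Fraction of simulated samples in which at least one of n_vars
    truly irrelevant regressors passes a |t| > 1.96 screen in a
    univariate (no-intercept) OLS regression on a noise outcome."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        y = rng.standard_normal(n_obs)               # outcome: pure noise
        X = rng.standard_normal((n_obs, n_vars))     # candidates: pure noise
        for j in range(n_vars):
            x = X[:, j]
            beta = (x @ y) / (x @ x)                 # OLS slope
            resid = y - beta * x
            se = np.sqrt((resid @ resid) / (n_obs - 2) / (x @ x))
            if abs(beta / se) > 1.96:                # "significant" at 5%
                hits += 1
                break
    return hits / n_sims

share = count_selected()
print(f"share of runs yielding a spuriously 'significant' variable: {share:.2f}")
```

The selected variables are significant because significance was the selection criterion — exactly the tautology the quoted passage points to.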
