Serially-correlated shocks and DSGE models (wonkish)

30 September, 2014 at 15:02 | Posted in Economics | 1 Comment

Now it is “dynamic stochastic general equilibrium” (DSGE) models inspired by the Lucas critique that have failed to predict or even explain the Great Recession of 2007–2009. More precisely, the implicit “explanations” based on these models are that the recession, including the millions of net jobs lost, was primarily due to large negative shocks to both technology and willingness to work … So can the reputation of modern macroeconomics be rehabilitated by simply modifying DSGE models to include a few more realistic shocks? …

A simple example helps illustrate for the uninitiated just how DSGE models work and why it should come as little surprise that they are largely inadequate for the task of explaining the Great Recession.

For this simple DSGE model, consider the following technical assumptions: i) an infinitely-lived representative agent with rational expectations and additive utility in current and discounted future log consumption and leisure; ii) a Cobb-Douglas aggregate production function with labor-augmenting technology; iii) capital accumulation with a fixed depreciation rate; and iv) a stochastic process for exogenous technology shocks …

It is worth making two basic points about the setup. First, by construction, technology shocks are the only underlying source of fluctuations in this simple model. Thus, if we were to assume that U.S. real GDP was the literal outcome of this model, we would be assuming a priori that fluctuations in real GDP were ultimately due to technology. When faced with the Great Recession, this model would have no choice but to imply that technology shocks were somehow to blame. Second, despite the underlying role of technology, the observed fluctuations in real GDP can be divided into those that directly reflect the behavior of the exogenous shocks and those that reflect the endogenous capital accumulation in response to these shocks.

To be more precise about these two points, it is necessary to assume a particular process for the exogenous technology shocks. In this case, let’s assume technology follows a random walk with drift [and assuming a 100% depreciation rate of capital]…

So, with this simple DSGE model and for typical measures of the capital share, we have the implication that output growth follows an AR(1) process with an AR coefficient of about one third. This is notable given that such a time-series model does reasonably well as a parsimonious description of quarterly real GDP dynamics for the U.S. economy …

However, the rather absurd assumption of a 100% depreciation rate at the quarterly horizon would surely still have prompted a sharp question or two in a University of Chicago seminar back in the days. So, with this in mind, what happens if we consider the more general case?

Unfortunately, for more realistic depreciation rates, we cannot solve the model analytically. Instead, taking a log-linearization around steady state, we can use standard methods to solve for output growth … This simple DSGE model is able to mimic the apparent AR(1) dynamics in real GDP growth. But it does so by assuming the exogenous technology shocks also follow an AR(1) process with an AR coefficient that happens to be the same as the estimated AR coefficient for output growth. Thus, the magic trick has been revealed: a rabbit was stuffed into the hat and then a rabbit jumped out of the hat …

Despite their increasing sophistication, DSGE models share one key thing in common with their RBC predecessors. After more than two decades of earnest promises to do better in the “future directions” sections of academic papers, they still have those serially-correlated shocks. Thus, the models now “explain” variables like real GDP, inflation, and interest rates as the outcome of more than just serially-correlated technology shocks. They also consider serially-correlated preference shocks and serially-correlated policy shocks …

James Morley

[h/t Brad DeLong & Merijn Knibbe]
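
To make the quoted AR(1) point concrete: in the special case Morley describes (log utility, Cobb-Douglas production, 100% depreciation, random-walk technology), output growth inherits an AR(1) structure with a coefficient equal to the capital share. The following minimal Python sketch is not Morley's own code; the capital share, shock size and sample length are illustrative assumptions. It simply simulates that implied AR(1) process and recovers the coefficient by OLS.

```python
import numpy as np

# Minimal sketch (illustrative, not Morley's code): in the special case described
# above, output growth g_t follows an AR(1) with coefficient equal to the capital
# share alpha. Simulate that process and re-estimate the coefficient.

rng = np.random.default_rng(0)
alpha = 0.33          # capital share ("about one third"), assumed value
sigma = 0.01          # std. dev. of the technology shock, assumed value
T = 10_000            # number of simulated quarters

g = np.zeros(T)       # output growth, in deviations from its mean
eps = rng.normal(0.0, sigma, T)
for t in range(1, T):
    g[t] = alpha * g[t - 1] + eps[t]

# OLS estimate of the AR(1) coefficient: regress g_t on g_{t-1}
ar_hat = np.dot(g[1:], g[:-1]) / np.dot(g[:-1], g[:-1])
print(f"estimated AR(1) coefficient: {ar_hat:.3f} (true value {alpha})")
```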

Neoclassical economics and neoliberalism — two varieties of market fundamentalism

30 September, 2014 at 13:06 | Posted in Economics | Leave a comment

Oxford professor Simon Wren-Lewis had a post up some time ago commenting on the traction-gaining “attacks on mainstream economics”:

One frequent accusation … often repeated by heterodox economists, is that mainstream economics and neoliberal ideas are inextricably linked. Of course economics is used to support neoliberalism. Yet I find mainstream economics full of ideas and analysis that permits a wide ranging and deep critique of these same positions. The idea that the two live and die together is just silly.

Hmmm …

Silly? Maybe Wren-Lewis and other economists who want to enlighten themselves on the subject should take a look at this video:


Or maybe read this essay, where yours truly tries to further analyze — much inspired by the works of Amartya Sen — what kind of philosophical-ideological-economic doctrine neoliberalism is, and why it so often comes naturally for mainstream neoclassical economists to embrace neoliberal ideals.

Or — if you know some Swedish — you could take a look at this book on the connection between the dismal science and neoliberalism (sorry for the shameless self-promotion).

NAIRU — a failed metaphor legitimizing austerity policies

29 September, 2014 at 13:09 | Posted in Economics, Politics & Society | 1 Comment

In our extended NAIRU model, labor productivity growth is included in the wage bargaining process … The logical consequence of this broadening of the theoretical canvas has been that the NAIRU becomes endogenous itself and ceases to be an attractor — Milton Friedman’s natural, stable and timeless equilibrium point from which the system cannot permanently deviate. In our model, a deviation from the initial equilibrium affects not only wages and prices (keeping the rest of the system unchanged) but also demand, technology, workers’ motivation, and work intensity; as a result, productivity growth and ultimately equilibrium unemployment will change. There is in other words, nothing natural or inescapable about equilibrium unemployment, as is Friedman’s presumption, following Wicksell; rather, the NAIRU is a social construct, fluctuating in response to fiscal and monetary policies and labor market interventions. Its ephemeral (rather than structural) nature may explain why the best economists working on the NAIRU have persistently failed to agree on how high the NAIRU actually is and how to estimate it.

Servaas Storm & C. W. M. Naastepad

NAIRU has been the subject of much heated discussion and debate in Sweden lately, after SVT, the Swedish national public TV broadcaster, aired a documentary on NAIRU and the fact that many politicians — and economists — subscribe to the NAIRU story and its policy implication that attempts to promote full employment are doomed to fail, since governments and central banks can’t push unemployment below the critical NAIRU threshold without causing harmful runaway inflation.

One of the main problems with NAIRU is that it essentially is a timeless long-run equilibrium attractor to which actual unemployment (allegedly) has to adjust. But if that equilibrium is itself changing — and in ways that depend on the process of getting to the equilibrium — well, then we can’t really be sure what that equilibrium will be without contextualizing unemployment in real historical time. And when we do, we will — as highlighted by Storm and Naastepad — see how seriously wrong we go if we omit demand from the analysis. Demand policy has long-run effects and matters also for structural unemployment — and governments and central banks can’t just look the other way and legitimize their passivity re unemployment by referring to NAIRU.

NAIRU does not hold water simply because it does not exist — and to base economic policy on such a weak theoretical and empirical construct is nothing short of writing out a prescription for self-inflicted economic havoc.

New Keynesianism — a macroeconomic cul-de-sac

28 September, 2014 at 15:55 | Posted in Economics | Leave a comment

Macroeconomic models may be an informative tool for research. But if practitioners of “New Keynesian” macroeconomics do not investigate and make an effort to provide a justification for the credibility of the assumptions on which they erect their building, it will not fulfill its tasks. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics will continue to consider its ultimate argument as a mixture of rather unhelpful metaphors and metaphysics. Maintaining that economics is a science in the “true knowledge” business, I remain a skeptic of the pretences and aspirations of “New Keynesian” macroeconomics. So far, I cannot really see that it has yielded very much in terms of realistic and relevant economic knowledge.

Keynes basically argued that it was inadmissible to project history onto the future. Consequently an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different “variables” was not enough. If they could not get at the causal structure that generated the data, they were not really “identified”. Dynamic stochastic general equilibrium (DSGE) macroeconomists – including “New Keynesians” – have drawn the conclusion that the solution to the problem of unstable relations is to construct models with clear microfoundations where forward-looking optimizing individuals and robust, deep, behavioural parameters are seen to be stable even to changes in economic policies. As yours truly has argued in a couple of posts (e.g. here and here), this, however, is a dead end.

Here we are getting close to the heart of darkness in “New Keynesian” macroeconomics. Where “New Keynesian” economists think that they can rigorously deduce the aggregate effects of (representative) actors with their reductionist microfoundational methodology, they have to turn a blind eye to the emergent properties that characterize all open social systems – including the economic system. The interaction between animal spirits, trust, confidence, institutions etc., cannot be deduced or reduced to a question answerable on the individual level. Macroeconomic structures and phenomena have to be analyzed also on their own terms. And although one may easily agree with e.g. Paul Krugman’s emphasis on simple models, the simplifications used may have to be simplifications adequate for macroeconomics and not those adequate for microeconomics.

In microeconomics we know that aggregation really presupposes homothetic and identical preferences, something that almost never exists in real economies. The results given by these assumptions are therefore not robust and do not capture the underlying mechanisms at work in any real economy. And models that are critically based on particular and odd assumptions – and are neither robust nor congruent with real-world economies – are of questionable value.

Even if economies naturally presuppose individuals, it does not follow that we can infer or explain macroeconomic phenomena solely from knowledge of these individuals. Macroeconomics is to a large extent emergent and cannot be reduced to a simple summation of micro-phenomena. Moreover, even these microfoundations aren’t immutable. The “deep parameters” of “New Keynesian” DSGE models – “tastes” and “technology” – are not really the bedrock of constancy that they believe (pretend) them to be.

So I cannot concur with Paul Krugman, Mike Woodford, Greg Mankiw and other sorta-kinda “New Keynesians” when they more or less try to reduce Keynesian economics to “intertemporal maximization modified with sticky prices and a few other deviations”. As John Quiggin so aptly writes:

If there is one thing that distinguished Keynes’ economic analysis from that of his predecessors, it was his rejection of the idea of a unique full employment equilibrium to which a market economy will automatically return when it experiences a shock. Keynes argued that an economy could shift from a full-employment equilibrium to a persistent slump as the result of the interaction between objective macroeconomic variables and the subjective ‘animal spirits’ of investors and other decision-makers. It is this perspective that has been lost in the absorption of New Keynesian macro into the DSGE framework.

Reforming economics curriculum

27 September, 2014 at 20:32 | Posted in Economics | Leave a comment

When the global economy crashed in 2008, the list of culprits was long, including dozy regulators, greedy bankers and feckless subprime borrowers. Now the dismal science itself is in the dock, with much soul-searching over why economists failed to predict the financial crisis. One of the outcomes of this debate is that economics students are demanding the reform of a curriculum they think sustains a selfish strain of capitalism and is dominated by abstract mathematics. It looks like the students will get their way. A new curriculum, designed at the University of Oxford, is being tried out. This is good news. …

The typical economics course starts with the study of how rational agents interact in frictionless markets, producing an outcome that is best for everyone. Only later does it cover those wrinkles and perversities that characterise real economic behaviour, such as anti-competitive practices or unstable financial markets. As students advance, there is a growing bias towards mathematical elegance. When the uglier real world intrudes, it only prompts the question: this is all very well in practice but how does it work in theory? …

Fortunately, the steps needed to bring economics teaching into the real world do not require the invention of anything new or exotic. The curriculum should embrace economic history and pay more attention to unorthodox thinkers such as Joseph Schumpeter, Friedrich Hayek and – yes – even Karl Marx. Faculties need to restore links with other fields such as psychology and anthropology, whose insights can explain phenomena that economics cannot. Economics professors should make the study of imperfect competition – and of how people act in conditions of uncertainty – the starting point of courses, not an afterthought. …

Economics should not be taught as if it were about the discovery of timeless laws. Those who champion the discipline must remember that, at its core, it is about human behaviour, with all the messiness and disorder that this implies.

Financial Times

Borel’s law and the infinite monkey theorem (wonkish)

27 September, 2014 at 10:36 | Posted in Statistics & Econometrics | 2 Comments

Back in 1943, eminent French mathematician Émile Borel published a book titled Les probabilités et la vie, in which he introduced what has been called Borel’s law: “Events with a sufficiently small probability never occur.”

Borel’s law has also been called the infinite monkey theorem since Borel illustrated his thinking using the classic example with monkeys randomly hitting the keys of a typewriter and by chance producing the complete works of Shakespeare:

Such is the sort of event which, though its impossibility may not be rationally demonstrable, is, however, so unlikely that no sensible person will hesitate to declare it actually impossible. If someone affirms having observed such an event we would be sure that he is deceiving us or has himself been the victim of fraud.


Wikipedia gives the historical background and a proof of the theorem:

Variants of the theorem include multiple and even infinitely many typists, and the target text varies between an entire library and a single sentence. The history of these statements can be traced back to Aristotle’s On Generation and Corruption and Cicero’s De natura deorum (On the Nature of the Gods), through Blaise Pascal and Jonathan Swift, and finally to modern statements with their iconic typewriters. In the early 20th century, Émile Borel and Arthur Eddington used the theorem to illustrate the timescales implicit in the foundations of statistical mechanics.

There is a straightforward proof of this theorem. As an introduction, recall that if two events are statistically independent, then the probability of both happening equals the product of the probabilities of each one happening independently. For example, if the chance of rain in Moscow on a particular day in the future is 0.4 and the chance of an earthquake in San Francisco on that same day is 0.00003, then the chance of both happening on that day is 0.4 × 0.00003 = 0.000012, assuming that they are indeed independent.

Suppose the typewriter has 50 keys, and the word to be typed is banana. If the keys are pressed randomly and independently, it means that each key has an equal chance of being pressed. Then, the chance that the first letter typed is ‘b’ is 1/50, and the chance that the second letter typed is ‘a’ is also 1/50, and so on. Therefore, the chance of the first six letters spelling banana is

(1/50) × (1/50) × (1/50) × (1/50) × (1/50) × (1/50) = (1/50)^6 = 1/15,625,000,000 ,

less than one in 15 billion, but not zero, hence a possible outcome.

From the above, the chance of not typing banana in a given block of 6 letters is 1 − (1/50)^6. Because each block is typed independently, the chance X_n of not typing banana in any of the first n blocks of 6 letters is

X_n = (1 − (1/50)^6)^n.

As n grows, X_n gets smaller. For an n of a million, X_n is roughly 0.9999, but for an n of 10 billion X_n is roughly 0.53 and for an n of 100 billion it is roughly 0.0017. As n approaches infinity, the probability X_n approaches zero; that is, by making n large enough, X_n can be made as small as is desired, and the chance of typing banana approaches 100%.

The same argument shows why at least one of infinitely many monkeys will produce a text as quickly as it would be produced by a perfectly accurate human typist copying it from the original. In this case X_n = (1 − (1/50)^6)^n, where X_n represents the probability that none of the first n monkeys types banana correctly on their first try. When we consider 100 billion monkeys, the probability falls to 0.17%, and as the number of monkeys n increases, the value of X_n – the probability of the monkeys failing to reproduce the given text – approaches zero arbitrarily closely. The limit, for n going to infinity, is zero.

However, for physically meaningful numbers of monkeys typing for physically meaningful lengths of time the results are reversed. If there are as many monkeys as there are particles in the observable universe (10^80), and each types 1,000 keystrokes per second for 100 times the life of the universe (10^20 seconds), the probability of the monkeys replicating even a short book is nearly zero.

Wikipedia
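
The figures in the quoted passage are easy to check. Here is a minimal Python sketch (my own, purely illustrative) that computes the single-block probability of typing “banana” on a 50-key typewriter and the probability X_n that the word never appears in n independent blocks:

```python
import math

# Minimal sketch reproducing the numbers in the quoted Wikipedia passage.
# p: probability that one random block of 6 keystrokes (50 keys) spells "banana".
p = (1 / 50) ** 6
print(f"P(banana in one block) = {p:.3e}")   # ~6.4e-11, i.e. 1 in 15,625,000,000

# X_n: probability that none of the first n independent blocks spells "banana".
def x_n(n: int) -> float:
    return (1 - p) ** n

for n in (10**6, 10**10, 10**11):
    print(f"n = {n:.0e}:  X_n = {x_n(n):.4f}")
# prints roughly 0.9999, 0.5273 and 0.0017, matching the figures in the quote
```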

For more on Borel’s law and the fact that — still — incredibly unlikely things keep happening, see David Hand’s The Improbability Principle (Bantam Press, 2014).

Via con me

26 September, 2014 at 20:35 | Posted in Varia | Leave a comment

 

Senza una donna

26 September, 2014 at 17:48 | Posted in Varia | Leave a comment

 

INET — unabated faith in mathematical modelling

26 September, 2014 at 11:15 | Posted in Economics | 1 Comment

In the end, very few INET participants engage in a methodological critique that challenges the emphasis on modelling. One exception comes from Tony Lawson, participating at the opening conference in 2010, who is well known for his critique of the dominant economic methodology … Lawson makes an explicit link between the failure of economists to offer insights into the crisis, on the one hand, and the dominant economic methodology, on the other. In particular he points to an excessive preoccupation with mathematical modelling. Lawson’s comments below capture the intellectual tendency characterising INET events so far:

“Very many economists attended the conference, all apparently concerned critically to reconsider the nature of academic economics. It is in such a forum if anywhere that we might hope to find mainstream economists challenging all but the most obviously acceptable aspects of their theories, approaches and activities.

Although George Soros, who sponsors the Institute, shows some awareness that the reliance upon mathematics may at least be something to question … for most of his close associates the idea that there might be something problematic about the emphasis on forms of mathematical technique does not appear even to cross their minds …”

Thus, we find, to round off this section, that although INET is quite explicit about its concern with the state of economics, as well as about its search for alternatives, its overall orientation in the end (or so far) is not on a reduction in the emphasis on mathematical modelling. As things currently stand, the forum continues to show faith in the dominant economic methodological paradigm. …

Overall, we find that despite appearances, many economists across the board have tended to reaffirm their position. They do so primarily by a methodological critique that consists in advocating the development of newer, better mathematical models that this time, allegedly, achieve greater realisticness (i.e. achieve a closer match to reality), promising a greater ability to successfully predict. Representatively, Krugman adopts such a position. …

The question of whether mathematical tools are appropriate is something that, in the circumstances, we might have expected to receive significant attention. But this is not what we have found. Our study suggests rather that, even when recognising their discipline is in crisis, economists continue to take existing methodology as an unquestionable (sacrosanct) given.

Vinca Bigo & Iona Negru

The missing link in Keynes’s General Theory

26 September, 2014 at 08:32 | Posted in Economics | Leave a comment

The cyclical succession of system states is not always clearly presented in The General Theory. In fact there are two distinct views of the business cycle, one a moderate cycle which can perhaps be identified with a dampened accelerator-multiplier cycle and the second a vigorous ‘boom and bust’ cycle … The business cycle in chapter 18 does not exhibit booms or crises …

In chapters 12 and 22, in the rebuttal to Viner, and in remarks throughout The General Theory, a vigorous cycle, which does have booms and crises, is described. However, nowhere in The General Theory or in Keynes’s few post-General Theory articles explicating his new theory are the boom and the crisis adequately defined or explained. The financial developments during a boom that make a crisis likely, if not inevitable, are hinted at but not thoroughly examined. This is the logical hole, the missing link, in The General Theory as it was left by Keynes in 1937 after his rebuttal to Viner … In order to appreciate the full potential of The General Theory as a guide to interpretation and understanding of modern capitalism, we must fill out what Keynes discussed in a fragmentary and casual manner.
