Serially-correlated shocks and DSGE models (wonkish)

30 September, 2014 at 15:02 | Posted in Economics | 1 Comment

Now it is “dynamic stochastic general equilibrium” (DSGE) models inspired by the Lucas critique that have failed to predict or even explain the Great Recession of 2007–2009. More precisely, the implicit “explanations” based on these models are that the recession, including the millions of net jobs lost, was primarily due to large negative shocks to both technology and willingness to work … So can the reputation of modern macroeconomics be rehabilitated by simply modifying DSGE models to include a few more realistic shocks? …

A simple example helps illustrate for the uninitiated just how DSGE models work and why it should come as little surprise that they are largely inadequate for the task of explaining the Great Recession.

For this simple DSGE model, consider the following technical assumptions: i) an infinitely-lived representative agent with rational expectations and additive utility in current and discounted future log consumption and leisure; ii) a Cobb-Douglas aggregate production function with labor-augmenting technology; iii) capital accumulation with a fixed depreciation rate; and iv) a stochastic process for exogenous technology shocks …

It is worth making two basic points about the setup. First, by construction, technology shocks are the only underlying source of fluctuations in this simple model. Thus, if we were to assume that U.S. real GDP was the literal outcome of this model, we would be assuming a priori that fluctuations in real GDP were ultimately due to technology. When faced with the Great Recession, this model would have no choice but to imply that technology shocks were somehow to blame. Second, despite the underlying role of technology, the observed fluctuations in real GDP can be divided into those that directly reflect the behavior of the exogenous shocks and those that reflect the endogenous capital accumulation in response to these shocks.

To be more precise about these two points, it is necessary to assume a particular process for the exogenous technology shocks. In this case, let’s assume technology follows a random walk with drift [and assuming a 100% depreciation rate of capital]…

So, with this simple DSGE model and for typical measures of the capital share, we have the implication that output growth follows an AR(1) process with an AR coefficient of about one third. This is notable given that such a time-series model does reasonably well as a parsimonious description of quarterly real GDP dynamics for the U.S. economy …
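This implication is easy to check numerically. The sketch below (my own illustration in Python, not Morley's code) uses the known closed-form solution of the planner's problem in this special case: with log utility, Cobb-Douglas production, and 100% depreciation, hours worked are constant and capital follows K(t+1) = αβY(t), so output growth should come out as an AR(1) with coefficient equal to the capital share α of about one third:

```python
import numpy as np

rng = np.random.default_rng(0)

alpha, beta = 0.33, 0.99   # capital share, discount factor
mu, sigma = 0.004, 0.01    # drift and s.d. of technology shocks
T = 200_000

# Random-walk-with-drift (log) technology
eps = rng.normal(0.0, sigma, T)
log_a = np.cumsum(mu + eps)

# With log utility, Cobb-Douglas output, and 100% depreciation, the
# planner's solution is K_{t+1} = alpha*beta*Y_t, which implies
#   log Y_{t+1} = log A_{t+1} + alpha*log(alpha*beta) + alpha*log Y_t
log_y = np.zeros(T)
for t in range(1, T):
    log_y[t] = log_a[t] + alpha * (np.log(alpha * beta) + log_y[t - 1])

# Output growth and its OLS AR(1) coefficient
g = np.diff(log_y)
x, y = g[:-1] - g.mean(), g[1:] - g.mean()
ar1 = (x @ y) / (x @ x)
print(round(ar1, 2))   # close to alpha, i.e. about one third
```

Since the growth equation reduces to g(t+1) = μ + α·g(t) + ε(t+1), the estimated coefficient recovers the capital share, matching Morley's "about one third."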

However, the rather absurd assumption of a 100% depreciation rate at the quarterly horizon would surely still have prompted a sharp question or two in a University of Chicago seminar back in the days. So, with this in mind, what happens if we consider the more general case?

Unfortunately, for more realistic depreciation rates, we cannot solve the model analytically. Instead, taking a log-linearization around steady state, we can use standard methods to solve for output growth … This simple DSGE model is able to mimic the apparent AR(1) dynamics in real GDP growth. But it does so by assuming the exogenous technology shocks also follow an AR(1) process with an AR coefficient that happens to be the same as the estimated AR coefficient for output growth. Thus, the magic trick has been revealed: a rabbit was stuffed into the hat and then a rabbit jumped out of the hat …

Despite their increasing sophistication, DSGE models share one key thing in common with their RBC predecessors. After more than two decades of earnest promises to do better in the “future directions” sections of academic papers, they still have those serially-correlated shocks. Thus, the models now “explain” variables like real GDP, inflation, and interest rates as the outcome of more than just serially-correlated technology shocks. They also consider serially-correlated preference shocks and serially-correlated policy shocks …

James Morley

[h/t Brad DeLong & Merijn Knibbe]

Neoclassical economics and neoliberalism — two varieties of market fundamentalism

30 September, 2014 at 13:06 | Posted in Economics | Comments Off on Neoclassical economics and neoliberalism — two varieties of market fundamentalism

Oxford professor Simon Wren-Lewis had a post up some time ago commenting on the traction-gaining “attacks on mainstream economics”:

One frequent accusation … often repeated by heterodox economists, is that mainstream economics and neoliberal ideas are inextricably linked. Of course economics is used to support neoliberalism. Yet I find mainstream economics full of ideas and analysis that permits a wide ranging and deep critique of these same positions. The idea that the two live and die together is just silly.

Hmmm …

Silly? Maybe Wren-Lewis and other economists who want to enlighten themselves on the subject should take a look at this video:

Or maybe read this essay, where yours truly tries to further analyze — much inspired by the works of Amartya Sen — what kind of philosophical-ideological-economic doctrine neoliberalism is, and why it so often comes naturally for mainstream neoclassical economists to embrace neoliberal ideals.

Or — if you know some Swedish — you could take a look in this book on the connection between the dismal science and neoliberalism (sorry for shameless self-promotion).

NAIRU — a failed metaphor legitimizing austerity policies

29 September, 2014 at 13:09 | Posted in Economics, Politics & Society | 1 Comment

In our extended NAIRU model, labor productivity growth is included in the wage bargaining process … The logical consequence of this broadening of the theoretical canvas has been that the NAIRU becomes endogenous itself and ceases to be an attractor — Milton Friedman’s natural, stable and timeless equilibrium point from which the system cannot permanently deviate. In our model, a deviation from the initial equilibrium affects not only wages and prices (keeping the rest of the system unchanged) but also demand, technology, workers’ motivation, and work intensity; as a result, productivity growth and ultimately equilibrium unemployment will change. There is in other words, nothing natural or inescapable about equilibrium unemployment, as is Friedman’s presumption, following Wicksell; rather, the NAIRU is a social construct, fluctuating in response to fiscal and monetary policies and labor market interventions. Its ephemeral (rather than structural) nature may explain why the best economists working on the NAIRU have persistently failed to agree on how high the NAIRU actually is and how to estimate it.

Servaas Storm & C. W. M. Naastepad

NAIRU has been the subject of much heated discussion and debate in Sweden lately, after SVT, the Swedish national public TV broadcaster, aired a documentary on NAIRU and the fact that many politicians — and economists — subscribe to the NAIRU story and its policy implication that attempts to promote full employment are doomed to fail, since governments and central banks can’t push unemployment below the critical NAIRU threshold without causing harmful runaway inflation.

One of the main problems with NAIRU is that it essentially is a timeless long-run equilibrium attractor to which actual unemployment (allegedly) has to adjust. But if that equilibrium is itself changing — and in ways that depend on the process of getting to the equilibrium — well, then we can’t really be sure what that equilibrium will be without contextualizing unemployment in real historical time. And when we do, we will — as highlighted by Storm and Naastepad — see how seriously wrong we go if we omit demand from the analysis. Demand policy has long-run effects and matters also for structural unemployment — and governments and central banks can’t just look the other way and legitimize their passivity re unemployment by referring to NAIRU.
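The endogeneity point can be made concrete with a toy hysteresis sketch (my own stylized illustration, not Storm and Naastepad's model): let the "equilibrium" rate slowly adapt toward actual unemployment while actual unemployment reverts toward the equilibrium. A temporary demand slump then leaves the attractor itself permanently higher:

```python
# Stylized hysteresis sketch (an illustration, not Storm & Naastepad's model).
# The "equilibrium" rate u_star slowly adapts toward actual unemployment u,
# while u reverts toward u_star; a temporary demand slump therefore drags
# the attractor itself upward, and it never returns to its old level.
lam = 0.05            # speed at which u_star follows actual unemployment
u = u_star = 5.0      # start at a 5% "equilibrium"
path = []
for t in range(120):                              # 30 years of quarters
    slump = 3.0 if 20 <= t < 28 else 0.0          # a two-year demand slump
    u = u_star + slump + 0.5 * (u - u_star)       # u reverts to u_star ...
    u_star += lam * (u - u_star)                  # ... but u_star follows u
    path.append((u, u_star))

print(round(path[-1][1], 2))   # u_star ends well above its initial 5.0
```

Long after the slump has ended, actual unemployment has converged back to "equilibrium," but to a higher one than before the shock, so the measured NAIRU is partly a record of past demand conditions.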

NAIRU does not hold water simply because it does not exist — and to base economic policy on such a weak theoretical and empirical construct is nothing short of writing out a prescription for self-inflicted economic havoc.

New Keynesianism — a macroeconomic cul-de-sac

28 September, 2014 at 15:55 | Posted in Economics | Comments Off on New Keynesianism — a macroeconomic cul-de-sac

Macroeconomic models may be an informative tool for research. But if practitioners of “New Keynesian” macroeconomics do not investigate and make an effort of providing a justification for the credibility of the assumptions on which they erect their building, it will not fulfill its tasks. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics will continue to consider its ultimate argument as a mixture of rather unhelpful metaphors and metaphysics. Maintaining that economics is a science in the “true knowledge” business, I remain a skeptic of the pretences and aspirations of “New Keynesian” macroeconomics. So far, I cannot really see that it has yielded very much in terms of realistic and relevant economic knowledge.

Keynes basically argued that it was inadmissible to project history on the future. Consequently an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different “variables” was not enough. If they could not get at the causal structure that generated the data, they were not really “identified”. Dynamic stochastic general equilibrium (DSGE) macroeconomists – including “New Keynesians” – have drawn the conclusion that the solution to the problem of unstable relations is to construct models with clear microfoundations, where forward-looking optimizing individuals and robust, deep, behavioural parameters are seen to be stable even to changes in economic policies. As yours truly has argued in a couple of posts (e.g. here and here), this, however, is a dead end.

Here we are getting close to the heart of darkness in “New Keynesian” macroeconomics. Believing that they can rigorously deduce the aggregate effects of (representative) actors with their reductionist microfoundational methodology, “New Keynesian” economists have to turn a blind eye to the emergent properties that characterize all open social systems – including the economic system. The interaction between animal spirits, trust, confidence, institutions etc., cannot be deduced or reduced to a question answerable on the individual level. Macroeconomic structures and phenomena have to be analyzed also on their own terms. And although one may easily agree with e.g. Paul Krugman’s emphasis on simple models, the simplifications used may have to be simplifications adequate for macroeconomics, not those adequate for microeconomics.

In microeconomics we know that aggregation really presupposes homothetic and identical preferences, something that almost never exists in real economies. The results given by these assumptions are therefore not robust and do not capture the underlying mechanisms at work in any real economy. And models that are critically based on particular and odd assumptions – and are neither robust nor congruent to real-world economies – are of questionable value.

Even if economies naturally presuppose individuals, it does not follow that we can infer or explain macroeconomic phenomena solely from knowledge of these individuals. Macroeconomics is to a large extent emergent and cannot be reduced to a simple summation of micro-phenomena. Moreover, even these microfoundations aren’t immutable. The “deep parameters” of “New Keynesian” DSGE models – “tastes” and “technology” – are not really the bedrock of constancy that they believe (pretend) them to be.

So I cannot concur with Paul Krugman, Mike Woodford, Greg Mankiw and other sorta-kinda “New Keynesians” when they more or less try to reduce Keynesian economics to “intertemporal maximization modified with sticky prices and a few other deviations”. As John Quiggin so aptly writes:

If there is one thing that distinguished Keynes’ economic analysis from that of his predecessors, it was his rejection of the idea of a unique full employment equilibrium to which a market economy will automatically return when it experiences a shock. Keynes argued that an economy could shift from a full-employment equilibrium to a persistent slump as the result of the interaction between objective macroeconomic variables and the subjective ‘animal spirits’ of investors and other decision-makers. It is this perspective that has been lost in the absorption of New Keynesian macro into the DSGE framework.

Reforming economics curriculum

27 September, 2014 at 20:32 | Posted in Economics | Comments Off on Reforming economics curriculum

When the global economy crashed in 2008, the list of culprits was long, including dozy regulators, greedy bankers and feckless subprime borrowers. Now the dismal science itself is in the dock, with much soul-searching over why economists failed to predict the financial crisis. One of the outcomes of this debate is that economics students are demanding the reform of a curriculum they think sustains a selfish strain of capitalism and is dominated by abstract mathematics. It looks like the students will get their way. A new curriculum, designed at the University of Oxford, is being tried out. This is good news. …

heilThe typical economics course starts with the study of how rational agents interact in frictionless markets, producing an outcome that is best for everyone. Only later does it cover those wrinkles and perversities that characterise real economic behaviour, such as anti-competitive practices or unstable financial markets. As students advance, there is a growing bias towards mathematical elegance. When the uglier real world intrudes, it only prompts the question: this is all very well in practice but how does it work in theory? …

Fortunately, the steps needed to bring economics teaching into the real world do not require the invention of anything new or exotic. The curriculum should embrace economic history and pay more attention to unorthodox thinkers such as Joseph Schumpeter, Friedrich Hayek and – yes – even Karl Marx. Faculties need to restore links with other fields such as psychology and anthropology, whose insights can explain phenomena that economics cannot. Economics professors should make the study of imperfect competition – and of how people act in conditions of uncertainty – the starting point of courses, not an afterthought. …

Economics should not be taught as if it were about the discovery of timeless laws. Those who champion the discipline must remember that, at its core, it is about human behaviour, with all the messiness and disorder that this implies.

Financial Times

Borel’s law and the infinite monkey theorem (wonkish)

27 September, 2014 at 10:36 | Posted in Statistics & Econometrics | 3 Comments

Back in 1943, the eminent French mathematician Émile Borel published a book titled Les probabilités et la vie, in which he introduced what has been called Borel’s law: “Events with a sufficiently small probability never occur.”

Borel’s law has also been called the infinite monkey theorem since Borel illustrated his thinking using the classic example with monkeys randomly hitting the keys of a typewriter and by chance producing the complete works of Shakespeare:

Such is the sort of event which, though its impossibility may not be rationally demonstrable, is, however, so unlikely that no sensible person will hesitate to declare it actually impossible. If someone affirms having observed such an event we would be sure that he is deceiving us or has himself been the victim of fraud.


Wikipedia gives the historical background and a proof of the theorem:

Variants of the theorem include multiple and even infinitely many typists, and the target text varies between an entire library and a single sentence. The history of these statements can be traced back to Aristotle’s On Generation and Corruption and Cicero’s De natura deorum (On the Nature of the Gods), through Blaise Pascal and Jonathan Swift, and finally to modern statements with their iconic typewriters. In the early 20th century, Émile Borel and Arthur Eddington used the theorem to illustrate the timescales implicit in the foundations of statistical mechanics.

There is a straightforward proof of this theorem. As an introduction, recall that if two events are statistically independent, then the probability of both happening equals the product of the probabilities of each one happening independently. For example, if the chance of rain in Moscow on a particular day in the future is 0.4 and the chance of an earthquake in San Francisco on that same day is 0.00003, then the chance of both happening on that day is 0.4 × 0.00003 = 0.000012, assuming that they are indeed independent.

Suppose the typewriter has 50 keys, and the word to be typed is banana. If the keys are pressed randomly and independently, it means that each key has an equal chance of being pressed. Then, the chance that the first letter typed is ‘b’ is 1/50, and the chance that the second letter typed is ‘a’ is also 1/50, and so on. Therefore, the chance of the first six letters spelling banana is

(1/50) × (1/50) × (1/50) × (1/50) × (1/50) × (1/50) = (1/50)^6 = 1/15,625,000,000,

less than one in 15 billion, but not zero, hence a possible outcome.

From the above, the chance of not typing banana in a given block of 6 letters is 1 − (1/50)^6. Because each block is typed independently, the chance Xn of not typing banana in any of the first n blocks of 6 letters is

Xn = (1 − (1/50)^6)^n.

As n grows, Xn gets smaller. For an n of a million, Xn is roughly 0.9999, but for an n of 10 billion Xn is roughly 0.53 and for an n of 100 billion it is roughly 0.0017. As n approaches infinity, the probability Xn approaches zero; that is, by making n large enough, Xn can be made as small as is desired, and the chance of typing banana approaches 100%.

The same argument shows why at least one of infinitely many monkeys will produce a text as quickly as it would be produced by a perfectly accurate human typist copying it from the original. In this case Xn = (1 − (1/50)^6)^n where Xn represents the probability that none of the first n monkeys types banana correctly on their first try. When we consider 100 billion monkeys, the probability falls to 0.17%, and as the number of monkeys n increases, the value of Xn – the probability of the monkeys failing to reproduce the given text – approaches zero arbitrarily closely. The limit, for n going to infinity, is zero.
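The quoted arithmetic is easy to reproduce; a quick check in Python:

```python
# Probability of typing "banana" in one 6-key block on a 50-key typewriter,
# and the chance X_n that all of the first n blocks fail to spell it.
p = (1 / 50) ** 6
print(p)   # about 6.4e-11, i.e. less than one in 15 billion

for n in (10**6, 10**10, 10**11):
    x_n = (1 - p) ** n
    print(f"n = {n:>15,}: X_n = {x_n:.4f}")
# Matches the quote: roughly 0.9999, 0.53 and 0.0017 respectively.
```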

However, for physically meaningful numbers of monkeys typing for physically meaningful lengths of time the results are reversed. If there are as many monkeys as there are particles in the observable universe (10^80), and each types 1,000 keystrokes per second for 100 times the life of the universe (10^20 seconds), the probability of the monkeys replicating even a short book is nearly zero.
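Working in logarithms makes the reversal at physically meaningful scales concrete. A back-of-the-envelope sketch using the quote's numbers, plus an assumed "short book" length of 100,000 characters (my choice for illustration):

```python
import math

monkeys = 80          # log10 of 1e80 monkeys (particles in the universe)
rate = 3              # log10 of 1,000 keystrokes per second
seconds = 20          # log10 of 1e20 seconds (100x the life of the universe)
log10_attempts = monkeys + rate + seconds      # 10^103 keystrokes in all

book_chars = 100_000                           # assumed "short book" length
log10_p = -book_chars * math.log10(50)         # log10 of one attempt succeeding

# log10 of the expected number of successful reproductions:
print(log10_attempts + log10_p)   # about -169794: essentially zero chance
```

Even with every particle in the universe typing for 10^20 seconds, the expected number of reproductions is around 10^-169794, which is why the theorem flips once the numbers are physical rather than infinite.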


For more on Borel’s law and the fact that — still — incredibly unlikely things keep happening, see David Hand’s The Improbability Principle (Bantam Press, 2014).

Via con me

26 September, 2014 at 20:35 | Posted in Varia | Comments Off on Via con me


Senza una donna

26 September, 2014 at 17:48 | Posted in Varia | Comments Off on Senza una donna


INET — unabated faith in mathematical modelling

26 September, 2014 at 11:15 | Posted in Economics | 1 Comment

In the end, very few INET participants engage in a methodological critique that challenges the emphasis on modelling. One exception comes from Tony Lawson, participating at the opening conference in 2010, who is well known for his critique of the dominant economic methodology … Lawson makes an explicit link between the failure of economists to offer insights into the crisis, on the one hand, and the dominant economic methodology, on the other. In particular he points to an excessive preoccupation with mathematical modelling. Lawson’s comments below capture the intellectual tendency characterising INET events so far:

“Very many economists attended the conference, all apparently concerned critically to reconsider the nature of academic economics. It is in such a forum if anywhere that we might hope to find mainstream economists challenging all but the most obviously acceptable aspects of their theories, approaches and activities.

Although George Soros, who sponsors the Institute, shows some awareness that the reliance upon mathematics may at least be something to question … for most of his close associates the idea that there might be something problematic about the emphasis on forms of mathematical technique does not appear even to cross their minds …”

Thus, we find, to round off this section, that although INET is quite explicit about its concern with the state of economics, as well as about its search for alternatives, its overall orientation in the end (or so far) is not on a reduction in the emphasis on mathematical modelling. As things currently stand, the forum continues to show faith in the dominant economic methodological paradigm. …

Overall, we find that despite appearances, many economists across the board have tended to reaffirm their position. They do so primarily by a methodological critique that consists in advocating the development of newer, better mathematical models that this time, allegedly, achieve greater realisticness (i.e. achieve a closer match to reality), promising a greater ability to successfully predict. Representatively, Krugman adopts such a position. …

The question of whether mathematical tools are appropriate is something that, in the circumstances, we might have expected to receive significant attention. But this is not what we have found. Our study suggests rather that, even when recognising their discipline is in crisis, economists continue to take existing methodology as an unquestionable (sacrosanct) given.

Vinca Bigo & Ioana Negru

The missing link in Keynes’s General Theory

26 September, 2014 at 08:32 | Posted in Economics | Comments Off on The missing link in Keynes’s General Theory

The cyclical succession of system states is not always clearly presented in The General Theory. In fact there are two distinct views of the business cycle, one a moderate cycle which can perhaps be identified with a dampened accelerator-multiplier cycle and the second a vigorous ‘boom and bust’ cycle … The business cycle in chapter 18 does not exhibit booms or crises …

In chapters 12 and 22, in the rebuttal to Viner, and in remarks throughout The General Theory, a vigorous cycle, which does have booms and crises, is described. However, nowhere in The General Theory or in Keynes’s few post-General Theory articles explicating his new theory are the boom and the crisis adequately defined or explained. The financial developments during a boom that make a crisis likely, if not inevitable, are hinted at but not thoroughly examined. This is the logical hole, the missing link, in The General Theory as it was left by Keynes in 1937 after his rebuttal to Viner … In order to appreciate the full potential of The General Theory as a guide to interpretation and understanding of modern capitalism, we must fill out what Keynes discussed in a fragmentary and casual manner.

Further reasons to reject NAIRU lock, stock, and barrel

25 September, 2014 at 15:19 | Posted in Economics | Comments Off on Further reasons to reject NAIRU lock, stock, and barrel

This paper has reasserted the Post Keynesian view that unemployment is essentially driven by private investment behaviour. There is a feedback from the labour market via price and wage inflation to the goods market, but it is weak. Without government policy the goods market reactions may even be perverse and, as we are presently reminded, the scope of monetary policy is limited in times of financial crises and in times of deflation. Second, the labour market itself is more adaptive than commonly assumed. The NAIRU is endogenous due to the supply-side effects of capital accumulation and the importance of social norms in wage setting. Thus, there is a well defined NAIRU that determines wage and price inflation (in conjunction with actual unemployment) in the short term, but it is endogenous and changes along with actual unemployment in the medium term. …

While monetary policy exerts some impact on investment decisions, there may be other reasons for private investment to fall below the level necessary for full employment. Keynes himself had famously argued that it is mostly driven by animal spirits, which leaves the economic analyst in the dark as to what actually drives them. To some extent these animal spirits will depend on specific institutional structures and the degree of uncertainty regarding the future evolution of important macroeconomic variables … or corporate governance structures; but overall it is fair to say that investment expenditures cannot be easily reduced to underlying variables.

Our analysis has important policy implications. Rather than regarding the role of the state as having to provide conditions (in the labour market) as close as possible to perfect markets, our analysis highlights the role of the state as a mediator of social conflict and as a stabiliser of economic activity. If the private sector is prone to long-lasting swings in economic activity (due to changes in animal spirits or the aftermath of financial crises) and the NAIRU is endogenous, maintaining employment at a high level in the short run is crucial. To that end monetary policy will in general not be sufficient and an active (counter-cyclical) fiscal policy is needed. Finally, wage policy is crucial in terms of controlling inflation as well as in terms of stabilizing income distribution. Wage flexibility will not cure unemployment … Fiscal policy is the main tool of short-run stabilization and wages policy aims at wages growth in line with labour productivity.

Engelbert Stockhammer

‘Natural rate of unemployment’ — a fatal fallacy

25 September, 2014 at 08:48 | Posted in Economics | Comments Off on ‘Natural rate of unemployment’ — a fatal fallacy

It is thought necessary to keep unemployment at a “non-inflation-accelerating” level (“NIARU”) in the range of 4% to 6% if inflation is to be kept from increasing unacceptably. …

The underlying assumption that there is an exogenous NIARU imposing an unavoidable constraint on macroeconomic possibilities is open to serious question on both historical and analytical grounds. Historically, the U.S. enjoyed an unemployment rate of 1.8% for 1926 as a whole with the price level falling, if anything. West Germany enjoyed an unemployment rate of around 0.6% over the several years around 1960, and most developed countries have enjoyed episodes of unemployment under 2% without serious inflation. Thus a NIARU, if it exists at all, must be regarded as highly variable over time and place. It is not clear that estimates of the NIARU have not been contaminated by failure to allow for a possible impact of inflation on employment as well as the impact of unemployment on inflation. A Marxist interpretation of the insistence on a NIARU might be as a stalking horse to enlist the fear of inflation to justify the maintenance of a “reserve army of the unemployed,” allegedly to keep wages from initiating a “wage-price spiral.” One never hears of a “rent-price spiral”, or an “interest-price spiral,” though these costs are also to be considered in the setting of prices. Indeed when the FRB raises interest rates in an attempt to ward off inflation, the increase in interest costs to merchants may well trigger a small price increase. …

Indeed, if we are to control three major macroeconomic dimensions of the economy, namely the inflation rate, the unemployment rate, and the growth rate, a third control is needed that will be reasonably non-collinear in its effects to those of a fiscal policy operating through disposable income generation on the one hand, and monetary policy operating through interest rates on the other.

What may be needed is a method of directly controlling inflation that does not interfere with free market adjustments in relative prices or rely on unemployment to keep inflation in check. Without such a control, unanticipated changes in the rate of inflation, either up or down, will continue to plague the economy and make planning for investment difficult. Trying to control an economy in three major macroeconomic dimensions with only two instruments is like trying to fly an airplane with elevator and rudder but no ailerons; in calm weather and with sufficient dihedral one can manage if turns are made very gingerly, but trying to land in a cross-wind is likely to produce a crash. …

It is important to keep in mind that divergences in the rate of inflation either up or down, from what was previously expected, produce merely an arbitrary redistribution of a given total product, equivalent at worst to legitimized embezzlement, unless indeed these unpredictable variations are so extreme and rapid as to destroy the usefulness of currency as a means of exchange. Unemployment, on the other hand, reduces the total product to be distributed; it is at best equivalent to vandalism, and when it contributes to crime it becomes the equivalent of homicidal arson. In the U.S. the widespread availability of automatic teller machines in supermarkets and elsewhere would make the “shoe-leather cost” of a high but predictable inflation rate quite negligible.

William Vickrey

[h/t Jan Milch]

Ditch the NAIRU!

24 September, 2014 at 21:41 | Posted in Economics | 5 Comments

The most important implication of [the conventional NAIRU equation], however, is that there is no role whatsoever for demand factors in determining equilibrium unemployment. Any attempts by fiscal or monetary policy to permanently move (actual) unemployment away from its equilibrium level u* are doomed to failure. Policy may succeed in temporarily lowering unemployment, thus causing inflation, which in turn will undermine demand and raise unemployment until the equilibrium or “natural” rate of unemployment is reached again.


Demand will adjust itself to the “natural” level of output, corresponding to the rate of equilibrium unemployment, either passively through the so-called real balance effect or, alternatively, more actively through a policy-administered rise in interest rates; in the latter case, actual unemployment is determined by how large the central bank thinks the NAIRU is. The implication of [the conventional NAIRU equation] is that employment policy should focus exclusively on the labor market (and not on aggregate demand and investment), and above all on the behavior of labor unions and (mostly welfare state-related) wage–push factors. The policy recommendations are straightforward: to reduce unemployment, labor markets have to be deregulated; employment protection, labor taxes, and unemployment benefits have to be reduced; wage bargaining has to be decentralized; and welfare states have to be scaled down … However, although the view that labor market regulation explains OECD unemployment has become widely accepted, particularly in policy circles, it is by no means universally accepted. Serious problems remain …

Even authors working within the orthodox NAIRU approach are unable to explain (changes in long-run) unemployment in terms of only “excessive” labor market regulation. To explain (changes in) u*, most empirical studies consider it necessary to include other, additional “factors which might explain short-run deviations of unemployment from its equilibrium level” … the most important of which are aggregate demand shocks (i.e., import price and real interest rate shocks) and productivity shocks. The inclusion of such “shocks” is not an innocent amendment, because it turns out that a significant part of the OECD unemployment increase during the past three decades must be attributed to these shocks … This is obviously a dissatisfactory state of affairs: in the theoretical analysis, the impact of demand factors on equilibrium unemployment is defined away, but in the empirical analysis it has to be brought back in, not as a structural determinant but rather as an exogenous shock. We argue that this incongruence points to a misspecification of the NAIRU model.

Servaas Storm & C. W. M. Naastepad

Macroeconomics beyond NAIRU

24 September, 2014 at 08:16 | Posted in Economics | Comments Off on Macroeconomics beyond NAIRU


Highly paid labour is generally efficient and therefore not dear labour; a fact which, though it is more full of hope for the future of the human race than any other that is known to us, will be found to exercise a very complicating influence on the theory of distribution.

Alfred Marshall


Why be consistent?

23 September, 2014 at 08:16 | Posted in Economics | 1 Comment

Axioms of ‘internal consistency’ of choice, such as the weak and the strong axioms of revealed preference … are often used in decision theory, micro-economics, game theory, social choice theory, and in related disciplines …

Paul Samuelson’s (1938) justly famous foundational contribution to revealed preference theory … can be interpreted in several different ways. One interpretation that has received much attention in the subsequent literature (and has had a profound impact on the direction of economic research) is the program of developing a theory of behavior “freed from any vestigial traces of the utility concept” (Samuelson (1938, p. 71)). While this was not in line with John Hicks’s earlier works, particularly his Value and Capital (Hicks (1939)), which began with the priority of the concept of preference or utility, Hicks too became persuaded by the alleged superiority of the new approach …

This paper argues against this influential approach to choice and behavior, and indicates the inescapable need to go beyond the internal features of a choice function to understand its cogency and consistency …

At the foundational level, the basic difficulty arises from the implicit presumption underlying that approach that acts of choice are, on their own, like statements which can contradict, or be consistent with, each other. That diagnosis is deeply problematic …

Can a set of choices really be seen as consistent or inconsistent on purely internal grounds, without bringing in something external to choice, such as the underlying objectives or values that are pursued or acknowledged by choice? …

The presumption of inconsistency may be easily disputed, depending on the context, if we know a bit more about what the person is trying to do. Suppose the person faces a choice at a dinner table between having the last remaining apple in the fruit basket (y) and having nothing instead (x), forgoing the nice-looking apple. She decides to behave decently and picks nothing (x), rather than the one apple (y). If, instead, the basket had contained two apples, and she had encountered the choice between having nothing (x), having one nice apple (y) and having another nice one (z), she could reasonably enough choose one (y), without violating any rule of good behavior. The presence of another apple (z) makes one of the two apples decently choosable, but this combination of choices would violate the standard consistency conditions, including Property α, even though there is nothing particularly “inconsistent” in this pair of choices (given her values and scruples) … We cannot determine whether the person is failing in any way without knowing what he is trying to do, that is, without knowing something external to the choice itself.

Amartya Sen
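Sen’s apple example can be stated as data and checked mechanically. The sketch below is my own illustration (the encoding of the menus and the checker function are not Sen’s): it tests the contraction-consistency condition, Property α, which requires that anything chosen from a menu is still chosen from every smaller menu that contains it.

```python
# Property alpha (contraction consistency): if an option is chosen from a
# menu, it must also be chosen from every sub-menu that still contains it.

def violates_alpha(choices):
    """choices: dict mapping frozenset menus to the chosen option."""
    for menu, chosen in choices.items():
        for submenu, subchosen in choices.items():
            if submenu < menu and chosen in submenu and subchosen != chosen:
                return True
    return False

# Sen's diner: x = take nothing, y = the last apple, z = a second apple.
choices = {
    frozenset({"x", "y"}): "x",       # declines the last apple
    frozenset({"x", "y", "z"}): "y",  # happily takes one of two apples
}

print(violates_alpha(choices))  # True: y chosen from {x,y,z} but not from {x,y}
```

The checker flags the pair of choices as a violation of Property α, which is exactly Sen’s point: the “inconsistency” is purely internal to the choice data and disappears once we know the external norm (politeness) the diner is following.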

Keynes vs. Wicksell on loanable funds theory

22 September, 2014 at 13:41 | Posted in Economics | 8 Comments

The fundamental difference between Keynes and Wicksell, and in general the supporters of the LFT [Loanable Funds Theory], lies in the specification of the consequences of the presence of bank money. Introducing the distinction between the natural rate of interest and the interest rate on money, Wicksell and the LFT supporters state that an economy that uses bank money converges towards the equilibrium position that characterises an economy without banks, in which there is no credit market, but just a capital market where the resources not consumed by savers are exchanged. The presence of bank money does not alter the structure of the economic system; the only element that distinguishes a pure credit economy is the presence of an adjustment mechanism that drives the rate of interest on money, determined within the credit market, towards the natural rate of interest. The working of a pure credit economy can therefore be described using a theory that applies to a world without banks.

In contrast, Keynes states that the spread of a fiat money such as bank money changes the structure of the economic system. He underscores this point by introducing the distinction between a real exchange economy and a monetary economy. As is well known, Keynes uses the former term to refer to an economy in which money is merely a tool to reduce the cost of exchange and whose presence does not alter the structure of the economic system, which remains substantially a barter economy. Keynes notes that the classical economists formulated an explanation of how the real-exchange economy works, convinced that this explanation could be easily applied to a monetary economy. He believed that this conviction was unfounded …

Giancarlo Bertocco

Skola och samhälle

22 September, 2014 at 09:32 | Posted in Education & School | Comments Off on Skola och samhälle

Yours truly has an article in the latest issue of the online journal Skola och samhälle on profits in tax-financed independent schools. The fundamental question is not whether these schools should be allowed to extract profits, or whether tougher measures in the form of control and inspection are required. Ultimately it is a question of whether the logic of the market or the logic of democracy should govern our welfare institutions.

The loanable funds fallacy

21 September, 2014 at 18:51 | Posted in Economics | 16 Comments

The loanable funds theory is in many regards nothing but an approach where the ruling rate of interest in society is — pure and simple — conceived as nothing else than the price of loans or credit, determined by supply and demand — as Bertil Ohlin put it — “in the same way as the price of eggs and strawberries on a village market.”

In the traditional loanable funds theory — as presented in mainstream macroeconomics textbooks like e.g. Greg Mankiw’s — the amount of loans and credit available for financing investment is constrained by how much saving is available. Saving is the supply of loanable funds, investment is the demand for loanable funds and is assumed to be negatively related to the interest rate. Lowering households’ consumption means increased saving, which via a lower interest rate increases investment.
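Ohlin’s “eggs and strawberries” simile is easy to make concrete. In the sketch below (the linear schedules and all numbers are my own illustrative assumptions, not taken from any textbook), saving supplies funds, investment demands them, and the interest rate is simply the market-clearing price of loans:

```python
# Textbook loanable funds market with illustrative linear schedules:
# saving supplies funds and rises with r; investment demands funds and falls with r.

def saving(r):      # supply of loanable funds
    return 2.0 + 10.0 * r

def investment(r):  # demand for loanable funds
    return 6.0 - 10.0 * r

# Market clearing: 2 + 10r = 6 - 10r  =>  r* = 0.2
r_star = (6.0 - 2.0) / (10.0 + 10.0)
print(r_star, saving(r_star), investment(r_star))  # 0.2 4.0 4.0
```

Note how the interest rate here does all the equilibrating work on its own, with income held fixed in the background; it is precisely this feature that the Keynesian critique below takes aim at.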

Nick Rowe has presented a formalization of New Keynesian loanable funds reasoning on his blog that goes like this:

Let output demanded (call it Yd) be a negative function of the rate of interest r, a positive function of actual income Y, and a function of other stuff X.

Yd = D(r,Y,X)

And the ONKM [orthodox New Keynesian macroeconomist] central bank wants to set r such that output demanded equals potential output Y*, so that:

D(r,Y*,X) = Y*

Assume a closed economy for simplicity, subtract Cd (consumption demand) plus Gd (government demand) from both sides, remember the accounting identities C+I+G=Y and S=Y-C-G, where I is investment and S is national saving, and we get:

Id(r,Y*,X) = Sd(r,Y*,X)

The central bank sets a rate of interest such that desired investment at potential output equals desired national saving at potential output. Which is precisely the loanable funds theory of the rate of interest.
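Rowe’s derivation can be verified numerically. The functional forms and parameter values below are my own illustrative assumptions (Rowe’s argument is purely algebraic): a bisection search finds the r at which D(r, Y*, X) = Y*, and at that r desired investment indeed equals desired national saving S = Y* - C - G.

```python
# Numerical check of Rowe's derivation; all functional forms and numbers
# are illustrative assumptions. Closed economy: Yd = Cd + Id + G.

Y_STAR = 100.0
G = 20.0

def consumption(r, y):
    return 10.0 + 0.6 * y - 50.0 * r

def investment(r, y):
    return 25.0 - 100.0 * r

def output_demanded(r, y):
    return consumption(r, y) + investment(r, y) + G

def solve_r(lo=0.0, hi=1.0, tol=1e-10):
    """Bisect for the r at which output demanded equals potential output."""
    f = lambda r: output_demanded(r, Y_STAR) - Y_STAR
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

r = solve_r()
# At that r, desired investment equals desired national saving S = Y* - C - G:
i_d = investment(r, Y_STAR)
s_d = Y_STAR - consumption(r, Y_STAR) - G
print(round(r, 4), round(i_d, 4), round(s_d, 4))
```

With these numbers the central bank’s target rate comes out at r = 0.1, and desired investment and desired saving both equal 15 at potential output, which is the loanable funds conclusion Rowe derives from the accounting identities.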

From a more Post-Keynesian-Minskyite point of view the problem with this formalization is quite obvious:

1 As already noticed by James Meade decades ago, the causal story told to explicate these accounting identities gives the picture of “a dog called saving wagged its tail labelled investment.” In Keynes’s view — and later over and over again confirmed by empirical research — it’s not so much the interest rate at which firms can borrow that causally determines the amount of investment undertaken, but rather their internal funds, profit expectations and capacity utilization.

2 As is typical of most mainstream macroeconomic formalizations and models, there is pretty little mention of real-world phenomena, like e.g. real money, credit rationing and the existence of multiple interest rates, in the loanable funds theory. Loanable funds theory essentially reduces modern monetary economies to something akin to barter systems — something they definitely are not. As emphasized especially by Minsky, to understand and explain how much investment/loaning/crediting is going on in an economy, it’s much more important to focus on the working of financial markets than to stare at accounting identities like S = Y-C-G. The problems we meet on modern markets today have more to do with inadequate financial institutions than with the size of loanable-funds-savings.

3 As clearly noticed by Rowe — “it would be more correct to say that the central bank sets the rate of interest where it thinks the loanable funds theory says it will be” — the loanable funds theory in the ONKM approach means that the interest rate is endogenized by assuming that Central Banks can (try to) adjust it in response to an eventual output gap. This, of course, is essentially nothing but an assumption of Walras’ law being valid and applicable, and that a fortiori the attainment of equilibrium is secured by the Central Banks’ interest rate adjustments. From a realist Keynes-Minsky point of view this can’t be considered anything else than a belief resting on nothing but sheer hope. [Not to mention that more and more Central Banks actually choose not to follow Taylor-like policy rules.] The age-old belief that Central Banks control the money supply has more and more come to be questioned and replaced by an “endogenous” money view, and I think the same will happen to the view that Central Banks determine “the” rate of interest.

4 A further problem in the traditional loanable funds theory is that it assumes that saving and investment can be treated as independent entities. To Keynes this was seriously wrong:

The classical theory of the rate of interest [the loanable funds theory] seems to suppose that, if the demand curve for capital shifts or if the curve relating the rate of interest to the amounts saved out of a given income shifts or if both these curves shift, the new rate of interest will be given by the point of intersection of the new positions of the two curves. But this is a nonsense theory. For the assumption that income is constant is inconsistent with the assumption that these two curves can shift independently of one another. If either of them shift, then, in general, income will change; with the result that the whole schematism based on the assumption of a given income breaks down … In truth, the classical theory has not been alive to the relevance of changes in the level of income or to the possibility of the level of income being actually a function of the rate of investment.

There are always (at least) two parties in an economic transaction. Savers and investors have different liquidity preferences and face different choices — and their interactions usually only take place intermediated by financial institutions. This, importantly, also means that there is no “direct and immediate” automatic interest mechanism at work in modern monetary economies. What this ultimately boils down to is — again — that what happens at the microeconomic level — both in and out of equilibrium — is not always compatible with the macroeconomic outcome. The fallacy of composition (the “atomistic fallacy” of Keynes) has many faces — loanable funds is one of them.
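Keynes’s objection can be illustrated with a toy Keynesian-cross calculation (the functional form and numbers are my own, purely for illustration): when the saving schedule shifts because households try to save a larger fraction of income, income itself changes, so the two curves cannot shift independently as the classical theory assumes.

```python
# Toy Keynesian cross with illustrative numbers: Y = C + I + G, C = mpc * Y,
# so equilibrium income is Y = (I + G) / (1 - mpc). A shift in the saving
# schedule (a lower mpc) changes income itself, not just the intersection point.

def equilibrium_income(mpc, investment, g):
    """Y = C + I + G with C = mpc*Y, so Y = (I + G) / (1 - mpc)."""
    return (investment + g) / (1.0 - mpc)

I, G = 15.0, 20.0
y_before = equilibrium_income(0.6, I, G)  # mpc 0.6, i.e. saving rate 0.4
y_after = equilibrium_income(0.5, I, G)   # thriftier households, saving rate 0.5

print(y_before, y_after)  # 87.5 70.0
```

The attempt to save more lowers equilibrium income from 87.5 to 70, which is exactly why the “given income” behind the intersecting saving and investment curves breaks down.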

5 Contrary to the loanable funds theory, finance in the world of Keynes and Minsky precedes investment and saving. Highlighting the loanable funds fallacy, Keynes wrote in “The Process of Capital Formation” (1939):

Increased investment will always be accompanied by increased saving, but it can never be preceded by it. Dishoarding and credit expansion provides not an alternative to increased saving, but a necessary preparation for it. It is the parent, not the twin, of increased saving.

So, by way of conclusion, what I think New Keynesians — like Paul Krugman and Greg Mankiw — “forget” when they hold on to the loanable funds theory is the Keynes-Minsky wisdom of truly acknowledging that finance — in all its different shapes — has its own dimension, and, if taken seriously, its effect on an analysis must modify the whole theoretical system and not just be added as an unsystematic appendage. Finance is fundamental to our understanding of modern economies, and acting like the baker’s apprentice who, having forgotten to add yeast to the dough, throws it into the oven afterwards, simply isn’t enough.

All real economic activities nowadays depend on a functioning financial machinery. But institutional arrangements, states of confidence, fundamental uncertainties, asymmetric expectations, the banking system, financial intermediation, loan granting processes, default risks, liquidity constraints, aggregate debt, cash flow fluctuations, etc. — things that play decisive roles in channeling money/savings/credit — are more or less left in the dark when New Keynesian formalizations of the loanable funds theory are applied to the real-world target system.


Added 19:30 GMT: Nick Rowe has responded to this post here.

Added 20:00 GMT: Naked Keynesianism has an interesting post and link on the issue here.

Added September 30: Victoria Chick and Geoff Tily have a good piece in CJE (March 2014) on the loanable funds theory (LFT) and the liquidity preference theory (LPT) of interest, and on why Keynes considered the two theories “radically opposed.” Since IS-LM can be shown to be essentially equivalent to LFT, this is also a further argument for scepticism towards those who maintain that the Hicksian construct is a good representation of Keynes’s thought.


Courage

19 September, 2014 at 14:14 | Posted in Politics & Society | Comments Off on Courage

John F. Kennedy once wrote:

To be courageous requires no exceptional qualifications, no magic formula, no special combination of time, place, and circumstance. It is an opportunity that sooner or later is presented to us all.

In spite of this, many would probably still maintain that courage is not very common, and that the value we put on it is a witness to its rarity.

Courage is the capability to confront fear — to stand up to the powerful and mighty rather than step back, and to defend one’s right not to be humiliated or abused in any way by the rich and powerful.

Courage is to do the right thing in spite of danger and fear. To keep going even when opportunities to turn back are given. Like in the great stories — the ones where people have lots of chances to turn back, but don’t.

Dignity, a better life, or justice and the rule of law are things worth fighting for. Not stepping back — in spite of confronting the mighty and powerful — creates courageous acts that stay in our memories and mean something, as when Rosa Parks, on December 1, 1955, in Montgomery, Alabama, refused to give up her seat to make room for a white passenger.

Uncertainty and reflexivity — two things missing from Krugman’s economics

18 September, 2014 at 09:36 | Posted in Theory of Science & Methodology | 7 Comments

One thing that’s missing from Krugman’s treatment of useful economics is the explicit recognition of what Keynes and, before him, Frank Knight emphasized: the persistent presence of enormous uncertainty in the economy. Most people most of the time don’t just face quantifiable risks, to be tamed by statistics and probabilistic reasoning. We have to take decisions in the prospect of events — big and small — that we can’t predict even with probabilities. Keynes famously argued that classical economics had no role for money just because it didn’t allow for uncertainty. Knight similarly noted that it made no room for the entrepreneur for the same reason. That to this day standard economic theory continues to rule out money and exclude entrepreneurs may strike the noneconomist as odd, to say the least. But there it is. Why is uncertainty so important? Because the more of it there is in the economy, the less scope there is for successful maximizing, and the more unstable are the equilibria the economy exhibits, if it exhibits any at all. Uncertainty is just what the New Classicals neglected when they endorsed the efficient market hypothesis and the Black-Scholes formulae for pumping returns out of well-behaved risks.

If uncertainty is an ever-present, pervasive feature of the economy, then we can be confident, along with Krugman, that New Classical models won’t be useful over the long haul. Even if people are perfectly rational, too many uncertain, “exogenous” events will divert each new equilibrium path before it can even get started.

There is a second feature of the economy that Krugman’s useful economics needs to reckon with, one that Keynes and after him George Soros, emphasized. Along with uncertainty, the economy exhibits pervasive reflexivity: expectations about the economic future tend to actually shift that future. This will be true whether those expectations are those of speculators, regulators, even garden-variety consumers and producers. Reflexiveness is everywhere in the economy, though it is only easily detectable when it goes to extremes, as in bubbles and busts, or regulatory capture …

When combined, uncertainty and reflexivity greatly limit the power of maximizing and equilibrium to do useful economics … Between them, they make the economy a moving target for the economist. Models get into people’s heads and change their behavior, usually in ways that undermine the model’s usefulness to predict.

Which models do this and how they work is not a matter of quantifiable risk, but radical uncertainty …

Between them reflexivity and uncertainty make economics into a retrospective, historical science, one whose models—simple or complex—are continually made obsolete by events, and so cannot be improved in the direction of greater predictive power, even by more complication. The way expectations reflexively drive future economic events, and are driven by past ones, is constantly being changed by the intervention of unexpected, uncertain, exogenous ones.

Alex Rosenberg

[h/t Jan Milch]
