Serially-correlated shocks and DSGE models (wonkish)

30 Sep, 2014 at 15:02 | Posted in Economics | 1 Comment

Now it is “dynamic stochastic general equilibrium” (DSGE) models inspired by the Lucas critique that have failed to predict or even explain the Great Recession of 2007–2009. More precisely, the implicit “explanations” based on these models are that the recession, including the millions of net jobs lost, was primarily due to large negative shocks to both technology and willingness to work … So can the reputation of modern macroeconomics be rehabilitated by simply modifying DSGE models to include a few more realistic shocks? …

A simple example helps illustrate for the uninitiated just how DSGE models work and why it should come as little surprise that they are largely inadequate for the task of explaining the Great Recession.

For this simple DSGE model, consider the following technical assumptions: i) an infinitely-lived representative agent with rational expectations and additive utility in current and discounted future log consumption and leisure; ii) a Cobb-Douglas aggregate production function with labor-augmenting technology; iii) capital accumulation with a fixed depreciation rate; and iv) a stochastic process for exogenous technology shocks …

It is worth making two basic points about the setup. First, by construction, technology shocks are the only underlying source of fluctuations in this simple model. Thus, if we were to assume that U.S. real GDP was the literal outcome of this model, we would be assuming a priori that fluctuations in real GDP were ultimately due to technology. When faced with the Great Recession, this model would have no choice but to imply that technology shocks were somehow to blame. Second, despite the underlying role of technology, the observed fluctuations in real GDP can be divided into those that directly reflect the behavior of the exogenous shocks and those that reflect the endogenous capital accumulation in response to these shocks.

To be more precise about these two points, it is necessary to assume a particular process for the exogenous technology shocks. In this case, let’s assume technology follows a random walk with drift [and assuming a 100% depreciation rate of capital]…

So, with this simple DSGE model and for typical measures of the capital share, we have the implication that output growth follows an AR(1) process with an AR coefficient of about one third. This is notable given that such a time-series model does reasonably well as a parsimonious description of quarterly real GDP dynamics for the U.S. economy …
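To see where the one-third comes from, here is a minimal sketch of the algebra for the special case just described (log utility, Cobb-Douglas production with capital share α, 100% depreciation, and technology following a random walk with drift). With labour constant in equilibrium the model collapses to the textbook Brock-Mirman form; this is an illustration of that standard result, not Morley's own derivation:

\[
\begin{aligned}
Y_t &= A_t K_t^{\alpha}, \qquad \ln A_t = \mu + \ln A_{t-1} + \varepsilon_t,\\
K_{t+1} &= \alpha\beta\, Y_t \quad \text{(the optimal saving rule under these assumptions)},\\
\ln Y_{t+1} &= \ln A_{t+1} + \alpha\ln(\alpha\beta) + \alpha \ln Y_t,\\
\Delta\ln Y_{t+1} &= \mu + \alpha\,\Delta\ln Y_t + \varepsilon_{t+1}.
\end{aligned}
\]

Output growth is thus an AR(1) process with an autoregressive coefficient equal to the capital share α, i.e. roughly one third for conventional estimates of that share.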

However, the rather absurd assumption of a 100% depreciation rate at the quarterly horizon would surely still have prompted a sharp question or two in a University of Chicago seminar back in the day. So, with this in mind, what happens if we consider the more general case?

Unfortunately, for more realistic depreciation rates, we cannot solve the model analytically. Instead, taking a log-linearization around steady state, we can use standard methods to solve for output growth … This simple DSGE model is able to mimic the apparent AR(1) dynamics in real GDP growth. But it does so by assuming the exogenous technology shocks also follow an AR(1) process with an AR coefficient that happens to be the same as the estimated AR coefficient for output growth. Thus, the magic trick has been revealed: a rabbit was stuffed into the hat and then a rabbit jumped out of the hat …

Despite their increasing sophistication, DSGE models share one key thing in common with their RBC predecessors. After more than two decades of earnest promises to do better in the “future directions” sections of academic papers, they still have those serially-correlated shocks. Thus, the models now “explain” variables like real GDP, inflation, and interest rates as the outcome of more than just serially-correlated technology shocks. They also consider serially-correlated preference shocks and serially-correlated policy shocks …

James Morley

[h/t Brad DeLong & Merijn Knibbe]
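As a quick numerical check of the analytic special case above, here is a minimal simulation sketch (my own illustration in Python, not Morley's code, with arbitrary parameter values): it simulates the full-depreciation model with random-walk technology and estimates the AR(1) coefficient of output growth, which should land close to the assumed capital share of one third.

```python
import numpy as np

# Illustrative parameters (not calibrated to any particular study)
alpha = 1 / 3     # capital share
beta = 0.99       # discount factor
mu = 0.005        # drift of log technology
sigma = 0.01      # std. dev. of technology shocks
T = 200_000       # sample length

rng = np.random.default_rng(0)
eps = rng.normal(0.0, sigma, T)

# Random walk with drift: ln A_t = mu + ln A_{t-1} + eps_t
ln_a = np.cumsum(mu + eps)

# With log utility and 100% depreciation the optimal policy is K_{t+1} = alpha*beta*Y_t,
# so ln Y_{t+1} = ln A_{t+1} + alpha*ln(alpha*beta) + alpha*ln Y_t
ln_y = np.zeros(T)
for t in range(1, T):
    ln_y[t] = ln_a[t] + alpha * np.log(alpha * beta) + alpha * ln_y[t - 1]

dy = np.diff(ln_y)                 # output growth
dy_lag, dy_cur = dy[:-1], dy[1:]

# OLS slope of output growth on its own lag (the AR(1) coefficient)
rho_hat = np.cov(dy_cur, dy_lag)[0, 1] / np.var(dy_lag, ddof=1)
print(f"estimated AR(1) coefficient: {rho_hat:.3f}  (capital share alpha = {alpha:.3f})")
```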

Neoclassical economics and neoliberalism — two varieties of market fundamentalism

30 Sep, 2014 at 13:06 | Posted in Economics | Comments Off on Neoclassical economics and neoliberalism — two varieties of market fundamentalism

Oxford professor Simon Wren-Lewis had a post up some time ago commenting on traction-gaining “attacks on mainstream economics”:

One frequent accusation … often repeated by heterodox economists, is that mainstream economics and neoliberal ideas are inextricably linked. Of course economics is used to support neoliberalism. Yet I find mainstream economics full of ideas and analysis that permits a wide ranging and deep critique of these same positions. The idea that the two live and die together is just silly.

Hmmm …

Silly? Maybe Wren-Lewis and other economists who want to enlighten themselves on the subject should take a look at this video:


Or maybe read this essay, where yours truly tries to analyze further — much inspired by the works of Amartya Sen — what kind of philosophical-ideological-economic doctrine neoliberalism is, and why it so often comes naturally for mainstream neoclassical economists to embrace neoliberal ideals.

Or — if you know some Swedish — you could take a look at this book on the connection between the dismal science and neoliberalism (sorry for the shameless self-promotion).

NAIRU — a failed metaphor legitimizing austerity policies

29 Sep, 2014 at 13:09 | Posted in Economics, Politics & Society | 1 Comment

In our extended NAIRU model, labor productivity growth is included in the wage bargaining process … The logical consequence of this broadening of the theoretical canvas has been that the NAIRU becomes endogenous itself and ceases to be an attractor — Milton Friedman’s natural, stable and timeless equilibrium point from which the system cannot permanently deviate. In our model, a deviation from the initial equilibrium affects not only wages and prices (keeping the rest of the system unchanged) but also demand, technology, workers’ motivation, and work intensity; as a result, productivity growth and ultimately equilibrium unemployment will change. There is, in other words, nothing natural or inescapable about equilibrium unemployment, as is Friedman’s presumption, following Wicksell; rather, the NAIRU is a social construct, fluctuating in response to fiscal and monetary policies and labor market interventions. Its ephemeral (rather than structural) nature may explain why the best economists working on the NAIRU have persistently failed to agree on how high the NAIRU actually is and how to estimate it.

Servaas Storm & C. W. M. Naastepad

NAIRU has been the subject of much heated discussion and debate in Sweden lately, after SVT, the Swedish national public TV broadcaster, aired a documentary on NAIRU and the fact that many politicians — and economists — subscribe to the NAIRU story and its policy implication that attempts to promote full employment are doomed to fail, since governments and central banks can’t push unemployment below the critical NAIRU threshold without causing harmful runaway inflation.

One of the main problems with NAIRU is that it essentially is a timeless long-run equilibrium attractor to which actual unemployment (allegedly) has to adjust. But if that equilibrium is itself changing — and in ways that depend on the process of getting to the equilibrium — well, then we can’t really be sure what that equilibrium will be without contextualizing unemployment in real historical time. And when we do, we will — as highlighted by Storm and Naastepad — see how seriously wrong we go if we omit demand from the analysis. Demand policy has long-run effects and matters also for structural unemployment — and governments and central banks can’t just look the other way and legitimize their passivity regarding unemployment by referring to NAIRU.

NAIRU does not hold water simply because it does not exist — and to base economic policy on such a weak theoretical and empirical construct is nothing short of writing out a prescription for self-inflicted economic havoc.

New Keynesianism — a macroeconomic cul-de-sac

28 Sep, 2014 at 15:55 | Posted in Economics | Comments Off on New Keynesianism — a macroeconomic cul-de-sac

Macroeconomic models may be an informative tool for research. But if practitioners of “New Keynesian” macroeconomics do not investigate and provide a justification for the credibility of the assumptions on which they erect their building, it will not fulfill its tasks. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics will continue to consider its ultimate argument a mixture of rather unhelpful metaphors and metaphysics. Maintaining that economics is a science in the “true knowledge” business, I remain a skeptic of the pretences and aspirations of “New Keynesian” macroeconomics. So far, I cannot really see that it has yielded very much in terms of realistic and relevant economic knowledge.

Keynes basically argued that it was inadmissible to project history onto the future. Consequently, an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different “variables” was not enough. If they could not get at the causal structure that generated the data, they were not really “identified”. Dynamic stochastic general equilibrium (DSGE) macroeconomists – including “New Keynesians” – have drawn the conclusion that the solution to the problem of unstable relations is to construct models with clear microfoundations in which forward-looking optimizing individuals and robust, deep, behavioural parameters are seen to be stable even to changes in economic policies. As yours truly has argued in a couple of posts (e.g. here and here), this, however, is a dead end.

Here we are getting close to the heart of darkness in “New Keynesian” macroeconomics. When “New Keynesian” economists think that they can rigorously deduce the aggregate effects of (representative) actors with their reductionist microfoundational methodology, they have to turn a blind eye to the emergent properties that characterize all open social systems – including the economic system. The interaction between animal spirits, trust, confidence, institutions, etc. cannot be deduced from or reduced to a question answerable at the individual level. Macroeconomic structures and phenomena also have to be analyzed on their own terms. And although one may easily agree with e.g. Paul Krugman’s emphasis on simple models, the simplifications used may have to be simplifications adequate for macroeconomics and not those adequate for microeconomics.

In microeconomics we know that aggregation really presupposes homothetic and identical preferences, something that almost never exists in real economies. The results given by these assumptions are therefore not robust and do not capture the underlying mechanisms at work in any real economy. And models that are critically based on particular and odd assumptions – and are neither robust nor congruent with real-world economies – are of questionable value.
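To make the aggregation point concrete: the standard result (due to Gorman) is that market demand behaves as if it were generated by a single representative consumer only when every individual's indirect utility function takes the Gorman polar form, of which identical homothetic preferences are the best-known special case. A sketch of the condition:

\[
v_i(p, m_i) = a_i(p) + b(p)\, m_i ,
\]

so that, by Roy's identity, every consumer's Engel curves are linear with the same slope, and aggregate demand depends only on total income, not on how it is distributed. Outside this knife-edge case, the Sonnenschein-Mantel-Debreu results tell us that aggregate excess demand inherits almost no structure from individual rationality.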

Even if economies naturally presuppose individuals, it does not follow that we can infer or explain macroeconomic phenomena solely from knowledge of these individuals. Macroeconomics is to a large extent emergent and cannot be reduced to a simple summation of micro-phenomena. Moreover, even these microfoundations aren’t immutable. The “deep parameters” of “New Keynesian” DSGE models – “tastes” and “technology” – are not really the bedrock of constancy that they believe (pretend) them to be.

So I cannot concur with Paul Krugman, Mike Woodford, Greg Mankiw and other sorta-kinda “New Keynesians” when they more or less try to reduce Keynesian economics to “intertemporal maximization modified with sticky prices and a few other deviations”. As John Quiggin so aptly writes:

If there is one thing that distinguished Keynes’ economic analysis from that of his predecessors, it was his rejection of the idea of a unique full employment equilibrium to which a market economy will automatically return when it experiences a shock. Keynes argued that an economy could shift from a full-employment equilibrium to a persistent slump as the result of the interaction between objective macroeconomic variables and the subjective ‘animal spirits’ of investors and other decision-makers. It is this perspective that has been lost in the absorption of New Keynesian macro into the DSGE framework.

Reforming economics curriculum

27 Sep, 2014 at 20:32 | Posted in Economics | Comments Off on Reforming economics curriculum

When the global economy crashed in 2008, the list of culprits was long, including dozy regulators, greedy bankers and feckless subprime borrowers. Now the dismal science itself is in the dock, with much soul-searching over why economists failed to predict the financial crisis. One of the outcomes of this debate is that economics students are demanding the reform of a curriculum they think sustains a selfish strain of capitalism and is dominated by abstract mathematics. It looks like the students will get their way. A new curriculum, designed at the University of Oxford, is being tried out. This is good news. …

The typical economics course starts with the study of how rational agents interact in frictionless markets, producing an outcome that is best for everyone. Only later does it cover those wrinkles and perversities that characterise real economic behaviour, such as anti-competitive practices or unstable financial markets. As students advance, there is a growing bias towards mathematical elegance. When the uglier real world intrudes, it only prompts the question: this is all very well in practice but how does it work in theory? …

Fortunately, the steps needed to bring economics teaching into the real world do not require the invention of anything new or exotic. The curriculum should embrace economic history and pay more attention to unorthodox thinkers such as Joseph Schumpeter, Friedrich Hayek and – yes – even Karl Marx. Faculties need to restore links with other fields such as psychology and anthropology, whose insights can explain phenomena that economics cannot. Economics professors should make the study of imperfect competition – and of how people act in conditions of uncertainty – the starting point of courses, not an afterthought. …

Economics should not be taught as if it were about the discovery of timeless laws. Those who champion the discipline must remember that, at its core, it is about human behaviour, with all the messiness and disorder that this implies.

Financial Times

Borel’s law and the infinite monkey theorem (wonkish)

27 Sep, 2014 at 10:36 | Posted in Statistics & Econometrics | 3 Comments

Back in 1943, the eminent French mathematician Émile Borel published a book titled Les probabilités et la vie, in which he introduced what has been called Borel’s law: “Events with a sufficiently small probability never occur.”

Borel’s law has also been called the infinite monkey theorem since Borel illustrated his thinking using the classic example with monkeys randomly hitting the keys of a typewriter and by chance producing the complete works of Shakespeare:

Such is the sort of event which, though its impossibility may not be rationally demonstrable, is, however, so unlikely that no sensible person will hesitate to declare it actually impossible. If someone affirms having observed such an event we would be sure that he is deceiving us or has himself been the victim of fraud.


Wikipedia gives the historical background and a proof of the theorem:

Variants of the theorem include multiple and even infinitely many typists, and the target text varies between an entire library and a single sentence. The history of these statements can be traced back to Aristotle’s On Generation and Corruption and Cicero’s De natura deorum (On the Nature of the Gods), through Blaise Pascal and Jonathan Swift, and finally to modern statements with their iconic typewriters. In the early 20th century, Émile Borel and Arthur Eddington used the theorem to illustrate the timescales implicit in the foundations of statistical mechanics.

There is a straightforward proof of this theorem. As an introduction, recall that if two events are statistically independent, then the probability of both happening equals the product of the probabilities of each one happening independently. For example, if the chance of rain in Moscow on a particular day in the future is 0.4 and the chance of an earthquake in San Francisco on that same day is 0.00003, then the chance of both happening on that day is 0.4 × 0.00003 = 0.000012, assuming that they are indeed independent.

Suppose the typewriter has 50 keys, and the word to be typed is banana. If the keys are pressed randomly and independently, it means that each key has an equal chance of being pressed. Then, the chance that the first letter typed is ‘b’ is 1/50, and the chance that the second letter typed is ‘a’ is also 1/50, and so on. Therefore, the chance of the first six letters spelling banana is

(1/50) × (1/50) × (1/50) × (1/50) × (1/50) × (1/50) = (1/50)^6 = 1/15,625,000,000,

less than one in 15 billion, but not zero, hence a possible outcome.

From the above, the chance of not typing banana in a given block of 6 letters is 1 − (1/50)^6. Because each block is typed independently, the chance X_n of not typing banana in any of the first n blocks of 6 letters is X_n = (1 − (1/50)^6)^n.

As n grows, X_n gets smaller. For an n of a million, X_n is roughly 0.9999, but for an n of 10 billion X_n is roughly 0.53 and for an n of 100 billion it is roughly 0.0017. As n approaches infinity, the probability X_n approaches zero; that is, by making n large enough, X_n can be made as small as is desired, and the chance of typing banana approaches 100%.

The same argument shows why at least one of infinitely many monkeys will produce a text as quickly as it would be produced by a perfectly accurate human typist copying it from the original. In this case X_n = (1 − (1/50)^6)^n, where X_n represents the probability that none of the first n monkeys types banana correctly on their first try. When we consider 100 billion monkeys, the probability falls to 0.17%, and as the number of monkeys n increases, the value of X_n – the probability of the monkeys failing to reproduce the given text – approaches zero arbitrarily closely. The limit, for n going to infinity, is zero.

However, for physically meaningful numbers of monkeys typing for physically meaningful lengths of time the results are reversed. If there are as many monkeys as there are particles in the observable universe (10^80), and each types 1,000 keystrokes per second for 100 times the life of the universe (10^20 seconds), the probability of the monkeys replicating even a short book is nearly zero.

Wikipedia
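The quoted figures are easy to check numerically. Here is a minimal sketch (my own verification in Python, not part of the Wikipedia article) computing X_n = (1 − (1/50)^6)^n for the values of n mentioned above, plus the universe-scale bound from the last paragraph, using a hypothetical 100,000-character “short book”:

```python
import math

p = (1 / 50) ** 6                       # probability a block of 6 random keys spells "banana"
print(f"P(one block spells 'banana') = {p:.3e}")   # about 6.4e-11, i.e. one in 15.6 billion

# X_n = (1 - p)^n: probability that none of the first n blocks spells "banana"
for n in (10**6, 10**10, 10**11):
    x_n = math.exp(n * math.log1p(-p))  # computed in log space for numerical accuracy
    print(f"n = {n:.0e}:  X_n = {x_n:.4f}")
# prints roughly 0.9999, 0.5273 and 0.0017, matching the quoted figures

# Universe-scale bound: 1e80 monkeys, 1,000 keystrokes per second, 1e20 seconds
total_keystrokes = 1e80 * 1e3 * 1e20    # about 1e103 keystrokes in total
book_length = 100_000                   # characters in a hypothetical short book
# log10 of the expected number of letter-perfect copies produced
log10_expected = math.log10(total_keystrokes) - book_length * math.log10(50)
print(f"log10(expected correct copies) = {log10_expected:.0f}")   # about -170,000
```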

For more on Borel’s law and the fact that — still — incredibly unlikely things keep happening, see David Hands’s The Improbability Principle (Bantam Press, 2014).

Via con me

26 Sep, 2014 at 20:35 | Posted in Varia | Comments Off on Via con me

 

Senza una donna

26 Sep, 2014 at 17:48 | Posted in Varia | Comments Off on Senza una donna

 

INET — unabated faith in mathematical modelling

26 Sep, 2014 at 11:15 | Posted in Economics | 1 Comment

In the end, very few INET participants engage in a methodological critique that challenges the emphasis on modelling. One exception comes from Tony Lawson, participating at the opening conference in 2010, who is well known for his critique of the dominant economic methodology … Lawson makes an explicit link between the failure of economists to offer insights into the crisis, on the one hand, and the dominant economic methodology, on the other. In particular he points to an excessive preoccupation with mathematical modelling. Lawson’s comments below capture the intellectual tendency characterising INET events so far:

“Very many economists attended the conference, all apparently concerned critically to reconsider the nature of academic economics. It is in such a forum if anywhere that we might hope to find mainstream economists challenging all but the most obviously acceptable aspects of their theories, approaches and activities.

Although George Soros, who sponsors the Institute, shows some awareness that the reliance upon mathematics may at least be something to question … for most of his close associates the idea that there might be something problematic about the emphasis on forms of mathematical technique does not appear even to cross their minds …”

Thus, we find, to round off this section, that although INET is quite explicit about its concern with the state of economics, as well as about its search for alternatives, its overall orientation in the end (or so far) is not on a reduction in the emphasis on mathematical modelling. As things currently stand, the forum continues to show faith in the dominant economic methodological paradigm. …

Overall, we find that despite appearances, many economists across the board have tended to reaffirm their position. They do so primarily by a methodological critique that consists in advocating the development of newer, better mathematical models that this time, allegedly, achieve greater realisticness (i.e. achieve a closer match to reality), promising a greater ability to successfully predict. Representatively, Krugman adopts such a position. …

The question of whether mathematical tools are appropriate is something that, in the circumstances, we might have expected to receive significant attention. But this is not what we have found. Our study suggests rather that, even when recognising their discipline is in crisis, economists continue to take existing methodology as an unquestionable (sacrosanct) given.

Vinca Bigo & Ioana Negru

The missing link in Keynes’s General Theory

26 Sep, 2014 at 08:32 | Posted in Economics | Comments Off on The missing link in Keynes’s General Theory

The cyclical succession of system states is not always clearly presented in The General Theory. In fact there are two distinct views of the business cycle, one a moderate cycle which can perhaps be identified with a dampened accelerator-multiplier cycle and the second a vigorous ‘boom and bust’ cycle … The business cycle in chapter 18 does not exhibit booms or crises …

In chapters 12 and 22, in the rebuttal to Viner, and in remarks throughout The General Theory, a vigorous cycle, which does have booms and crises, is described. However, nowhere in The General Theory or in Keynes’s few post-General Theory articles explicating his new theory are the boom and the crisis adequately defined or explained. The financial developments during a boom that make a crisis likely, if not inevitable, are hinted at but not thoroughly examined. This is the logical hole, the missing link, in The General Theory as it was left by Keynes in 1937 after his rebuttal to Viner … In order to appreciate the full potential of The General Theory as a guide to interpretation and understanding of modern capitalism, we must fill out what Keynes discussed in a fragmentary and casual manner.

Further reasons to reject NAIRU lock, stock, and barrel

25 Sep, 2014 at 15:19 | Posted in Economics | Comments Off on Further reasons to reject NAIRU lock, stock, and barrel

This paper has reasserted the Post Keynesian view that unemployment is essentially driven by private investment behaviour. There is a feedback from the labour market via price and wage inflation to the goods market, but it is weak. Without government policy the goods market reactions may even be perverse and, as we are presently reminded, the scope of monetary policy is limited in times of financial crises and in times of deflation. Second, the labour market itself is more adaptive than commonly assumed. The NAIRU is endogenous due to the supply-side effects of capital accumulation and the importance of social norms in wage setting. Thus, there is a well defined NAIRU that determines wage and price inflation (in conjunction with actual unemployment) in the short term, but it is endogenous and changes along with actual unemployment in the medium term. …

While monetary policy exerts some impact on investment decisions, there may be other reasons for private investment to fall below the level necessary for full employment. Keynes himself had famously argued that it is mostly driven by animal spirits, which leaves the economic analyst in the dark as to what actually drives them. To some extent these animal spirits will depend on specific institutional structures and the degree of uncertainty regarding the future evolution of important macroeconomic variables … or corporate governance structures; but overall it is fair to say that investment expenditures cannot be easily reduced to underlying variables.

Our analysis has important policy implications. Rather than regarding the role of the state as having to provide conditions (in the labour market) as close as possible to perfect markets, our analysis highlights the role of the state as a mediator of social conflict and as a stabiliser of economic activity. If the private sector is prone to long-lasting swings in economic activity (due to changes in animal spirits or the aftermath of financial crises) and the NAIRU is endogenous, maintaining employment at a high level in the short run is crucial. To that end monetary policy will in general not be sufficient and an active (counter-cyclical) fiscal policy is needed. Finally, wage policy is crucial in terms of controlling inflation as well as in terms of stabilizing income distribution. Wage flexibility will not cure unemployment … Fiscal policy is the main tool of short-run stabilization and wages policy aims at wage growth in line with labour productivity.

Engelbert Stockhammer

‘Natural rate of unemployment’ — a fatal fallacy

25 Sep, 2014 at 08:48 | Posted in Economics | Comments Off on ‘Natural rate of unemployment’ — a fatal fallacy

It is thought necessary to keep unemployment at a “non-inflation-accelerating” level (“NIARU”) in the range of 4% to 6% if inflation is to be kept from increasing unacceptably. …

The underlying assumption that there is an exogenous NIARU imposing an unavoidable constraint on macroeconomic possibilities is open to serious question on both historical and analytical grounds. Historically, the U.S. enjoyed an unemployment rate of 1.8% for 1926 as a whole with the price level falling, if anything. West Germany enjoyed an unemployment rate of around 0.6% over the several years around 1960, and most developed countries have enjoyed episodes of unemployment under 2% without serious inflation. Thus a NIARU, if it exists at all, must be regarded as highly variable over time and place. It is not clear that estimates of the NIARU have not been contaminated by failure to allow for a possible impact of inflation on employment as well as the impact of unemployment on inflation. A Marxist interpretation of the insistence on a NIARU might be as a stalking horse to enlist the fear of inflation to justify the maintenance of a “reserve army of the unemployed,” allegedly to keep wages from initiating a “wage-price spiral.” One never hears of a “rent-price spiral”, or an “interest-price spiral,” though these costs are also to be considered in the setting of prices. Indeed when the FRB raises interest rates in an attempt to ward off inflation, the increase in interest costs to merchants may well trigger a small price increase. …

Indeed, if we are to control three major macroeconomic dimensions of the economy, namely the inflation rate, the unemployment rate, and the growth rate, a third control is needed that will be reasonably non-collinear in its effects to those of a fiscal policy operating through disposable income generation on the one hand, and monetary policy operating through interest rates on the other.

What may be needed is a method of directly controlling inflation that does not interfere with free market adjustments in relative prices or rely on unemployment to keep inflation in check. Without such a control, unanticipated changes in the rate of inflation, either up or down, will continue to plague the economy and make planning for investment difficult. Trying to control an economy in three major macroeconomic dimensions with only two instruments is like trying to fly an airplane with elevator and rudder but no ailerons; in calm weather and with sufficient dihedral one can manage if turns are made very gingerly, but trying to land in a cross-wind is likely to produce a crash. …

It is important to keep in mind that divergences in the rate of inflation either up or down, from what was previously expected, produce merely an arbitrary redistribution of a given total product, equivalent at worst to legitimized embezzlement, unless indeed these unpredictable variations are so extreme and rapid as to destroy the usefulness of currency as a means of exchange. Unemployment, on the other hand, reduces the total product to be distributed; it is at best equivalent to vandalism, and when it contributes to crime it becomes the equivalent of homicidal arson. In the U.S. the widespread availability of automatic teller machines in supermarkets and elsewhere would make the “shoe-leather cost” of a high but predictable inflation rate quite negligible.

William Vickrey

[h/t Jan Milch]

Ditch the NAIRU!

24 Sep, 2014 at 21:41 | Posted in Economics | 5 Comments

The most important implication of [the conventional NAIRU equation], however, is that there is no role whatsoever for demand factors in determining equilibrium unemployment. Any attempt by fiscal or monetary policy to permanently move (actual) unemployment away from its equilibrium level u* is doomed to failure. Policy may succeed in temporarily lowering unemployment, thus causing inflation, which in turn will undermine demand and raise unemployment until the equilibrium or “natural” rate of unemployment is reached again.


Demand will adjust itself to the “natural” level of output, corresponding to the rate of equilibrium unemployment, either passively through the so-called real balance effect or, alternatively, more actively through a policy-administered rise in interest rates; in the latter case, actual unemployment is determined by how large the central bank thinks the NAIRU is. The implication of [the conventional NAIRU equation] is that employment policy should focus exclusively on the labor market (and not on aggregate demand and investment), and above all on the behavior of labor unions and (mostly welfare state-related) wage–push factors. The policy recommendations are straightforward: to reduce unemployment, labor markets have to be deregulated; employment protection, labor taxes, and unemployment benefits have to be reduced; wage bargaining has to be decentralized; and welfare states have to be scaled down … However, although the view that labor market regulation explains OECD unemployment has become widely accepted, particularly in policy circles, it is by no means universally accepted. Serious problems remain …

Even authors working within the orthodox NAIRU approach are unable to explain (changes in long-run) unemployment in terms of only “excessive” labor market regulation. To explain (changes in) u*, most empirical studies consider it necessary to include other, additional “factors which might explain short-run deviations of unemployment from its equilibrium level” … the most important of which are aggregate demand shocks (i.e., import price and real interest rate shocks) and productivity shocks. The inclusion of such “shocks” is not an innocent amendment, because it turns out that a significant part of the OECD unemployment increase during the past three decades must be attributed to these shocks … This is obviously a dissatisfactory state of affairs: in the theoretical analysis, the impact of demand factors on equilibrium unemployment is defined away, but in the empirical analysis it has to be brought back in, not as a structural determinant but rather as an exogenous shock. We argue that this incongruence points to a misspecification of the NAIRU model.

Servaas Storm & C. W. M. Naastepad
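For readers who have not seen it written out, a common textbook version of the kind of equation Storm and Naastepad are referring to (a sketch of the standard accelerationist Phillips curve, not necessarily their exact specification) is:

\[
\Delta\pi_t = -\gamma\,(u_t - u^{*}) + \epsilon_t, \qquad \gamma > 0 .
\]

Inflation accelerates whenever actual unemployment u_t is held below the equilibrium rate u*, so stabilizing inflation requires letting unemployment return to u*; by construction, aggregate demand plays no role in determining u* itself.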

Macroeconomics beyond NAIRU

24 Sep, 2014 at 08:16 | Posted in Economics | Comments Off on Macroeconomics beyond NAIRU


Highly paid labour is generally efficient and therefore not dear labour; a fact which, though it is more full of hope for the future of the human race than any other that is known to us, will be found to exercise a very complicating influence on the theory of distribution.

Alfred Marshall


Why be consistent?

23 Sep, 2014 at 08:16 | Posted in Economics | 1 Comment

Axioms of ‘internal consistency’ of choice, such as the weak and the strong axioms of revealed preference … are often used in decision theory, micro-economics, game theory, social choice theory, and in related disciplines …

Paul Samuelson’s (1938) justly famous foundational contribution to revealed preference theory … can be interpreted in several different ways. One interpretation that has received much attention in the subsequent literature (and has had a profound impact on the direction of economic research) is the program of developing a theory of behavior “freed from any vestigial traces of the utility concept” (Samuelson (1938, p. 71)). While this was not in line with John Hicks’s earlier works, particularly his Value and Capital (Hicks (1939)), which began with the priority of the concept of preference or utility, Hicks too became persuaded by the alleged superiority of the new approach …

This paper argues against this influential approach to choice and behavior, and indicates the inescapable need to go beyond the internal features of a choice function to understand its cogency and consistency …

At the foundational level, the basic difficulty arises from the implicit presumption underlying that approach that acts of choice are, on their own, like statements which can contradict, or be consistent with, each other. That diagnosis is deeply problematic …

Can a set of choices really be seen as consistent or inconsistent on purely internal grounds, without bringing in something external to choice, such as the underlying objectives or values that are pursued or acknowledged by choice? …

The presumption of inconsistency may be easily disputed, depending on the context, if we know a bit more about what the person is trying to do. Suppose the person faces a choice at a dinner table between having the last remaining apple in the fruit basket (y) and having nothing instead (x), forgoing the nice-looking apple. She decides to behave decently and picks nothing (x), rather than the one apple (y). If, instead, the basket had contained two apples, and she had encountered the choice between having nothing (x), having one nice apple (y) and having another nice one (z), she could reasonably enough choose one (y), without violating any rule of good behavior. The presence of another apple (z) makes one of the two apples decently choosable, but this combination of choices would violate the standard consistency conditions, including Property α, even though there is nothing particularly “inconsistent” in this pair of choices (given her values and scruples) … We cannot determine whether the person is failing in any way without knowing what he is trying to do, that is, without knowing something external to the choice itself.

Amartya Sen
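Sen’s apple example can be restated compactly in terms of the contraction-consistency condition usually called Property α: anything chosen from a menu must still be chosen from any sub-menu that contains it. Here is a minimal sketch (my own illustration, with x, y and z labelled as in the quote) of how the polite diner’s choices violate it:

```python
# x = take nothing, y = take the last apple, z = take the other apple
choices = {
    frozenset({"x", "y"}): {"x"},        # one apple left: decently take nothing
    frozenset({"x", "y", "z"}): {"y"},   # two apples left: taking one is fine
}

def property_alpha_violations(choices):
    """Property alpha: if an option is chosen from a menu, it must also be
    chosen from every sub-menu that still contains it."""
    violations = []
    for menu, chosen in choices.items():
        for submenu, sub_chosen in choices.items():
            if submenu < menu:                      # proper subset of the larger menu
                for option in chosen & submenu:     # chosen options still on offer
                    if option not in sub_chosen:
                        violations.append((option, set(menu), set(submenu)))
    return violations

for option, menu, submenu in property_alpha_violations(choices):
    print(f"'{option}' is chosen from {menu} but not from its sub-menu {submenu}")
# 'y' is chosen from the three-element menu but not from {'x', 'y'}:
# an "internal inconsistency" that is perfectly sensible given the diner's norms.
```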
