Free — and powerless

1 October, 2014 at 23:26 | Posted in Politics & Society | 1 Comment

We tend to perceive our identities as stable and largely separate from outside forces. But over decades of research and therapeutic practice, I have become convinced that economic change is having a profound effect not only on our values but also on our personalities. Thirty years of neoliberalism, free-market forces and privatisation have taken their toll, as relentless pressure to achieve has become normative. If you’re reading this sceptically, I put this simple statement to you: meritocratic neoliberalism favours certain personality traits and penalises others …

The sociologist Zygmunt Bauman neatly summarised the paradox of our era as: “Never have we been so free. Never have we felt so powerless.” We are indeed freer than before, in the sense that we can criticise religion, take advantage of the new laissez-faire attitude to sex and support any political movement we like. We can do all these things because they no longer have any significance – freedom of this kind is prompted by indifference. Yet, on the other hand, our daily lives have become a constant battle against a bureaucracy that would make Kafka weak at the knees. There are regulations about everything, from the salt content of bread to urban poultry-keeping.

Our presumed freedom is tied to one central condition: we must be successful – that is, “make” something of ourselves. You don’t need to look far for examples. A highly skilled individual who puts parenting before their career comes in for criticism. A person with a good job who turns down a promotion to invest more time in other things is seen as crazy – unless those other things ensure success. A young woman who wants to become a primary school teacher is told by her parents that she should start off by getting a master’s degree in economics – a primary school teacher, whatever can she be thinking of?

There are constant laments about the so-called loss of norms and values in our culture. Yet our norms and values make up an integral and essential part of our identity. So they cannot be lost, only changed. And that is precisely what has happened: a changed economy reflects changed ethics and brings about changed identity. The current economic system is bringing out the worst in us.

Paul Verhaeghe/The Guardian

‘Modern’ macroeconomics and class struggle

1 October, 2014 at 20:19 | Posted in Economics, Politics & Society | Leave a comment

The risks associated with a negative economic shock can vary widely depending on the wealth of a household. Wealthy households can, of course, absorb a shock much easier than poorer households. Thus, it’s important to think about how economic downturns impact various groups within the economy, and how policy can be used to offset the problems experienced by the most vulnerable among us.

When thinking about the effects of an increase in the Fed’s target interest rate, for example, it’s important to consider the impacts across income groups. I was very pleased to hear monetary policymakers talk about the asymmetric risks associated with increasing the interest rate too soon and slowing the recovery of employment and output, versus raising rates too late and risking inflation …

Which mistake is more costly – raising rates too soon versus too late – is not just a technical question about which of the two mistakes is easiest for policymakers to reverse. We also need to ask who will be hurt the most if the Fed makes a policy error on one side or the other. If the Fed raises rates too soon, it is working class households who will be hurt the most by the slower recovery of employment. If it raises rates too late, allowing a period of elevated inflation, it is largely those who lend money, i.e. the wealthy, who will feel the impact. Thus, one mistake mostly affects working class households who are very vulnerable to negative shocks, while the other hurts those most able to withstand economic problems.
I don’t mean to pick on monetary policymakers. I have no doubt that monetary policymakers think about how the Fed’s policies will impact different income groups even if this is not explicit in their discussion of policy options. I also have no doubt that fiscal policymakers think about how their policies impact various income groups. For example, the whole idea behind “trickle-down” economics is that tax cuts motivate those at the top of the income distribution to undertake new economic initiatives that benefit working class households.

Somehow, by helping those least in need of help – the wealthy – we will end up helping those who need it the most. It hasn’t worked in practice — not much trickled down after all, but that hasn’t stopped conservatives from making these claims. Conservatives also think about the impact of fiscal policy on lower income groups and use this as a reason to block or scale back programs such as unemployment compensation or food stamps …

Why do we hear so much about the need to raise interest rates now rather than later, or get the deficit under control immediately despite the risks to households who are most vulnerable to an economic downturn? Those who are most in need – those least able to withstand a spell of unemployment or other negative economic events – have the least power in our political system …

If we are going to be a fair and just society, a society that protects those among us who are the most vulnerable to economic shocks, this needs to change. The necessary change won’t come easily; entrenched political and economic interests will be difficult to dislodge.

But the current trend of rising inequality in both the economic and political arenas, along with the rising economic risks faced by working class households due to globalization, technological change, and a political system that increasingly neglects their interests, is not sustainable. If these trends continue unabated, change will come one way or the other. The only question is how.

Mark Thoma

‘Infinite populations’ and other econometric fictions masquerading as science

1 October, 2014 at 17:24 | Posted in Statistics & Econometrics | Leave a comment

In econometrics one often gets the feeling that many of its practitioners think of it as a kind of automatic inferential machine: input data and out comes causal knowledge. This is like pulling a rabbit from a hat. Great — but first you have to put the rabbit in the hat. And this is where assumptions come into the picture.

The assumption of imaginary “superpopulations” is one of the many dubious assumptions used in modern econometrics, and as Clint Ballinger has highlighted, it is a particularly questionable rabbit-pulling assumption:

Inferential statistics are based on taking a random sample from a larger population … and attempting to draw conclusions about a) the larger population from that data and b) the probability that the relations between measured variables are consistent or are artifacts of the sampling procedure.

However, in political science, economics, development studies and related fields the data often represents as complete an amount of data as can be measured from the real world (an ‘apparent population’). It is not the result of a random sampling from a larger population. Nevertheless, social scientists treat such data as the result of random sampling.

Because there is no source of further cases a fiction is propagated—the data is treated as if it were from a larger population, a ‘superpopulation’ where repeated realizations of the data are imagined. Imagine there could be more worlds with more cases and the problem is fixed …

What ‘draw’ from this imaginary superpopulation does the real-world set of cases we have in hand represent? This is simply an unanswerable question. The current set of cases could be representative of the superpopulation, and it could be an extremely unrepresentative sample, a one in a million chance selection from it …

The problem is not one of statistics that need to be fixed. Rather, it is a problem of the misapplication of inferential statistics to non-inferential situations.
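To make Ballinger’s point concrete, consider the following small sketch (the data and variable names are purely hypothetical). We run an ordinary OLS regression on a “complete” cross-section, every unit there is with no larger real population behind it, and the software still happily reports standard errors and p-values. Those numbers are only meaningful if the data are imagined as one draw from a superpopulation of worlds that might have been:

```python
import numpy as np

rng = np.random.default_rng(42)

# A hypothetical *complete* cross-section: say, all 50 units there are.
# There is no larger real population from which these units were sampled.
n = 50
x = rng.normal(size=n)                  # some observed covariate
y = 1.0 + 0.5 * x + rng.normal(size=n)  # some observed outcome

# Ordinary least squares by hand
X = np.column_stack([np.ones(n), x])
beta = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))

print("coefficients:    ", beta)
print("standard errors: ", se)
# The standard errors quantify how beta would vary across repeated random
# samples. With a complete 'apparent population' no such repeated sampling
# exists: the reported variability is variability across imaginary draws
# from a superpopulation.
```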

As social scientists – and economists – we have to confront the all-important question of how to handle uncertainty and randomness. Should we define randomness with probability? If we do, we have to accept that to speak of randomness we also have to presuppose the existence of nomological probability machines, since probabilities cannot be spoken of – and actually, to be strict, do not at all exist – without specifying such system-contexts. Accepting Haavelmo’s domain of probability theory and sample space of infinite populations – just like Fisher’s “hypothetical infinite population, of which the actual data are regarded as constituting a random sample”, von Mises’s “collective” or Gibbs’s “ensemble” – also implies that judgments are made on the basis of observations that are actually never made!

Infinitely repeated trials or samplings never take place in the real world. So that cannot be a sound inductive basis for a science with aspirations of explaining real-world socio-economic processes, structures or events. It’s not tenable.

As David Salsburg once noted – in his lovely The Lady Tasting Tea – on probability theory:

[W]e assume there is an abstract space of elementary things called ‘events’ … If a measure on the abstract space of events fulfills certain axioms, then it is a probability. To use probability in real life, we have to identify this space of events and do so with sufficient specificity to allow us to actually calculate probability measurements on that space … Unless we can identify [this] abstract space, the probability statements that emerge from statistical analyses will have many different and sometimes contrary meanings.

Just like e.g. John Maynard Keynes and Nicholas Georgescu-Roegen, Salsburg is very critical of the way social scientists – including economists and econometricians – have uncritically and without argument come to simply assume that probability distributions from statistical theory can be applied to their own areas of research:

Probability is a measure of sets in an abstract space of events. All the mathematical properties of probability can be derived from this definition. When we wish to apply probability to real life, we need to identify that abstract space of events for the particular problem at hand … It is not well established when statistical methods are used for observational studies … If we cannot identify the space of events that generate the probabilities being calculated, then one model is no more valid than another … As statistical models are used more and more for observational studies to assist in social decisions by government and advocacy groups, this fundamental failure to be able to derive probabilities without ambiguity will cast doubt on the usefulness of these methods.

This importantly also means that if you cannot show that the data satisfy all the conditions of the probabilistic nomological machine – including e.g. that the distribution of the deviations corresponds to a normal curve – then the statistical inferences used lack sound foundations.
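Checking those conditions is, moreover, harder than it sounds. The following sketch (with made-up numbers) drives home a point Freedman makes below: the standard diagnostic tests often cannot detect even a substantial failure of the normality assumption at the sample sizes common in applied work:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Deviations drawn from a heavy-tailed t(4) distribution -- a substantial
# departure from the normal curve that the inference machinery presupposes.
n, reps, rejections = 30, 2000, 0
for _ in range(reps):
    e = rng.standard_t(df=4, size=n)
    _, pval = stats.jarque_bera(e)   # the usual normality test in econometrics
    rejections += pval < 0.05

print(f"non-normality detected in {rejections / reps:.0%} of samples of size {n}")
# Many of these clearly non-normal samples pass the test anyway: the
# diagnostic lacks the power to reliably flag a substantial failure
# of the assumption at this sample size.
```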

In his great book Statistical Models and Causal Inference: A Dialogue with the Social Sciences, David Freedman also touched on these fundamental problems, arising when you try to apply statistical models outside overly simple nomological machines like coin tossing and roulette wheels (emphasis added):

Lurking behind the typical regression model will be found a host of such assumptions; without them, legitimate inferences cannot be drawn from the model. There are statistical procedures for testing some of these assumptions. However, the tests often lack the power to detect substantial failures. Furthermore, model testing may become circular; breakdowns in assumptions are detected, and the model is redefined to accommodate. In short, hiding the problems can become a major goal of model building.

Using models to make predictions of the future, or the results of interventions, would be a valuable corrective. Testing the model on a variety of data sets – rather than fitting refinements over and over again to the same data set – might be a good second-best … Built into the equation is a model for non-discriminatory behavior: the coefficient d vanishes. If the company discriminates, that part of the model cannot be validated at all.

Regression models are widely used by social scientists to make causal inferences; such models are now almost a routine way of demonstrating counterfactuals. However, the “demonstrations” generally turn out to depend on a series of untested, even unarticulated, technical assumptions. Under the circumstances, reliance on model outputs may be quite unjustified. Making the ideas of validation somewhat more precise is a serious problem in the philosophy of science. That models should correspond to reality is, after all, a useful but not totally straightforward idea – with some history to it. Developing appropriate models is a serious problem in statistics; testing the connection to the phenomena is even more serious …

In our days, serious arguments have been made from data. Beautiful, delicate theorems have been proved, although the connection with data analysis often remains to be established. And an enormous amount of fiction has been produced, masquerading as rigorous science.

And as if this wasn’t enough, one could — as we’ve seen — also seriously wonder what kind of “populations” these statistical and econometric models ultimately are based on. Why should we as social scientists – and not as pure mathematicians working with formal-axiomatic systems without the urge to confront our models with real target systems – unquestioningly accept Haavelmo’s “infinite population”, Fisher’s “hypothetical infinite population”, von Mises’s “collective” or Gibbs’s “ensemble”?

Of course one could treat our observational or experimental data as random samples from real populations. I have no problem with that. But probabilistic econometrics does not content itself with that kind of population. Instead it creates imaginary populations of “parallel universes” and assumes that our data are random samples from them.

But this is actually nothing but hand-waving! And it is inadequate for real science. As David Freedman writes in Statistical Models and Causal Inference (emphasis added):

With this approach, the investigator does not explicitly define a population that could in principle be studied, with unlimited resources of time and money. The investigator merely assumes that such a population exists in some ill-defined sense. And there is a further assumption, that the data set being analyzed can be treated as if it were based on a random sample from the assumed population. These are convenient fictions … Nevertheless, reliance on imaginary populations is widespread. Indeed regression models are commonly used to analyze convenience samples … The rhetoric of imaginary populations is seductive because it seems to free the investigator from the necessity of understanding how data were generated.

In social sciences — including economics — it’s always wise to ponder C. S. Peirce’s remark that universes are not as common as peanuts …

Serially-correlated shocks and DSGE models (wonkish)

30 September, 2014 at 15:02 | Posted in Economics | 1 Comment

Now it is “dynamic stochastic general equilibrium” (DSGE) models inspired by the Lucas critique that have failed to predict or even explain the Great Recession of 2007–2009. More precisely, the implicit “explanations” based on these models are that the recession, including the millions of net jobs lost, was primarily due to large negative shocks to both technology and willingness to work … So can the reputation of modern macroeconomics be rehabilitated by simply modifying DSGE models to include a few more realistic shocks? …

A simple example helps illustrate for the uninitiated just how DSGE models work and why it should come as little surprise that they are largely inadequate for the task of explaining the Great Recession.

For this simple DSGE model, consider the following technical assumptions: i) an infinitely-lived representative agent with rational expectations and additive utility in current and discounted future log consumption and leisure; ii) a Cobb-Douglas aggregate production function with labor-augmenting technology; iii) capital accumulation with a fixed depreciation rate; and iv) a stochastic process for exogenous technology shocks …

It is worth making two basic points about the setup. First, by construction, technology shocks are the only underlying source of fluctuations in this simple model. Thus, if we were to assume that U.S. real GDP was the literal outcome of this model, we would be assuming a priori that fluctuations in real GDP were ultimately due to technology. When faced with the Great Recession, this model would have no choice but to imply that technology shocks were somehow to blame. Second, despite the underlying role of technology, the observed fluctuations in real GDP can be divided into those that directly reflect the behavior of the exogenous shocks and those that reflect the endogenous capital accumulation in response to these shocks.

To be more precise about these two points, it is necessary to assume a particular process for the exogenous technology shocks. In this case, let’s assume technology follows a random walk with drift [and assuming a 100% depreciation rate of capital]…

So, with this simple DSGE model and for typical measures of the capital share, we have the implication that output growth follows an AR(1) process with an AR coefficient of about one third. This is notable given that such a time-series model does reasonably well as a parsimonious description of quarterly real GDP dynamics for the U.S. economy …

However, the rather absurd assumption of a 100% depreciation rate at the quarterly horizon would surely still have prompted a sharp question or two in a University of Chicago seminar back in the day. So, with this in mind, what happens if we consider the more general case?

Unfortunately, for more realistic depreciation rates, we cannot solve the model analytically. Instead, taking a log-linearization around steady state, we can use standard methods to solve for output growth … This simple DSGE model is able to mimic the apparent AR(1) dynamics in real GDP growth. But it does so by assuming the exogenous technology shocks also follow an AR(1) process with an AR coefficient that happens to be the same as the estimated AR coefficient for output growth. Thus, the magic trick has been revealed: a rabbit was stuffed into the hat and then a rabbit jumped out of the hat …

Despite their increasing sophistication, DSGE models share one key thing in common with their RBC predecessors. After more than two decades of earnest promises to do better in the “future directions” sections of academic papers, they still have those serially-correlated shocks. Thus, the models now “explain” variables like real GDP, inflation, and interest rates as the outcome of more than just serially-correlated technology shocks. They also consider serially-correlated preference shocks and serially-correlated policy shocks …

James Morley

[h/t Brad DeLong & Merijn Knibbe]
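Morley’s “about one third” is easy to reconstruct. With log utility and full depreciation, the optimal policy of the model sketched above is the textbook constant-saving rule $K_{t+1} = \alpha\beta Y_t$, where $\alpha$ is the capital share and $\beta$ the discount factor (standard notation, not taken from the excerpt). Combining it with $Y_t = A_t K_t^{\alpha}$ gives

$$\ln Y_{t+1} = \ln A_{t+1} + \alpha \ln(\alpha\beta) + \alpha \ln Y_t ,$$

and differencing, with technology a random walk with drift, $\Delta \ln A_{t+1} = \mu + \varepsilon_{t+1}$,

$$\Delta \ln Y_{t+1} = \mu + \alpha\, \Delta \ln Y_t + \varepsilon_{t+1} ,$$

so output growth is an AR(1) with autoregressive coefficient equal to the capital share, $\alpha \approx 1/3$.

For readers who want to watch the rabbit go into the hat, here is a small simulation sketch of that special case; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

alpha, beta, mu, sigma = 0.33, 0.99, 0.005, 0.01   # illustrative parameter values
T = 200_000                                        # long sample, to pin down the estimate

# Technology: random walk with drift in logs
eps = rng.normal(0.0, sigma, size=T)
lnA = np.cumsum(mu + eps)

# Log utility + 100% depreciation: invest the constant fraction alpha*beta of
# output, so K_{t+1} = alpha*beta*Y_t and Y_t = A_t * K_t**alpha.
lnY = np.empty(T)
lnK = 0.0                                          # K_0 = 1
for t in range(T):
    lnY[t] = lnA[t] + alpha * lnK
    lnK = np.log(alpha * beta) + lnY[t]

# Fit an AR(1) to output growth by least squares
g = np.diff(lnY)
ar1 = np.polyfit(g[:-1], g[1:], 1)[0]
print(f"estimated AR(1) coefficient for output growth: {ar1:.3f}")   # close to alpha = 0.33
```

With serially uncorrelated innovations to technology growth, the estimated coefficient comes out close to the capital share; getting anything richer out of the model means putting richer serial correlation into the shocks, which is precisely Morley’s point.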

Neoclassical economics and neoliberalism — two varieties of market fundamentalism

30 September, 2014 at 13:06 | Posted in Economics | Leave a comment

Oxford professor Simon Wren-Lewis had a post up some time ago commenting on the traction gained by “attacks on mainstream economics”:

One frequent accusation … often repeated by heterodox economists, is that mainstream economics and neoliberal ideas are inextricably linked. Of course economics is used to support neoliberalism. Yet I find mainstream economics full of ideas and analysis that permits a wide ranging and deep critique of these same positions. The idea that the two live and die together is just silly.

Hmmm …

Silly? Maybe Wren-Lewis and other economists who want to enlighten themselves on the subject should take a look at this video:


Or maybe read this essay, where yours truly tries to further analyze — much inspired by the works of Amartya Sen — what kind of philosophical-ideological-economic doctrine neoliberalism is, and why embracing neoliberal ideals so often comes naturally to mainstream neoclassical economists.

Or — if you know some Swedish — you could take a look at this book on the connection between the dismal science and neoliberalism (sorry for the shameless self-promotion).

Statistics — a question of life and death

30 September, 2014 at 12:40 | Posted in Statistics & Econometrics | 1 Comment

In 1997, Christopher, the eleven-week-old child of a young lawyer named Sally Clark, died in his sleep: an apparent case of Sudden Infant Death Syndrome (SIDS) … One year later, Sally’s second child, Harry, also died, aged just eight weeks. Sally was arrested and accused of killing the children. She was convicted of murdering them, and in 1999 was given a life sentence …

Now … I want to show how a simple mistaken assumption led to incorrect probabilities.

In this case the mistaken evidence came from Sir Roy Meadow, a paediatrician. Despite not being an expert statistician or probabilist, he felt able to make a statement about probabilities … He asserted that the probability of two SIDS deaths in a family like Sally Clark’s was 1 in 73 million. A probability as small as this suggests we might apply Borel’s law: we shouldn’t expect to see an improbable event …

Unfortunately, however, Meadow’s 1 in 73 million probability is based on a crucial assumption: that the deaths are independent; that one such death in a family does not make it more or less likely that there will be another …

Now … that assumption does seem unjustified: data show that if one SIDS death has occurred, then a subsequent child is about ten times more likely to die of SIDS … To arrive at a valid conclusion, we would have to compare the probability that the two children had been murdered with the probability that they had both died from SIDS … There is a factor-of-ten difference between Meadow’s estimate and the estimate based on recognizing that SIDS events in the same family are not independent, and that difference shifts the probability from favouring homicide to favouring SIDS deaths …

Following widespread criticism of the misuse and indeed misunderstanding of statistical evidence, Sally Clark’s conviction was overturned, and she was released in 2003.
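The arithmetic behind the quote is easy to reproduce. Meadow’s 1 in 73 million came from squaring a roughly 1-in-8,543 figure for a single SIDS death in a family like the Clarks’; the sketch below redoes the calculation with and without the independence assumption (the dependence factor of ten is the one mentioned above, and no attempt is made to put a number on the double-homicide alternative):

```python
# Back-of-the-envelope sketch of the Sally Clark probabilities.
p_sids = 1 / 8_543          # single SIDS death in a family like the Clarks'
dependence_factor = 10      # a second SIDS death is roughly 10x more likely after a first

p_independent = p_sids ** 2                           # Meadow's calculation
p_dependent = p_sids * (dependence_factor * p_sids)   # allowing for dependence

print(f"assuming independence: 1 in {1 / p_independent:,.0f}")   # ~1 in 73 million
print(f"allowing dependence  : 1 in {1 / p_dependent:,.0f}")     # ~1 in 7.3 million

# Even the corrected number answers the wrong question. What matters for the
# verdict is the comparison of hypotheses -- P(two SIDS deaths) versus
# P(two infant homicides) -- both of which are very small. Quoting only how
# small the SIDS probability is invites the prosecutor's fallacy.
```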

NAIRU — a failed metaphor legitimizing austerity policies

29 September, 2014 at 13:09 | Posted in Economics, Politics & Society | 1 Comment

In our extended NAIRU model, labor productivity growth is included in the wage bargaining process … The logical consequence of this broadening of the theoretical canvas has been that the NAIRU becomes endogenous itself and ceases to be an attractor — Milton Friedman’s natural, stable and timeless equilibrium point from which the system cannot permanently deviate. In our model, a deviation from the initial equilibrium affects not only wages and prices (keeping the rest of the system unchanged) but also demand, technology, workers’ motivation, and work intensity; as a result, productivity growth and ultimately equilibrium unemployment will change. There is, in other words, nothing natural or inescapable about equilibrium unemployment, as is Friedman’s presumption, following Wicksell; rather, the NAIRU is a social construct, fluctuating in response to fiscal and monetary policies and labor market interventions. Its ephemeral (rather than structural) nature may explain why the best economists working on the NAIRU have persistently failed to agree on how high the NAIRU actually is and how to estimate it.

Servaas Storm & C. W. M. Naastepad

NAIRU has been the subject of much heated discussion and debate in Sweden lately, after SVT, the Swedish national public TV broadcaster, aired a documentary on NAIRU and the fact that many politicians — and economists — subscribe to the NAIRU story and its policy implication that attempts to promote full employment are doomed to fail, since governments and central banks can’t push unemployment below the critical NAIRU threshold without causing harmful runaway inflation.

One of the main problems with NAIRU is that it is essentially a timeless long-run equilibrium attractor to which actual unemployment (allegedly) has to adjust. But if that equilibrium is itself changing — and in ways that depend on the process of getting to the equilibrium — well, then we can’t really be sure what that equilibrium will be without contextualizing unemployment in real historical time. And when we do, we will — as highlighted by Storm and Naastepad — see how seriously wrong we go if we omit demand from the analysis. Demand policy has long-run effects and matters also for structural unemployment — and governments and central banks can’t just look the other way and legitimize their passivity re unemployment by referring to NAIRU.
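One stylized way to write down this endogeneity (purely illustrative notation, not Storm and Naastepad’s own model) is to let the “equilibrium” rate itself respond to where actual unemployment has recently been:

$$u^{*}_{t} = u^{*}_{t-1} + \lambda\,(u_{t-1} - u^{*}_{t-1}), \qquad 0 < \lambda \leq 1 .$$

With any $\lambda > 0$ the supposed attractor chases actual unemployment rather than anchoring it: a demand-driven slump that keeps $u$ above $u^{*}$ for a while drags $u^{*}$ up with it, and a demand expansion pulls it back down. The “natural” rate then carries the scars of past policy, which is one reason estimates of it refuse to sit still.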

NAIRU does not hold water simply because it does not exist — and to base economic policy on such a weak theoretical and empirical construct is nothing short of writing out a prescription for self-inflicted economic havoc.

New Keynesianism — a macroeconomic cul-de-sac

28 September, 2014 at 15:55 | Posted in Economics | Leave a comment

Macroeconomic models may be an informative tool for research. But if practitioners of “New Keynesian” macroeconomics do not investigate and make an effort to justify the credibility of the assumptions on which they erect their building, it will not fulfill its tasks. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics will continue to consider its ultimate argument a mixture of rather unhelpful metaphors and metaphysics. Maintaining that economics is a science in the “true knowledge” business, I remain a skeptic of the pretences and aspirations of “New Keynesian” macroeconomics. So far, I cannot really see that it has yielded very much in terms of realistic and relevant economic knowledge.

Keynes basically argued that it was inadmissible to project history onto the future. Consequently an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different “variables” was not enough. If they could not get at the causal structure that generated the data, they were not really “identified”. Dynamic stochastic general equilibrium (DSGE) macroeconomists – including “New Keynesians” – have drawn the conclusion that the way to deal with unstable relations is to construct models with clear microfoundations, where forward-looking optimizing individuals and robust, deep, behavioural parameters are seen to be stable even to changes in economic policies. As yours truly has argued in a couple of posts (e.g. here and here), this, however, is a dead end.

Here we are getting close to the heart of darkness in “New Keynesian” macroeconomics. When “New Keynesian” economists think that they can rigorously deduce the aggregate effects of (representative) actors with their reductionist microfoundational methodology, they have to turn a blind eye to the emergent properties that characterize all open social systems – including the economic system. The interaction between animal spirits, trust, confidence, institutions etc. cannot be deduced or reduced to a question answerable at the individual level. Macroeconomic structures and phenomena have to be analyzed also on their own terms. And although one may easily agree with e.g. Paul Krugman’s emphasis on simple models, the simplifications used may have to be simplifications adequate for macroeconomics, not those adequate for microeconomics.

In microeconomics we know that aggregation really presupposes homothetic and identical preferences, something that almost never exists in real economies. The results given by these assumptions are therefore not robust and do not capture the underlying mechanisms at work in any real economy. And models that are critically based on particular and odd assumptions – and are neither robust nor congruent with real-world economies – are of questionable value.
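A two-consumer example (my own illustration, not anyone’s published result) shows why this matters. Let consumers $A$ and $B$ have Cobb-Douglas preferences over goods 1 and 2 with expenditure shares $a \neq b$ and incomes $m_A$ and $m_B$. Their demands for good 1 are $x_1^A = a\,m_A/p_1$ and $x_1^B = b\,m_B/p_1$, so aggregate demand is

$$X_1 = \frac{a\,m_A + b\,m_B}{p_1},$$

which depends not just on aggregate income $m_A + m_B$ but on how that income is distributed. Only if $a = b$, i.e. identical homothetic preferences, does aggregate demand behave as if it came from a single representative consumer, $X_1 = a(m_A + m_B)/p_1$.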

Even if economies naturally presuppose individuals, it does not follow that we can infer or explain macroeconomic phenomena solely from knowledge of these individuals. Macroeconomics is to a large extent emergent and cannot be reduced to a simple summation of micro-phenomena. Moreover, even these microfoundations aren’t immutable. The “deep parameters” of “New Keynesian” DSGE models – “tastes” and “technology” – are not really the bedrock of constancy that they believe (pretend) them to be.

So I cannot concur with Paul Krugman, Mike Woodford, Greg Mankiw and other sorta-kinda “New Keynesians” when they more or less try to reduce Keynesian economics to “intertemporal maximization modified with sticky prices and a few other deviations”. As John Quiggin so aptly writes:

If there is one thing that distinguished Keynes’ economic analysis from that of his predecessors, it was his rejection of the idea of a unique full employment equilibrium to which a market economy will automatically return when it experiences a shock. Keynes argued that an economy could shift from a full-employment equilibrium to a persistent slump as the result of the interaction between objective macroeconomic variables and the subjective ‘animal spirits’ of investors and other decision-makers. It is this perspective that has been lost in the absorption of New Keynesian macro into the DSGE framework.

Sacrifice

27 September, 2014 at 21:12 | Posted in Varia | Leave a comment

 

Reforming economics curriculum

27 September, 2014 at 20:32 | Posted in Economics | Leave a comment

When the global economy crashed in 2008, the list of culprits was long, including dozy regulators, greedy bankers and feckless subprime borrowers. Now the dismal science itself is in the dock, with much soul-searching over why economists failed to predict the financial crisis. One of the outcomes of this debate is that economics students are demanding the reform of a curriculum they think sustains a selfish strain of capitalism and is dominated by abstract mathematics. It looks like the students will get their way. A new curriculum, designed at the University of Oxford, is being tried out. This is good news. …

The typical economics course starts with the study of how rational agents interact in frictionless markets, producing an outcome that is best for everyone. Only later does it cover those wrinkles and perversities that characterise real economic behaviour, such as anti-competitive practices or unstable financial markets. As students advance, there is a growing bias towards mathematical elegance. When the uglier real world intrudes, it only prompts the question: this is all very well in practice but how does it work in theory? …

Fortunately, the steps needed to bring economics teaching into the real world do not require the invention of anything new or exotic. The curriculum should embrace economic history and pay more attention to unorthodox thinkers such as Joseph Schumpeter, Friedrich Hayek and – yes – even Karl Marx. Faculties need to restore links with other fields such as psychology and anthropology, whose insights can explain phenomena that economics cannot. Economics professors should make the study of imperfect competition – and of how people act in conditions of uncertainty – the starting point of courses, not an afterthought. …

Economics should not be taught as if it were about the discovery of timeless laws. Those who champion the discipline must remember that, at its core, it is about human behaviour, with all the messiness and disorder that this implies.

Financial Times
