Good reasons to worry about inequalities

25 July, 2016 at 13:22 | Posted in Economics, Politics & Society | Leave a comment

Focussing upon inequality statistics … misses an important point. What matters is not just the level of income inequality, but how that inequality arose. A free market society in which high incomes arise from the free choices of consenting adults – as in Robert Nozick’s Wilt Chamberlain parable – might have the same Gini coefficient as a crony capitalist society. But they are two different things. A good reason to be worried about current inequality – even if it hasn’t changed – is that it is a symptom of market failures such as corporate welfare, regulatory capture or the implicit subsidy to banks.

In this context, what matters is not just inequalities of income but inequalities of power. Top footballers and top bankers might be earning similar sums, but one’s salary is the product of market forces and the other of a taxpayer subsidy. The freelancer on £30,000 who’s worrying where his next contract is coming from has a similar income to the bullying middle managers who created intolerable working conditions at (for example) Sports Direct. But they have very different degrees of economic power. And the low income that results from having to take a lousy job where your wages are topped up by tax credits gives you much less power than the same income would if it came from a basic income and the freer choice to take or leave a low-wage job.

My point here is a simple one. There are very good reasons why we should worry about inequality – not just leftists but also rightists who want freer markets and “bourgeois” virtues. Focusing only upon the stability of the Gini coefficient is a form of statistical fetishism which overlooks important questions.

Chris Dillow
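
Dillow’s statistical point is easy to make concrete. Here is a minimal sketch (Python, with made-up income numbers) of how blind the Gini coefficient is to the question of how inequality arose: two societies with the very same incomes, one a ‘Wilt Chamberlain’ market society and one a crony-capitalist society, necessarily get the very same Gini.

```python
def gini(incomes):
    """Gini coefficient: 0 = perfect equality, 1 = maximal inequality."""
    xs = sorted(incomes)
    n, total = len(xs), sum(xs)
    # G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n, with i = 1..n over sorted incomes
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2 * weighted / (n * total) - (n + 1) / n

# Hypothetical incomes, purely illustrative.
market_society = [20_000, 25_000, 30_000, 40_000, 300_000]  # top income: star salary
crony_society = [20_000, 25_000, 30_000, 40_000, 300_000]   # top income: subsidised rent

print(round(gini(market_society), 2))  # ~0.55
print(round(gini(crony_society), 2))   # identical: the statistic cannot see the difference
```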

Lucas-Rapping and ‘New Keynesian’ models of unemployment

25 July, 2016 at 12:06 | Posted in Economics | Leave a comment

Lucas and Rapping (1969) claim that cyclical increases in unemployment occur when workers quit their jobs because wages or salaries fall below expectations …

According to this explanation, when wages are unusually low, people become unemployed in order to enjoy free time, substituting leisure for income at a time when they lose the least income …

According to the theory, quits into unemployment increase during recessions, whereas historically quits decrease sharply and roughly half of unemployed workers become jobless because they are laid off … During the recession I studied, people were even afraid to change jobs because new ones might prove unstable and lead to unemployment …

If wages and salaries hardly ever fall, the intertemporal substitution theory is widely applicable only if the unemployed prefer jobless leisure to continued employment at their old pay. However, the attitude and circumstances of the unemployed are not consistent with their having made this choice …

In real business cycle theory, unemployment is interpreted as leisure optimally selected by workers, as in the Lucas-Rapping model. It has proved difficult to construct business cycle models consistent with this assumption and with real wage fluctuations as small as they are in reality, relative to fluctuations in employment.

Truman F. Bewley

This is, of course, only what you would expect of New Classical Chicago economists.

But sadly enough this extraterrestrial view of unemployment is actually shared by so-called New Keynesians, whose microfounded dynamic stochastic general equilibrium models cannot even incorporate such a basic fact of reality as involuntary unemployment!

Of course, working with microfounded representative agent models, this should come as no surprise. If one representative agent is employed, all representative agents are. The kind of unemployment that occurs is voluntary, since it consists only of the adjustments to hours of work that these optimizing agents make to maximize their utility. In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence, unemployment is totally voluntary. To be unemployed is something one optimally chooses to be.
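
A minimal sketch of the mechanism being criticised here (assumed GHH-style utility and hypothetical parameters, not any particular published model): the representative agent’s optimal hours fall smoothly when the real wage falls, so every reduction in work is, by construction, a voluntary substitution of leisure for income.

```python
# Representative agent chooses hours h to maximise u = w*h - h**(1 + phi)/(1 + phi).
# First-order condition: w = h**phi, hence optimal hours h* = w**(1/phi).
# Functional form and parameter value are assumptions made for illustration.

phi = 2.0  # curvature of the disutility of work (hypothetical)

def optimal_hours(real_wage):
    return real_wage ** (1 / phi)

for w in (1.0, 0.8, 0.5):
    print(f"real wage {w:.1f} -> chosen hours {optimal_hours(w):.2f}")

# Hours shrink as the wage falls. In this model world a recession is simply
# many such 'choices' of extra leisure; involuntary unemployment cannot occur.
```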

If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

To Keynes this was self-evident. But obviously not so to New Classical and ‘New Keynesian’ economists.

Thorbjörn Fälldin — a giant has passed away

25 July, 2016 at 10:48 | Posted in Politics & Society | Leave a comment


Thorbjörn Fälldin, Sweden’s former prime minister and leader of the Centre Party, died in his home in Ramvik on Saturday evening, 90 years old.

During the years of Thorbjörn Fälldin and his predecessor Gunnar Hedlund, the Centre Party still represented a genuine political force in our society.

After Fälldin the Centre Party turned the page, and in due course we got Maud Olofsson.

With Olofsson the party’s soul was sold out, and its slide into neoliberalism began.

Then came Annie Lööf. And a once respected party was run completely into the ground.

That a neoliberal, self-important, platitude-spouting, Ayn Rand and Margaret Thatcher worshipping broiler politician can today preside over a party once led by giants like Hedlund and Fälldin is utterly monstrous. Incomprehensible. And sad.

New Keynesian unemployment — essentially a paid vacation!

24 July, 2016 at 10:45 | Posted in Economics | Leave a comment

Franco Modigliani famously quipped that he did not think that unemployment during the Great Depression should be described, in an economic model, as a “sudden bout of contagious laziness”. Quite. For the past thirty years we have been debating whether to use classical real business cycle models (RBC), or their close cousins, modern New Keynesian (NK) models, to describe recessions. In both of these models, the social cost of persistent unemployment is less than a half a percentage point of steady state consumption.

What does that mean? Median US consumption is roughly $30,000 a year. One half of one percent of this is roughly 50 cents a day. A person inhabiting one of our artificial RBC or NK model worlds would not be willing to pay more than 50 cents a day to avoid another Great Depression. That is true of real business cycle models. It is also true of New Keynesian models …

That’s why I eschew NK and RBC models. They are both wrong. The high unemployment that follows a financial crisis is not the socially efficient response to technology shocks. And the slow recovery from a financial melt-down has nothing to do with the costs of reprinting menus that underpins the models of NK economists. It is a potentially permanent failure of private agents to coordinate on an outcome that is socially desirable.

Roger Farmer
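
A back-of-the-envelope check of Farmer’s arithmetic (his own round figures):

```python
median_consumption = 30_000   # Farmer's rough figure: median US consumption, USD/year
welfare_cost_share = 0.005    # half of one percent of steady-state consumption
cost_per_day = welfare_cost_share * median_consumption / 365
print(f"${cost_per_day:.2f} per day")  # ~$0.41, i.e. roughly 50 cents a day
```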


In the basic DSGE models used by both New Classical and ‘New Keynesian’ macroeconomists, the labour market is always cleared – responding to a changing interest rate, expected lifetime income, or real wages, the representative agent maximizes the utility function by varying her labour supply, money holdings and consumption over time. Most importantly – if the real wage somehow deviates from its ‘equilibrium value,’ the representative agent adjusts her labour supply, so that when the real wage is higher than its ‘equilibrium value,’ labour supply is increased, and when the real wage is below its ‘equilibrium value,’ labour supply is decreased.

In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence, unemployment is totally voluntary. To be unemployed is something one optimally chooses to be — a kind of prolonged vacation.

Although this picture of unemployment as a kind of self-chosen optimality strikes most people as utterly ridiculous, there are also, unfortunately, a lot of neoclassical economists out there who still think that price and wage rigidities are the prime movers behind unemployment. DSGE models basically explain variations in employment (and a fortiori output) by assuming that nominal wages are more flexible than prices – disregarding the lack of empirical evidence for this rather counterintuitive assumption.

Lowering nominal wages would not clear the labour market. Lowering wages – and possibly prices – could, perhaps, lower interest rates and increase investment. But it would be much easier to achieve that effect by increasing the money supply. In any case, wage reductions were not seen as a general substitute for an expansionary monetary or fiscal policy. And even if potentially positive impacts of lowering wages exist, there are also weightier negative impacts – deteriorating management-union relations, expectations of continued wage cuts causing investment to be delayed, debt deflation, et cetera.

The classical proposition that lowering wages would lower unemployment and ultimately take economies out of depressions was ill-founded and basically wrong. Flexible wages would probably only make things worse by leading to erratic price fluctuations. The basic explanation for unemployment is insufficient aggregate demand, and that is mostly determined outside the labour market.

Obviously it’s rather embarrassing that the kind of DSGE models ‘modern’ macroeconomists use cannot incorporate such a basic fact of reality as involuntary unemployment. Of course, working with representative agent models, this should come as no surprise. The kind of unemployment that occurs is voluntary, since it consists only of the adjustments to hours of work that these optimizing agents make to maximize their utility.

And as if this nonsense economics were not enough: in New Classical and ‘New Keynesian’ DSGE models, increases in government spending lead to a drop in private consumption!

How on earth does one arrive at such a bizarre view?

In the most basic mainstream proto-DSGE models one often assumes that governments finance current expenditures with current tax revenues. This has a negative income effect on households, leading — rather counterintuitively — to a drop in private consumption although both employment and production expand. This mechanism also holds when the (in)famous Ricardian equivalence is added to the models.

Ricardian equivalence basically means that financing government expenditures through taxes or debt is equivalent, since debt financing must be repaid with interest, and agents — equipped with rational expectations — would simply increase their savings in order to be able to pay the higher taxes in the future, thus leaving total expenditure unchanged.

Why?

In the standard neoclassical consumption model — used in DSGE macroeconomic modeling — people are basically portrayed as treating time as a dichotomous phenomenon — today and the future — when contemplating making decisions and acting. How much should one consume today and how much in the future? Facing an intertemporal budget constraint of the form

$$c_t + \frac{c_f}{1+r} = f_t + y_t + \frac{y_f}{1+r},$$

where $c_t$ is consumption today, $c_f$ is consumption in the future, $f_t$ is holdings of financial assets today, $y_t$ is labour income today, $y_f$ is labour income in the future, and $r$ is the real interest rate, and having a lifetime utility function of the form

$$U = u(c_t) + a\,u(c_f),$$

where $a$ is the time-discounting parameter, the representative agent (consumer) maximizes his utility when

$$u'(c_t) = a(1+r)\,u'(c_f).$$

This expression — the Euler equation — implies that at the optimum the representative agent (consumer) is indifferent at the margin between consuming one more unit today and saving it to consume tomorrow. Using the typical logarithmic functional form, $u(c) = \log c$, which gives $u'(c) = 1/c$, the Euler equation can be rewritten as

$$\frac{1}{c_t} = a(1+r)\,\frac{1}{c_f},$$

or

$$\frac{c_f}{c_t} = a(1+r).$$

This importantly implies that, according to the neoclassical consumption model, changes in the (real) interest rate and consumption growth move in the same direction. It also follows that consumption is invariant to the timing of taxes, since wealth — $f_t + y_t + y_f/(1+r)$ — has to be interpreted as the present discounted value net of taxes. And so, according to the assumption of Ricardian equivalence, the timing of taxes does not affect consumption, simply because the maximization problem as specified in the model is unchanged. As a result, households cut down on their consumption when governments increase their spending. Mirabile dictu!
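
To see the mechanics in numbers, here is a minimal sketch of the two-period model above with log utility (all parameter values are hypothetical). It verifies the Euler equation and shows why shifting a given tax burden between today and the future leaves consumption completely unchanged:

```python
# Two-period consumer: max log(c_t) + a*log(c_f)
# s.t. c_t + c_f/(1+r) = W, where W = f_t + y_t + y_f/(1+r).
# With log utility: c_t = W/(1+a) and c_f = a*(1+r)*W/(1+a).
# All parameter values below are hypothetical, chosen only for illustration.

a, r = 0.95, 0.05
f_t, y_t, y_f = 10.0, 100.0, 100.0

def consumption(tax_today=0.0, tax_future=0.0):
    wealth = f_t + (y_t - tax_today) + (y_f - tax_future) / (1 + r)
    c_t = wealth / (1 + a)
    c_f = a * (1 + r) * wealth / (1 + a)
    return c_t, c_f

c_t, c_f = consumption()
print(c_f / c_t, a * (1 + r))  # Euler equation holds: both equal 0.9975

# 'Ricardian equivalence' in this model: a tax of 20 today, or a debt-financed
# tax of 20*(1+r) in the future, has the same present value, so wealth W and
# therefore consumption are identical in both cases.
print(consumption(tax_today=20.0))
print(consumption(tax_future=20.0 * (1 + r)))
```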

Macroeconomic models have to abandon Ricardian equivalence nonsense. But replacing it with “overlapping generations” and “infinite-horizon” models is — in terms of realism and relevance — just getting out of the frying pan into the fire. All unemployment is still voluntary. Intertemporal substitution between labour and leisure is still ubiquitous. And the specification of the utility function is still hopelessly off the mark from an empirical point of view.

As one Nobel laureate had it:

Ricardian equivalence is taught in every graduate school in the country. It is also sheer nonsense.

Joseph E. Stiglitz, Twitter

And as one economics blogger has it:

New Classical and ‘New Keynesian’ DSGE modeling is taught in every graduate school in the country. It is also sheer nonsense.

Lars P Syll, Twitter

Economics vs. reality

24 July, 2016 at 09:41 | Posted in Economics | 1 Comment

Denicolò and Zanchettin, in an article published by the prestigious Economic Journal, claim to have shown among other things that “stronger patent protection may reduce innovation and growth.” As a prelude to forty pages of mathematics, they state of their model, “The economy is populated by L identical, infinitely lived, individuals … There is a unique final good in the economy that can be consumed, used to produce intermediate goods, or used in research …” Not only are all the people in this model world identical and immortal, they only produce a single product. The product has properties that are entirely unreal—not so much science fiction as pure magic. The conclusion may be justified, or not; but the idea that a model so remote from reality can be used to make public policy recommendations is, to anyone but a fully certified neoclassical economist, staggering.

The mathematization of economics since WW II has made mainstream — neoclassical — economists more or less obsessed with formal, deductive-axiomatic models. Confronted with the critique that they do not solve real problems, they often react as Saint-Exupéry‘s Great Geographer who, in response to the questions posed by The Little Prince, says that he is too occupied with his scientific work to be able to say anything about reality. Confronted with economic theory’s lack of relevance and inability to tackle real problems, one retreats into the wonderful world of economic models. One goes into the “shack of tools” — as my old mentor Erik Dahmén used to say — and stays there. While the economic problems in the world around us steadily increase, one is rather happily playing along with the latest toys in the mathematical toolbox.

Modern mainstream economics is, to be sure, very rigorous — but if it’s rigorously wrong, who cares?

Instead of making formal logical argumentation based on deductive-axiomatic models the message, I think we are better served by economists who more than anything else try to contribute to solving real problems. And then the motto of John Maynard Keynes is more valid than ever:

It is better to be vaguely right than precisely wrong

Racial bias in police shootings

23 July, 2016 at 18:23 | Posted in Politics & Society, Statistics & Econometrics | 3 Comments

Roland Fryer, an economics professor at Harvard University, recently published a working paper at NBER on the topic of racial bias in police use of force and police shootings. The paper gained substantial media attention – a write-up of it became the top viewed article on the New York Times website. The most notable part of the study was its finding that there was no evidence of racial bias in police shootings, which Fryer called “the most surprising result of [his] career”. In his analysis of shootings in Houston, Texas, black and Hispanic people were no more likely (and perhaps even less likely) to be shot relative to whites.

Fryer’s analysis is highly flawed, however … Fryer was not comparing rates of police shootings by race. Instead, his research asked whether these racial differences were the result of “racial bias” rather than merely “statistical discrimination”. Both terms have specific meanings in economics. Statistical discrimination occurs when an individual or institution treats people differently based on racial stereotypes that ‘truly’ reflect the average behavior of a racial group. For instance, if a city’s black drivers are 50% more likely to possess drugs than white drivers, and police officers are 50% more likely to pull over black drivers, economic theory would hold that this discriminatory policing is rational …

Once explained, it is possible to find the idea of “statistical discrimination” just as abhorrent as “racial bias”. One could point out that the drug laws police enforce were passed with racially discriminatory intent, that collectively punishing black people based on “average behavior” is wrong, or that – as a self-fulfilling prophecy – bias can turn into statistical discrimination (if black people’s cars are searched more thoroughly, for instance, it will appear that their rates of drug possession are higher) …

Even if one accepts the logic of statistical discrimination versus racial bias, it is an inappropriate choice for a study of police shootings. The method that Fryer employs has, for the most part, been used to study traffic stops and stop-and-frisk practices. In those cases, economic theory holds that police want to maximize the number of arrests for the possession of contraband (such as drugs or weapons) while expending the fewest resources. If they are acting in the most cost-efficient, rational manner, the officers may use racial stereotypes to increase the arrest rate per stop. This theory completely falls apart for police shootings, however, because officers are not trying to rationally maximize the number of shootings …

Economic theory aside, there is an even more fundamental problem with the Houston police shooting analysis. In a typical study, a researcher will start with a previously defined population where each individual is at risk of a particular outcome. For instance, a population of drivers stopped by police can have one of two outcomes: they can be arrested, or they can be sent on their way. Instead of following this standard approach, Fryer constructs a fictitious population of people who are shot by police and people who are arrested. The problem here is that these two groups (those shot and those arrested) are, in all likelihood, systematically different from one another in ways that cannot be controlled for statistically … Properly interpreted, the actual result from Fryer’s analysis is that the racial disparity in arrest rates is larger than the racial disparity in police shootings. This is an unsurprising finding, and proves neither a lack of bias nor a lack of systematic discrimination.

Justin Feldman
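
Feldman’s selection-bias point can be made vivid with a small simulation (all numbers assumed, purely illustrative): if one group is arrested at a much lower threshold of suspicion, then among those arrested that group looks less ‘threatening’ on average, and the shooting rate conditional on arrest can come out equal or even lower despite a biased shooting decision.

```python
import random

random.seed(0)
N = 200_000  # simulated encounters per group (arbitrary)

def shot_given_arrest(arrest_threshold, shoot_bias):
    """Latent 'threat' ~ N(0,1). Arrest if threat > arrest_threshold;
    given arrest, shoot if threat + shoot_bias > 2.0. All numbers are assumptions."""
    arrested = shot = 0
    for _ in range(N):
        threat = random.gauss(0.0, 1.0)
        if threat > arrest_threshold:
            arrested += 1
            if threat + shoot_bias > 2.0:
                shot += 1
    return shot / arrested

# Group A: arrested only at high suspicion, unbiased shooting decision.
# Group B: over-policed (arrested at low suspicion) AND a biased shooting decision.
print(f"P(shot | arrested), group A: {shot_given_arrest(1.0, 0.0):.3f}")  # ~0.14
print(f"P(shot | arrested), group B: {shot_given_arrest(0.0, 0.2):.3f}")  # ~0.07

# Group B's conditional rate comes out *lower*, exactly because conditioning on
# arrest compares two systematically different populations, as Feldman argues.
```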

What makes most econometric models invalid

23 July, 2016 at 10:42 | Posted in Statistics & Econometrics | Leave a comment

The assumption of additivity and linearity means that the outcome variable is, in reality, linearly related to any predictors … and that if you have several predictors then their combined effect is best described by adding their effects together …

This assumption is the most important because if it is not true then even if all other assumptions are met, your model is invalid because you have described it incorrectly. It’s a bit like calling your pet cat a dog: you can try to get it to go in a kennel, or to fetch sticks, or to sit when you tell it to, but don’t be surprised when its behaviour isn’t what you expect because even though you’ve called it a dog, it is in fact a cat. Similarly, if you have described your statistical model inaccurately it won’t behave itself and there’s no point in interpreting its parameter estimates or worrying about significance tests or confidence intervals: the model is wrong.

Andy Field
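
Field’s cat-called-a-dog warning is easy to demonstrate (simulated data, with an assumed quadratic truth): fit the assumed linear model to an outcome that is actually quadratic in the predictor, and the estimated slope, together with any significance test built on it, describes a model that is simply wrong.

```python
import random

random.seed(1)

# Simulated data where the *true* relationship is quadratic: y = x**2 + noise.
xs = [random.uniform(-3, 3) for _ in range(500)]
ys = [x**2 + random.gauss(0, 0.5) for x in xs]

# Ordinary least squares for the (wrongly) assumed linear model y = b0 + b1*x.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b0 = my - b1 * mx

print(f"fitted intercept b0: {b0:.3f}, slope b1: {b1:.3f}")  # slope close to 0
# The linear model reports essentially 'no effect' of x on y, although y is
# almost completely determined by x: the model, not reality, is at fault.
```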

Economics — a kind of brain damage …

23 July, 2016 at 10:12 | Posted in Economics | 1 Comment


(h/t Nanikore)

Is Paul Romer nothing but a neo-colonial Washington Consensus libertarian?

22 July, 2016 at 17:42 | Posted in Economics | 2 Comments

On Monday the World Bank made it official that Paul Romer will be the new chief economist. This nomination can be seen as a big step back toward the infamous Washington Consensus, which the World Bank and IMF seemed to have left behind. This is true even though Paul Romer has learned quite well to hide the market-fundamentalist and anti-democratic nature of his pet idea – charter cities – behind a veil of compassionate wording …

Since about 2009 he has been promoting so-called charter cities as a model for development … His proposal amounts to declaring enlightened colonialism to be the best (or even only) way toward development of poor countries, and a good substitute for development aid …

Romer has in mind a version of the Hong Kong case, without the coercion. His cities are supposed to be extreme forms of the free-enterprise zones which some developing countries, including China, have been experimenting with for quite a while. The idea of the latter is to attract foreign investors by exempting them from certain regulations, duties etc. His charter cities go further. They build on the wholesale abrogation of all laws of the respective country. For countries with dysfunctional public institutions he suggested that they lease out the regions where these charter cities are to be built, long-term, to a consortium of enlightened industrial countries, which would do the management. What the British extracted at gunpoint from China, developing countries are expected to give voluntarily today. A World Bank manager commented on the idea in 2010 on the blog of the World Bank by quoting a magazine article which called it “not only neo-medieval, but also neo-colonial”.

The libertarian spirit of the idea of the man who will be the World Bank’s chief economist from September is reminiscent of the Washington Consensus that ruled into the 1990s. This is the name for the ideological position, enforced by the World Bank and IMF, that the best and only way to development is the scrapping of government regulation and giving companies a maximum of freedom to go about their business.

Norbert Häring

Economics laws — the ultimate reduction to triviality

22 July, 2016 at 16:27 | Posted in Economics | Leave a comment

What we discover is that the cash value of these laws lies beneath the surface — in the extent to which they approximate the behaviour of real gases or substances, since such substances do not exist in the world …

Notice that we are here regarding it as grounds for complaint that such claims are ‘reduced to the status of definitions’ … Their truth is obtained at a price, namely that they cease to tell us about this particular world and start telling us about the meaning of words instead …

The ultimate reduction to triviality makes the claim definitionally true, and obviously so, in which case it’s worth nothing to those who already know the language …

Michael Scriven

One of the main cruxes of economics laws — and regularities — is that they only hold ceteris paribus. That fundamentally means that these laws/regularities only hold when the right conditions are at hand for giving rise to them. Unfortunately, from an empirical point of view, those conditions are only at hand in artificially closed nomological models purposely designed to give rise to the kind of regular associations that economists want to explain. But, really, since these laws/regularities do not exist outside these ‘socio-economic machines,’ what’s the point in constructing these non-existent laws/regularities? When the almost endless list of narrow and specific assumptions necessary to allow the ‘rigorous’ deductions are known to be at odds with reality, what good do these models do?

Take ‘The Law of Demand.’

Although it may (perhaps) be said that neoclassical economics had succeeded in establishing The Law – when the price of a commodity falls, the demand for it will increase — for single individuals, it soon turned out, with the Sonnenschein-Mantel-Debreu theorem, that it wasn’t possible to extend The Law to the market level, unless one made ridiculously unrealistic assumptions such as all individuals having identical homothetic preferences.

This could only be conceivable if there was in essence only one actor – the (in)famous representative actor. So, yes, it was possible to generalize The Law of Demand – as long as we assumed that on the aggregate level there was only one commodity and one actor. What generalization! Does this sound reasonable? Of course not. This is pure nonsense!
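
A small numerical sketch of the aggregation problem (assumed utility functions and illustrative numbers): with two consumers whose preferences differ, aggregate demand at unchanged prices and unchanged total income still depends on how that income is distributed, so no single representative consumer can stand in for the market.

```python
# Two consumers buying good x at price p (good y is the numeraire).
# Assumed, purely illustrative preferences:
#   A: quasilinear  u = 2*sqrt(x) + y  ->  x_A = 1/p**2 (independent of income)
#   B: Cobb-Douglas u = sqrt(x*y)      ->  x_B = 0.5*m_B/p

p = 1.0
total_income = 100.0

def aggregate_demand(income_A):
    income_B = total_income - income_A
    x_A = min(1 / p**2, income_A / p)  # corner solution if A is too poor
    x_B = 0.5 * income_B / p
    return x_A + x_B

# Same price, same total income, different distributions:
print(aggregate_demand(income_A=10.0))  # 1 + 45 = 46.0
print(aggregate_demand(income_A=90.0))  # 1 + 5  =  6.0

# Aggregate demand moves with the income distribution alone. Only identical
# homothetic preferences would make it a stable function of total income.
```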

How has neoclassical economics reacted to this devastating finding? Basically by looking the other way, ignoring it and hoping that no one sees that the emperor is naked.

Modern mainstream neoclassical textbooks try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality. And – worse still – something that is not even amenable to the kind of general equilibrium analysis that it is thought to give a foundation for, since Hugo Sonnenschein (1972), Rolf Mantel (1976) and Gerard Debreu (1974) unequivocally showed that there exist no conditions by which assumptions on individuals would guarantee either stability or uniqueness of the equilibrium solution.

Of course one could say that it is too difficult at the undergraduate level to show why the procedure is right, and defer it to master’s and doctoral courses. It could justifiably be reasoned that way – if what you teach your students is true, if The Law of Demand is generalizable to the market level and the representative actor is a valid modeling abstraction! But in this case it’s demonstrably known to be false, and therefore this is nothing but a case of scandalous intellectual dishonesty. It’s like telling your students that 2 + 2 = 5 and hoping that they will never run into Peano’s axioms of arithmetic.

As Hans Albert has it:

The neoclassical style of thought – with its emphasis on thought experiments, reflection on the basis of illustrative examples and logically possible extreme cases, its use of model construction as the basis of plausible assumptions, as well as its tendency to decrease the level of abstraction, and similar procedures – appears to have had such a strong influence on economic methodology that even theoreticians who strongly value experience can only free themselves from this methodology with difficulty …

Clearly, it is possible to interpret the ‘presuppositions’ of a theoretical system … not as hypotheses, but simply as limitations to the area of application of the system in question. Since a relationship to reality is usually ensured by the language used in economic statements, in this case the impression is generated that a content-laden statement about reality is being made, although the system is fully immunized and thus without content. In my view that is often a source of self-deception in pure economic thought …

