Stephen Williamson on involuntary unemployment and search models

16 March 2015 at 20:21 | Published in Economics | 4 comments

An employment contract, like marriage, requires that two people agree. If one would-be party to the contract does not agree, then it doesn’t happen. We could talk about people being involuntarily single, I suppose, but the relevant behavior we are interested in, I think, is the search behavior. An unemployed person is engaged in active search for employment. Old-fashioned ways of thinking about that – in competitive equilibrium environments – didn’t get very far, as those models are not equipped to think about search. Search models allow one to think about unemployment in a useful way. Those models are more enlightening about the determinants of unemployment, and how governments might help labor markets work more efficiently. But people in those models are always making choices. The unemployed are people who choose to search for work because they think there is light at the end of the tunnel – potentially they will find a job that will make them better off. So voluntary or involuntary doesn’t enter into the discussion.

Stephen Williamson

Now this seems to me as bad and wrong-headed a defense of Lucas’ denial of involuntary unemployment as the one given a couple of years ago by Michel De Vroey:

What explains the difficulty of constructing a theory of involuntary unemployment? Is it, as argued by Lucas, that the “thing” to be explained doesn’t exist, or is it due to some deeply embedded premise of economic theory? My own view tilts towards the latter. Economic theory is concerned with fictitious parables. The premises upon which it is based have the advantage of allowing tractable, rigorous theorising, but the price of this is that important facts of life are excluded from the theoretical universe. Non-chosen outcomes is one of them. The underlying reason lies in the trade technology and information assumptions upon which both the Walrasian and the Marshallian (and the neo-Walrasian and neo-Marshallian) approaches are based. This is a central conclusion of my inquiry: the stumbling block to the introduction of involuntary unemployment lies in the assumptions about trade technology that are usually adopted in economic theory.

Foregoing the involuntary unemployment claim may look like a high price to pay, particularly if it is admitted that good reasons exist for believing in its real world relevance. But would its abandonment really be so dramatic? …

First of all, the elimination of this concept would only affect the theoretical sphere. Drawing conclusions from this sphere about the real world would be a mistake. No jumps should be made from the world of theory to the real world, or vice-versa … The fact that solid arguments can be put forward as to its real world existence is not a sufficient condition to give involuntary unemployment theoretical legitimacy.

Michel De Vroey


I have to admit to being totally unimpressed by this rather defeatist methodological stance. Is it really a feasible methodology for economists to make a sharp divide between theory and reality, and then treat the divide as something recommendable and good? I think not.

Models and theories — if they are to be of any real interest — have to look to the world. Being able to construct ”fictitious parables” or build models of a ”credible world” is not enough. No matter how many convoluted refinements of concepts are made in the theory or model, if they do not result in ”things” similar to reality in the appropriate respects, such as structure, isomorphism, etc., the surrogate system becomes a substitute system — and why should we care about that? Science has to have higher aspirations.

Mainstream economic theory today is in the story-telling business whereby economic theorists create mathematical make-believe analogue models of the target system – usually conceived as the real economic system. This modeling activity is considered useful and essential. Formalistic deductive “Glasperlenspiel” can be very impressive and seductive. But in the realm of science it ought to be considered of little or no value to simply make claims about the theory or model and lose sight of reality. Insisting — like De Vroey — that ”no jumps should be made from the world of theory to the real world, or vice-versa” is an untenable methodological position.

Now, Williamson refers explicitly to search models as allowing one ”to think about unemployment in a useful way.” But what kind of theory are we talking about here? Taking a methodological look at search theory, I think its shortcomings become apparent to everyone.

When criticizing the basic (DSGE) workhorse model for its inability to explain involuntary unemployment, its defenders maintain that later elaborations — especially newer search models — manage to do just that. However, one of the more conspicuous problems with those ”solutions” is that they — as e.g. Pissarides’ ”Loss of Skill during Unemployment and the Persistence of Unemployment Shocks” QJE (1992) — are as a rule constructed without seriously trying to warrant that the model-immanent assumptions and results are applicable in the real world. External validity is more or less a non-existent problematique, sacrificed on the altar of model derivations. This is not by chance. For how could one even imagine empirically testing assumptions such as Pissarides’ ”model 1″ assumptions of reality being adequately represented by ”two overlapping generations of fixed size”, ”wages determined by Nash bargaining”, ”actors maximizing expected utility”, ”endogenous job openings” and ”job matching describable by a probability distribution” without coming to the conclusion that this is — in terms of realism and relevance — nothing but nonsense on stilts?
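To get a feel for what is actually inside such models, here is a deliberately minimal toy sketch (not Pissarides’ model, and with every parameter a pure assumption of mine) of the matching-function mechanics, the ”job matching describable by a probability distribution”, on which the whole search-theoretic apparatus rests:

```python
# A toy search/matching simulation -- NOT Pissarides' model. Every number
# here is an assumption, chosen only to illustrate the mechanics that
# search models take for granted: a matching function and a job-finding
# probability derived from it.

import numpy as np

A, alpha = 0.6, 0.5        # matching efficiency and elasticity (assumed)
separation_rate = 0.03     # exogenous job destruction per period (assumed)
vacancy_rate = 0.05        # vacancies as a share of the labour force (assumed)
periods = 200

rng = np.random.default_rng(42)
labour_force = 10_000
unemployed = 1_000

path = []
for _ in range(periods):
    u = unemployed / labour_force
    # Cobb-Douglas matching function m(u, v) = A * u**alpha * v**(1 - alpha);
    # the probability that an individual unemployed worker finds a job
    # this period is m(u, v) / u.
    job_finding_prob = min(1.0, A * u**alpha * vacancy_rate**(1 - alpha) / u)
    hires = rng.binomial(unemployed, job_finding_prob)
    separations = rng.binomial(labour_force - unemployed, separation_rate)
    unemployed = unemployed - hires + separations
    path.append(unemployed / labour_force)

print(f"simulated long-run unemployment rate ≈ {np.mean(path[-50:]):.3f}")
```

Every moving part here (the Cobb-Douglas matching function, the constant separation and vacancy rates, the very idea that finding a job is a draw from a known distribution) is a modelling convention, and it is precisely the empirical warrant for conventions of this kind that is never seriously established.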

In that perspective I do think it’s pretty obvious that the kind of search models Williamson has in mind don’t take us very far in thinking about unemployment ”in a useful way.”

Lecturing the Senate Budget Committee

15 March 2015 at 15:56 | Published in Economics, Politics & Society | 8 comments

Watch Mark Blyth give the Senate Budget Committee a well-earned lecture on public debt here. Absolutely fabulous!

Varoufakis’ unbelievable blunder

15 March 2015 at 15:28 | Published in Varia | 4 comments

Perhaps at another time and place, appearing in Paris Match magazine with his wife in a lifestyle photo spread wouldn’t be a big deal for Greek Finance Minister Yanis Varoufakis. But these days? Hmm …
 
 

And thus the native hue of resolution
Is sicklied o’er with the pale cast of thought

 
Added March 16: According to Times Of Change, Varoufakis has now stated that he regrets the photo-shoot and ”would have liked for that shoot to not have happened.”
Regret noted.
End of story.

David Andolfatto and the Chicago dismissal of ‘involuntary unemployment’

15 March 2015 at 13:42 | Published in Economics | 3 comments

David Andolfatto doesn’t like it when I say that some unemployment is involuntary. Here is my response:

David

I am happy with the way you characterize my beliefs in the first paragraph of your blog. Unemployment is clearly not Pareto optimal.  Everything you say after that is at best misleading and at worst dismissive of everything we (at least some of us) learned from Keynes.

The idea of involuntary unemployment was introduced by Keynes in the General Theory. But you already knew that. It is defined as a situation where (in modern language) the ratio of the marginal disutility of work to the marginal utility of consumption is not equal to the real wage. That seems a pretty accurate description of the equilibrium outcome of labor search models.

Bob Lucas cast a spell over the profession in a series of papers in the 1970s. You are accurately summarizing Bob’s view. That view was tied to a three decade long campaign by economists predominately located in Chicago, Minnesota and Rochester (at the time) to discredit Keynesian economics. Tom Sargent reputedly advised his students not to read the General Theory. That was a tragic mistake and we are still suffering from the consequences.

You are right to assert that the important distinction is between equilibria that are Pareto optimal and those that are not. You are wrong to assert that the term ‘involuntary unemployment’ has no useful meaning.

I accept your categorization of the allocation of time between three competing ends. Every family, and every member of that family, chooses every day whether they will choose to participate in the labor force. As long as they are in the labor force, they may be employed or unemployed. Those who are unemployed do not choose that state. They must wait for a job offer to appear. In some states, that job offer may take a couple of days to arrive. In others, it may take a couple of years. The activity of waiting for a job, even when it involves active search, can meaningfully be called involuntary unemployment.

The dismissal of ‘involuntary unemployment’ from the lexicon of the modern economist was introduced as part of a deliberate attack on Keynesian economics. It is time to roll back that attack. As I have shown here, ‘involuntary’ unemployment is a useful way of distinguishing unemployment that is part of a social optimum, from unemployment that is not.

Roger Farmer
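In symbols (the notation is mine, a hedged reading of Farmer’s wording rather than his own formalism), with \(U(c,\ell)\) denoting utility over consumption \(c\) and hours worked \(\ell\), \(W\) the money wage and \(P\) the price level, a voluntary, Pareto-optimal allocation satisfies

\[
\frac{-\,U_{\ell}(c,\ell)}{U_{c}(c,\ell)} \;=\; \frac{W}{P},
\]

whereas involuntary unemployment in the sense Farmer (following Keynes) gives the term is a state where

\[
\frac{-\,U_{\ell}(c,\ell)}{U_{c}(c,\ell)} \;<\; \frac{W}{P},
\]

that is, workers would willingly supply more hours at the going real wage but cannot find the jobs, which, as Farmer notes, is a fair description of the equilibrium outcome of labor search models.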

 

There are unfortunately a lot of mainstream economists out there who still think that price and wage rigidities are the prime movers behind unemployment. What is even worse — I’m totally gobsmacked every time I come across this utterly ridiculous misapprehension — is that some of them even think that these rigidities are the reason John Maynard Keynes gave for the high unemployment of the Great Depression. This is of course pure nonsense. For although Keynes in General Theory devoted substantial attention to the subject of wage and price rigidities, he certainly did not hold this view.

Since unions/workers, contrary to classical assumptions, make wage-bargains in nominal terms, they will – according to Keynes – accept lower real wages caused by higher prices, but resist lower real wages caused by lower nominal wages. However, Keynes held it incorrect to attribute “cyclical” unemployment to this diversified agent behaviour. During the depression money wages fell significantly and – as Keynes noted – unemployment still grew. Thus, even when nominal wages are lowered, they do not generally lower unemployment.

In any specific labour market, lower wages could, of course, raise the demand for labour. But a general reduction in money wages would leave real wages more or less unchanged. The reasoning of the ”classical” economists was, according to Keynes, a flagrant example of the “fallacy of composition.” Assuming that since unions/workers in a specific labour market could negotiate real wage reductions via lowering nominal wages, unions/workers in general could do the same, the classics confused micro with macro.
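A stylised way of seeing the composition problem (my illustration, assuming simple markup pricing, not Keynes’ own algebra) is to note that if firms set prices as a markup \(\mu\) on unit labour costs, with money wage \(W\) and labour productivity \(a\), then

\[
P = (1+\mu)\,\frac{W}{a}
\quad\Longrightarrow\quad
\frac{W}{P} = \frac{a}{1+\mu},
\]

so a single firm or sector that cuts its money wage, taking \(P\) as given, does lower its real wage cost, but an economy-wide cut in \(W\) drags \(P\) down with it and leaves the real wage \(W/P\) essentially where it was.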

Lowering nominal wages could not – according to Keynes – clear the labour market. Lowering wages – and possibly prices – could, perhaps, lower interest rates and increase investment. But to Keynes it would be much easier to achieve that effect by increasing the money supply. In any case, wage reductions were not seen by Keynes as a general substitute for an expansionary monetary or fiscal policy.

Even if potentially positive impacts of lowering wages exist, there are also negative impacts that weigh more heavily – deteriorating management-union relations, expectations of ongoing wage cuts causing investment to be delayed, debt deflation, et cetera.

So what Keynes actually argued in the General Theory was that the ”classical” proposition – that lowering wages would lower unemployment and ultimately take economies out of depressions – was ill-founded and basically wrong.

To Keynes, flexible wages would only make things worse by leading to erratic price-fluctuations. The basic explanation for unemployment is insufficient aggregate demand, and that is mostly determined outside the labor market.

The classical school [maintains that] while the demand for labour at the existing money-wage may be satisfied before everyone willing to work at this wage is employed, this situation is due to an open or tacit agreement amongst workers not to work for less, and that if labour as a whole would agree to a reduction of money-wages more employment would be forthcoming. If this is the case, such unemployment, though apparently involuntary, is not strictly so, and ought to be included under the above category of ‘voluntary’ unemployment due to the effects of collective bargaining, etc …
The classical theory … is best regarded as a theory of distribution in conditions of full employment. So long as the classical postulates hold good, unemployment, which is in the above sense involuntary, cannot occur. Apparent unemployment must, therefore, be the result either of temporary loss of work of the ‘between jobs’ type or of intermittent demand for highly specialised resources or of the effect of a trade union ‘closed shop’ on the employment of free labour. Thus writers in the classical tradition, overlooking the special assumption underlying their theory, have been driven inevitably to the conclusion, perfectly logical on their assumption, that apparent unemployment (apart from the admitted exceptions) must be due at bottom to a refusal by the unemployed factors to accept a reward which corresponds to their marginal productivity …

Obviously, however, if the classical theory is only applicable to the case of full employment, it is fallacious to apply it to the problems of involuntary unemployment – if there be such a thing (and who will deny it?). The classical theorists resemble Euclidean geometers in a non-Euclidean world who, discovering that in experience straight lines apparently parallel often meet, rebuke the lines for not keeping straight – as the only remedy for the unfortunate collisions which are occurring. Yet, in truth, there is no remedy except to throw over the axiom of parallels and to work out a non-Euclidean geometry. Something similar is required to-day in economics. We need to throw over the second postulate of the classical doctrine and to work out the behaviour of a system in which involuntary unemployment in the strict sense is possible.

J M Keynes General Theory

 

On the optimum level of public debt

14 March 2015 at 09:03 | Published in Economics | 4 comments

The failure of successive administrations in most developed countries to embark on any vigorous policy aimed at bringing down unconscionably high levels of unemployment has been due in no small measure to a ‘viewing with alarm’ of the size of the national debts, often alleged to be already excessive, or at least threatening to become so, and by ideologically urged striving toward ‘balanced’ government budgets without any consideration of whether such debts and deficits are or threaten to become excessive in terms of some determinable impact on the real general welfare. If they are examined in the light of their impact on welfare, however, they can usually be shown to be well below their optimum levels, let alone at levels that could have dire consequences.

To view government debts in terms of the ‘functional finance’ concept introduced by Abba Lerner, is to consider their role in the macroeconomic balance of the economy. In simple, bare bones terms, the function of government debts that is significant for the macroeconomic health of an economy is that they provide the assets into which individuals can put whatever accumulated savings they attempt to set aside in excess of what can be wisely invested in privately owned real assets. A debt that is smaller than this will cause the attempted excess savings, by being reflected in a reduced level of consumption outlays, to be lost in reduced real income and increased unemployment.

William Vickrey
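One way of making Vickrey’s point concrete (a textbook accounting identity, not Vickrey’s own formulation) is the sectoral-balances identity. With private saving \(S\), private investment \(I\), government spending \(G\), taxes \(T\), exports \(X\) and imports \(M\),

\[
(S - I) \;=\; (G - T) + (X - M),
\]

so in a closed economy (or for the world as a whole) the private sector can only accumulate net financial assets, \(S - I > 0\), to the extent that the government runs a deficit. A public debt that is ‘too small’ relative to desired private net saving forces the adjustment to come through lower income and higher unemployment, which is exactly the mechanism Vickrey describes.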

The committee to save the world

13 March 2015 at 16:54 | Published in Politics & Society | 2 comments

Saving the world? The moral-ethical calibre of at least two of these guys makes me wonder how one could even entertain such a bizarre idea.

Re Greenspan, yours truly can’t but agree with Paul Krugman — he isn’t just a bad economist, he’s a bad person. What else can one think of a person who considers Ayn Rand — with the ugliest psychopathic philosophy the postwar world has produced — one of the great thinkers of the 20th century? A person who even co-edited a book with her — maintaining that unregulated capitalism is a “superlatively moral system”. A person who in his memoirs tries to reduce his admiration for Rand to a youthful indiscretion — but who actually still today can’t be described as anything other than a loyal Randian disciple.

And Summers — well, an economist who writes that he has

always thought that under-populated countries in Africa are vastly UNDER-polluted, their air quality is probably vastly inefficiently low compared to Los Angeles or Mexico City

and that

only the lamentable facts that so much pollution is generated by non-tradable industries (transport, electrical generation) and that the unit transport costs of solid waste are so high prevent world welfare enhancing trade in air pollution and waste.

certainly isn’t on my top list of would-be world saviours …

Meeting Kieslowski

13 March 2015 at 15:39 | Published in Varia | Comments off on Meeting Kieslowski

 

Ricardian equivalence — a hopelessly unrealistic curiosum

13 March 2015 at 09:28 | Published in Economics | Comments off on Ricardian equivalence — a hopelessly unrealistic curiosum

Barro (1974) has shown that, given perfect foresight, debt neutrality will obtain when three conditions are met: (a) private agents can lend and borrow on the same terms as the government, (b) private agents are able and willing to undo any government scheme to redistribute spending power between generations, and (c) all taxes and transfer payments are lump sum, by which we mean that their basis of assessment is independent of private agents’ decisions about production, labour supply, consumption, or asset accumulation. Under these extreme assumptions, any change in government financing (government saving or dissaving) is offset one-for-one by a corresponding change in private saving itself financed by the accompanying tax changes.

All three assumptions are of course hopelessly unrealistic. Condition (a) fails because credit rationing, liquidity constraints, large spreads between lending and borrowing rates of interest, and private borrowing rates well in excess of those enjoyed by the government are an established fact in most industrial countries. These empirical findings are underpinned by the new and burgeoning theoretical literature on asymmetric information and the implications of moral hazard and adverse selection for private financial markets; and by game-theoretic insights of how active competition in financial markets can yield credit rationing as the equilibrium outcome.

Condition (b) fails because it requires either that agents must live for ever or else effectively do so through the account they take of their children and parents in making gifts and bequests. In reality, private decision horizons are finite and frequently quite short …

Condition (c) fails because in practice taxes and subsidies are rarely lump sum …

I conclude that the possible neutrality of public debt and deficits is little more than a theoretical curiosum.

Willem Buiter
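A two-period textbook sketch of the mechanism Buiter has in mind (standard notation, my illustration, not his) shows why the three conditions carry all the weight. With a common interest rate \(r\) (condition a), lump-sum taxes \(t_1, t_2\) (condition c) and a household that lives through both periods (condition b), the household’s and the government’s budget constraints are

\[
c_1 + \frac{c_2}{1+r} = (y_1 - t_1) + \frac{y_2 - t_2}{1+r},
\qquad
g_1 + \frac{g_2}{1+r} = t_1 + \frac{t_2}{1+r}.
\]

A debt-financed tax cut today, \(\mathrm{d}t_1 < 0\) with spending unchanged, must then be matched by \(\mathrm{d}t_2 = -(1+r)\,\mathrm{d}t_1\), which leaves the household’s lifetime resources, and hence its consumption plan, exactly as before: the whole tax cut is saved. Knock out any one of the three assumptions (different borrowing rates, finite horizons without operative bequests, distortionary taxes) and the offset is no longer one-for-one, which is Buiter’s point.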

The Larry Summers Memo

12 March 2015 at 12:51 | Published in Economics, Politics & Society | 3 comments

The Memo

DATE: December 12, 1991
TO: Distribution
FR: Lawrence H. Summers
Subject: GEP

‘Dirty’ Industries: Just between you and me, shouldn’t the World Bank be encouraging MORE migration of the dirty industries to the LDCs [Less Developed Countries]? I can think of three reasons:

1) The measurements of the costs of health impairing pollution depends on the foregone earnings from increased morbidity and mortality. From this point of view a given amount of health impairing pollution should be done in the country with the lowest cost, which will be the country with the lowest wages. I think the economic logic behind dumping a load of toxic waste in the lowest wage country is impeccable and we should face up to that.

2) The costs of pollution are likely to be non-linear as the initial increments of pollution probably have very low cost. I’ve always thought that under-populated countries in Africa are vastly UNDER-polluted, their air quality is probably vastly inefficiently low compared to Los Angeles or Mexico City. Only the lamentable facts that so much pollution is generated by non-tradable industries (transport, electrical generation) and that the unit transport costs of solid waste are so high prevent world welfare enhancing trade in air pollution and waste.

3) The demand for a clean environment for aesthetic and health reasons is likely to have very high income elasticity. The concern over an agent that causes a one in a million change in the odds of prostrate cancer is obviously going to be much higher in a country where people survive to get prostrate cancer than in a country where under 5 mortality is 200 per thousand. Also, much of the concern over industrial atmosphere discharge is about visibility impairing particulates. These discharges may have very little direct health impact. Clearly trade in goods that embody aesthetic pollution concerns could be welfare enhancing. While production is mobile the consumption of pretty air is a non-tradable.

The problem with the arguments against all of these proposals for more pollution in LDCs (intrinsic rights to certain goods, moral reasons, social concerns, lack of adequate markets, etc.) could be turned around and used more or less effectively against every Bank proposal for liberalization.

Postscript

After the memo became public in February 1992, Brazil’s then-Secretary of the Environment Jose Lutzenburger wrote back to Summers: ”Your reasoning is perfectly logical but totally insane… Your thoughts [provide] a concrete example of the unbelievable alienation, reductionist thinking, social ruthlessness and the arrogant ignorance of many conventional ‘economists’ concerning the nature of the world we live in… If the World Bank keeps you as vice president it will lose all credibility. To me it would confirm what I often said… the best thing that could happen would be for the Bank to disappear.” Sadly, Mr. Lutzenburger was fired shortly after writing this letter.

Mr. Summers, on the other hand, was appointed the U.S. Treasury Secretary on July 2nd, 1999, and served through the remainder of the Clinton Administration. Afterwards, he was named president of Harvard University.

The Whirled Bank Group

Understanding public debts and budget deficits

12 March 2015 at 12:30 | Published in Economics, Politics & Society | 1 comment

Does public debt restrict growth?

12 March 2015 at 12:10 | Published in Economics | 1 comment

 

Bob Pollin was — together with people like Fred Lee and Axel Leijonhufvud — one of those who made a visit to the University of California such a great experience back in the beginning of the 1980s for a young Swedish research student in economics. Yours truly had the great pleasure and privilege of having Bob as a teacher. He was a great inspiration at the time — and he still is.

Brad DeLong on ‘silly’ Robert Lucas

11 March 2015 at 10:14 | Published in Economics | 2 comments

Robert Lucas: Economics tries to… make predictions about the way… say, 280 million people are going to respond if you change something in the tax structure, something in the inflation rate, or whatever…. Kahnemann and Tversky haven’t even gotten to two people; they can’t even tell us anything interesting about how a couple that’s been married for ten years splits or makes decisions about what city to live in–let alone 250 million. This is like saying that we ought to build it up from knowledge of molecules or–no, that won’t do either, because there are a lot of subatomic particles…. We’re not going to build up useful economics in the sense of things that help us think about the policy issues that we should be thinking about starting from individuals and, somehow, building it up from there. Behavioral economics should be on the reading list…. But to think of it as an alternative to what macroeconomics or public finance people are doing or trying to do… not in my lifetime…

I do not think Lucas understands how silly he sounds.

I do not think Lucas understands that when he builds his models he is aggregating up the behavior of 310 million American individuals, having made certain assumptions about what things cancel out when that aggregation is made.

Lucas does say that economists definitely should not:
• Model humans as they actually behave.
• Model their economic interactions as they actually happen.
• Aggregate up.
• And look for emergent patterns to fit to the data.

Instead Lucas says economists should:
• Do what macroeconomists currently do.
• This is, assume one infinitely-lived hyper-rational representative price-taking agent.
• Assume that all equilibrium-selection and coordination problems are automagically solved.

The second, Lucas says, is ”science”! The first, Lucas says, is not–even though the first is a strict superset of the second.

And if econometric tools reject Lucas’s approach? Then, Lucas says, so much the worse for econometric tools. And if the data reject that approach? Then, Prescott says, so much the worse for the data: ”economic theory ahead of economic measurement”–our theories would not be falsified if we had the real data to work on …

Brad DeLong

Yours truly totally agrees — there exist overwhelmingly strong reasons for being critical and doubtful regarding the microfoundations of macroeconomics.

Microfoundations today means more than anything else that you try to build macroeconomic models assuming “rational expectations” and hyperrational “representative actors” optimizing over time. Both are highly questionable assumptions.

Macroeconomic models building on rational expectations microfoundations impute beliefs to the agents that are not based on any real informational considerations, but simply stipulated to make the models mathematically-statistically tractable. Of course you can make assumptions based on tractability, but then you also have to take into account the necessary trade-off in terms of the ability to make relevant and valid statements about the intended target system. Mathematical tractability cannot be the ultimate arbiter in science when it comes to modeling real-world target systems. One could perhaps accept macroeconomic models building on rational expectations microfoundations if they had produced lots of verified predictions and good explanations. But they have done nothing of the kind. Therefore the burden of proof is on those macroeconomists who still want to use models built on these particular unreal assumptions.

Macroeconomic models building on rational expectations microfoundations emanate from the belief that, to be scientific, economics has to be able to model individuals and markets in a stochastic-deterministic way. It’s like treating individuals and markets as the celestial bodies studied by astronomers with the help of gravitational laws. Unfortunately, individuals, markets and entire economies are not planets moving in predetermined orbits in the sky.

Microfoundations – and a fortiori rational expectations and representative agents – serve a particular theoretical purpose. And as the history of macroeconomics during the last thirty years has shown, this Lakatosian microfoundation programme for macroeconomics is only methodologically consistent within the framework of a (deterministic or stochastic) general equilibrium analysis. In no other context has it been possible to incorporate this kind of microfoundations, with its “forward-looking optimizing individuals,” into macroeconomic models.

This is of course not by accident. General equilibrium theory is basically nothing other than an endeavour to consistently generalize the microeconomics of individuals and firms onto the macroeconomic level of aggregates.

But it obviously doesn’t work. The analogy between microeconomic behaviour and macroeconomic behaviour is misplaced. Empirically, science-theoretically and methodologically, neoclassical microfoundations for macroeconomics are defective. Tenable foundations for macroeconomics really have to be sought for elsewhere.

 

Reinhart & Rogoff finally get it right!

11 March 2015 at 08:39 | Published in Economics | 5 comments

Even after one of the most severe multi-year crises on record in the advanced economies, the received wisdom in policy circles clings to the notion that high-income countries are completely different from their emerging-market counterparts. The current phase of the official policy approach is predicated on the assumption that growth, financial stability, and debt sustainability can be achieved through a mix of austerity and forbearance (and some reform). The claim is that advanced countries do not need to resort to the more eclectic policies of emerging markets, including debt restructurings and conversions, higher inflation, capital controls, and other forms of financial repression. Now entering the sixth or seventh year (depending on the country) of crisis, output remains well below its pre-crisis peak in ten of the twelve crisis countries. The gap with potential output is even greater. Delays in accepting that desperate times call for desperate measures keeps raising the odds that, as documented here, this crisis may in the end surpass in severity the depression of the 1930s in a large number of countries.

Carmen Reinhart & Kenneth Rogoff

This time it seems as though it is — really — different. At last the light at the end of the austerity tunnel seeps through.

What is science?

10 March 2015 at 21:07 | Published in Theory of Science & Methodology | 4 comments

The primary aim of this study is the development of a systematic realist account of science. In this way I hope to provide a comprehensive alternative to the positivism that has usurped the title of science. I think that only the position developed here can do full justice to the rationality of scientific practice or sustain the intelligibility of such scientific activities as theory construction and experimentation. And that while recent developments in the philosophy of science mark a great advance on positivism they must eventually prove vulnerable to positivist counter-attack, unless carried to the limit worked out here.

My subsidiary aim is thus to show once-and-for-all why no return to positivism is possible. This of course depends upon my primary aim. For any adequate answer to the critical metaquestion ‘what are the conditions of the plausibility of an account of science?’ presupposes an account which is capable of thinking of those conditions as special cases. That is to say, to adapt an image of Wittgenstein’s, one can only see the fly in the fly-bottle if one’s perspective is different from that of the fly. And the sting is only removed from a system of thought when the particular conditions under which it makes sense are described. In practice this task is simplified for us by the fact that the conditions under which positivism is plausible as an account of science are largely co-extensive with the conditions under which experience is significant in science. This is of course an important and substantive question which we could say, echoing Kant, no account of science can decline, but positivism cannot ask, because (it will be seen) the idea of insignificant experiences transcends the very bounds of its thought.

This book is written in the context of vigorous critical activity in the philosophy of science. In the course of this the twin templates of the positivist view of science, viz. the ideas that science has a certain base and a deductive structure, have been subjected to damaging attack. With a degree of arbitrariness one can separate this critical activity into two strands. The first, represented by writers such as Kuhn, Popper, Lakatos, Feyerabend, Toulmin, Polanyi and Ravetz, emphasises the social character of science and focusses particularly on the phenomena of scientific change and development. It is generally critical of any monistic interpretation of scientific development, of the kind characteristic of empiricist historiography and implicit in any doctrine of the foundations of knowledge. The second strand, represented by the work of Scriven, Hanson, Hesse and Harré among others, calls attention to the stratification of science. It stresses the difference between explanation and prediction and emphasises the role played by models in scientific thought. It is highly critical of the deductivist view of the structure of scientific theories, and more generally of any exclusively formal account of science. This study attempts to synthesise these two critical strands; and to show in particular why and how the realism presupposed by the first strand must be extended to cover the objects of scientific thought postulated by the second strand. In this way I will be describing the nature and the development of what has been hailed as the ‘Copernican Revolution’ in the philosophy of science.

To see science as a social activity, and as structured and discriminating in its thought, constitutes a significant step in our understanding of science. But, I shall argue, without the support of a revised ontology, and in particular a conception of the world as stratified and differentiated too, it is impossible to steer clear of the Scylla of holding the structure dispensable in the long run (back to empiricism) without being pulled into the Charybdis of justifying it exclusively in terms of the fixed or changing needs of the scientific community (a form of neo-Kantian pragmatism exemplified by e.g. Toulmin and Kuhn). In this study I attempt to show how such a revised ontology is in fact presupposed by the social activity of science. The basic principle of realist philosophy of science, viz. that perception gives us access to things and experimental activity access to structures that exist independently of us, is very simple. Yet the full working out of this principle implies a radical account of the nature of causal laws, viz. as expressing tendencies of things, not conjunctions of events. And it implies that a constant conjunction of events is no more a necessary than a sufficient condition for a causal law.

Male circumcision — a case of selection bias

10 March 2015 at 17:01 | Published in Statistics & Econometrics | 1 comment

Take a look at a map of Africa showing male circumcision rates, and impose on that data on HIV/AIDS prevalence. There is a very close correspondence between the two, with the exceptions being cities with large numbers of recent uncircumcised male migrants. One might therefore conclude that male circumcision reduces the chances of contracting HIV/AIDS, and indeed there are medical reasons to believe this may be so. But maybe some third, underlying variable, explains both circumcision and HIV/AIDS prevalence. That is, those who select to get circumcised have special characteristics which make them less likely to contract HIV/AIDS, so a comparison of HIV/AIDS rates between circumcised and uncircumcised men will give a biased estimate of the impact of circumcision on HIV/AIDS prevalence. There is such a factor, it is being Muslim. Muslim men are circumcised and less likely to engage in risky sexual behaviour exposing themselves to HIV/AIDS, partly as they do not drink alcohol. Again we are not comparing like with like: circumcised men have different characteristics compared to uncircumcised men, and these characteristics affect the outcome of interest.

Howard White
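The mechanism White describes is easy to reproduce in a toy simulation. All numbers below are invented, and the simulated world is deliberately rigged so that circumcision has no true effect at all (contrary to what the medical evidence White mentions suggests), just to isolate the selection problem:

```python
# A toy illustration of the selection bias described above. All numbers
# are invented, and the simulated world is rigged so that circumcision has
# NO true effect on HIV: religion drives both circumcision and risky
# behaviour, and HIV risk depends only on behaviour.

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

muslim = rng.random(n) < 0.4                          # assumed population share
circumcised = rng.random(n) < np.where(muslim, 0.95, 0.20)
risky = rng.random(n) < np.where(muslim, 0.05, 0.30)  # risky sexual behaviour
hiv = rng.random(n) < np.where(risky, 0.20, 0.02)     # depends only on behaviour


def rate(mask):
    return hiv[mask].mean()


print(f"HIV rate, circumcised:   {rate(circumcised):.3f}")   # ≈ 0.04
print(f"HIV rate, uncircumcised: {rate(~circumcised):.3f}")  # ≈ 0.07

# Conditioning on the confounder makes the spurious 'protective effect' vanish:
for m in (True, False):
    grp = muslim == m
    print(f"Muslim={m}: circumcised {rate(grp & circumcised):.3f} "
          f"vs uncircumcised {rate(grp & ~circumcised):.3f}")
```

The naive comparison shows a large ‘protective effect’ that is entirely an artefact of who selects into circumcision; conditioning on the confounder makes it vanish.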

Austerity — a class-specific put-option

10 March 2015 at 10:09 | Published in Economics, Politics & Society | 1 comment

When you bail out a bank or a banking system, you are not just bailing the bankers. You are bailing the savers, the pensions, the mortgages, the derivatives written on these loans and annuities, and all the rest that constitute the bank’s assets, which are your liabilities and vice versa. So when governments bail banks they are simultaneously bailing the assets and incomes of the top 30 percent of the income distribution …

The cost of exercising the put-option is paid for by people who don’t have many such assets and rely on government spending and public goods, but that’s what gets cut. The poorest segment of society is forced to pay out on an insurance policy that they never agreed to guarantee, and for which they never received a single insurance premium from the holders of the bailed (i.e. insured) assets. This is why austerity is best thought of as a class-specific put-option. It’s free asset insurance for the top end of the income distribution, those who also just happen to be the people that vote most and fund elections.

Mark Blyth

So much for ‘expansionary austerity’ solutions

10 March 2015 at 09:18 | Published in Economics, Politics & Society | 5 comments

Unemployment rates in Europe, Japan and US
EU-28, EA-18, US and Japan, seasonally adjusted, January 2000 – January 2015.

Source: Eurostat

If this is recovery for Europe, well, I’ll be dipped! Some years ago unemployment rates at these levels were considered totally unacceptable. And then came the Reagan–Thatcher shift: price stability was everything, and being unemployed was something people freely chose to be …

When should we — really — care about public debt?

9 March 2015 at 19:28 | Published in Economics | 2 comments

Nick Rowe has a silly question for those who oppose austerity … would you still advocate fiscal stimulus in a liquidity trap (with interest rates stuck at some lower bound – the ZLB) if government debt was ten times annual GDP? …

There are four main potential costs associated with high government debt. The first is that, by generating high real interest rates, it crowds out private capital. However at the ZLB long term real interest rates are likely to be low, not high. Second, paying the interest on that debt requires higher distortionary taxes … However if there is an output gap the possibility that people are not supplying labour because income taxes are too high is not a current problem either.

A third issue with debt is the ‘burden on future generations’. How real that is or not, dealing with excessive debt is going to screw the current generation (who have to suffer the higher taxes or lower spending to get debt down), so asking them to also suffer continuing unemployment is hardly fair.

The final problem is that the markets might suddenly take fright that the tax burden implied by the debt is too large in political terms, and as a result the government may default …

Suppose … the funding does dry up. You have your own independent central bank, so you print the money to cover the stimulus and any debt rollover required. That might require a lot of money creation – perhaps as much as central banks have actually undertaken as a result of Quantitative Easing (QE)! Just as with QE, the world does not fall in. Will that not lead to massive inflation? No, for exactly the same reason QE does not. The moment the output gap has been eliminated, and interest rates are off the ZLB, you can start the austerity programme that begins to roll back money creation. That stops the output gap becoming positive and therefore stops inflation …

So I think the answer to Nick’s question is not the answer he thinks. The logic is that every time and whatever the numbers you first eliminate the output gap and get off the ZLB. Only when that is done do you start taking action to reduce deficits.

Simon Wren-Lewis

I notice again and again that on many macroeconomic policy issues I find myself in agreement with people like Wren-Lewis and Krugman. To me that just shows that they are right in spite of and not thanks to the kind of neoclassical models they ultimately refer to. When discussing austerity measures, Ricardian equivalence, euro problems, or public debt, they actually, as far as I can see, are not using those models, but rather (even) simpler and more adequate and relevant thought-constructions much more in the vein of Keynes.

So — has Wren-Lewis come around to MMT-style functional finance? Probably not, but it sure sounds a lot like Abba Lerner …

One of the most effective ways of clearing up this most serious of all semantic confusions is to point out that private debt differs from national debt in being external. It is owed by one person to others. That is what makes it burdensome. Because it is interpersonal the proper analogy is not to national debt but to international debt…. But this does not hold for national debt which is owed by the nation to citizens of the same nation. There is no external creditor. We owe it to ourselves.

A variant of the false analogy is the declaration that national debt puts an unfair burden on our children, who are thereby made to pay for our extravagances. Very few economists need to be reminded that if our children or grandchildren repay some of the national debt these payments will be made to our children or grandchildren and to nobody else. Taking them altogether they will no more be impoverished by making the repayments than they will be enriched by receiving them.

Abba Lerner The Burden of the National Debt (1948)

‘Sizeless science’ and the cult of significance testing

9 March 2015 at 15:06 | Published in Statistics & Econometrics | Comments off on ‘Sizeless science’ and the cult of significance testing

A couple of years ago yours truly had an interesting luncheon discussion with Deirdre McCloskey on her controversy with Kevin Hoover on significance testing. It got me thinking about where the fetish status of significance testing comes from and why we are still teaching and practising it without serious qualifications despite its obvious inadequacies.

A non-trivial part of teaching statistics consists of teaching students to perform significance testing. A problem I have noticed repeatedly over the years, however, is that no matter how carefully you try to explicate what the probabilities generated by these statistical tests – p-values – really are, most students still misinterpret them.

Giving a statistics course for the Swedish National Research School in History, I asked the students at the exam to explain how one should correctly interpret p-values. Although the correct definition is p(data|null hypothesis), a majority of the students either misinterpreted the p-value as the likelihood of a sampling error (which is wrong, since the very computation of the p-value is based on the assumption that sampling error is what causes the sample statistic not to coincide with the null hypothesis), or took it to be the probability of the null hypothesis being true, given the data (a case of the fallacy of transposing the conditional, and also wrong, since that is p(null hypothesis|data) rather than the correct p(data|null hypothesis)).

This is not to be blamed on the students’ ignorance, but rather on significance testing not being particularly transparent (conditional-probability inference is difficult even for those of us who teach and practise it). A lot of researchers fall prey to the same mistakes. So — given that it is in any case very unlikely that any population parameter is exactly zero, and that, contrary to assumption, most samples in social science and economics are not random and do not have the right distributional shape — why continue to press students and researchers to do null hypothesis significance testing, testing that relies on a weird backward logic that students and researchers usually don’t understand?
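To make the first of these confusions concrete, here is a small simulation sketch. The share of true nulls, the effect size and the sample size are all assumptions of mine, chosen only to show that the probability of the null being true given a ‘significant’ result need not be anywhere near the p-value threshold:

```python
# A toy simulation of the distinction drawn above: the p-value is
# p(data | null), not p(null | data). The share of true nulls, the effect
# size and the sample size are all assumptions, picked only to show that
# the two probabilities can be very different.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
studies, n = 100_000, 30
effect, share_true_nulls = 0.5, 0.8

null_true = rng.random(studies) < share_true_nulls
true_means = np.where(null_true, 0.0, effect)

# one-sample t-test of H0: mean = 0, run for every simulated study
data = rng.normal(loc=true_means[:, None], scale=1.0, size=(studies, n))
_, p = stats.ttest_1samp(data, 0.0, axis=1)

significant = p < 0.05
print(f"share of true nulls among 'significant' results: "
      f"{null_true[significant].mean():.2f}")   # ≈ 0.2, not 0.05
```

In this toy world roughly one in five ‘significant’ findings comes from a true null, a number that has nothing to do with the 0.05 cut-off, since it depends on the prior share of true nulls and on statistical power, neither of which the p-value knows anything about.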

Reviewing Deirdre’s and Stephen Ziliak’s The Cult of Statistical Significance (University of Michigan Press 2008), mathematical statistician Olle Häggström succinctly summarizes what the debate is all about:

Stephen Ziliak and Deirdre McCloskey claim in their recent book The Cult of Statistical Significance [ZM] that the reliance on statistical methods has gone too far and turned into a ritual and an obstacle to scientific progress.

A typical situation is the following. A scientist formulates a null hypothesis. By means of a significance test, she tries to falsify it. The analysis leads to a p-value, which indicates how likely it would have been, if the null hypothesis were true, to obtain data at least as extreme as those she actually got. If the p-value is below a certain prespecified threshold (typically 0.01 or 0.05), the result is deemed statistically significant, which, although far from constituting a definite disproof of the null hypothesis, counts as evidence against it.

Imagine now that a new drug for reducing blood pressure is being tested and that the fact of the matter is that the drug does have a positive effect (as compared with a placebo) but that the effect is so small that it is of no practical relevance to the patient’s health or well-being. If the study involves sufficiently many patients, the effect will nevertheless with high probability be detected, and the study will yield statistical significance. The lesson to learn from this is that in a medical study, statistical significance is not enough—the detected effect also needs to be large enough to be medically significant. Likewise, empirical studies in economics (or psychology, geology, etc.) need to consider not only statistical significance but also economic (psychological, geological, etc.) significance.

A major point in The Cult of Statistical Significance is the observation that many researchers are so obsessed with statistical significance that they neglect to ask themselves whether the detected discrepancies are large enough to be of any subject-matter significance. Ziliak and McCloskey call this neglect sizeless science …

The Cult of Statistical Significance is written in an entertaining and polemical style. Sometimes the authors push their position a bit far, such as when they ask themselves: “If null-hypothesis significance testing is as idiotic as we and its other critics have so long believed, how on earth has it survived?” (p. 240). Granted, the single-minded focus on statistical significance that they label sizeless science is bad practice. Still, to throw out the use of significance tests would be a mistake, considering how often it is a crucial tool for concluding with confidence that what we see really is a pattern, as opposed to just noise. For a data set to provide reasonable evidence of an important deviation from the null hypothesis, we typically need both statistical and subject-matter significance.
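Häggström’s blood-pressure example is equally easy to reproduce in a toy simulation (every number below is invented): the assumed ‘true’ effect is a clinically irrelevant 0.2 mmHg, yet with enough patients it becomes as statistically significant as one likes:

```python
# A toy version of the blood-pressure example: the assumed true effect is a
# clinically irrelevant 0.2 mmHg, yet with enough patients per arm the
# difference becomes 'statistically significant' at any conventional level.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
true_effect, sd = 0.2, 10.0            # mmHg; all numbers invented

for n in (100, 10_000, 1_000_000):     # patients per arm
    treated = rng.normal(-true_effect, sd, n)   # change in blood pressure
    placebo = rng.normal(0.0, sd, n)
    _, p = stats.ttest_ind(treated, placebo)
    print(f"n per arm = {n:>9,}:  estimated effect = "
          f"{placebo.mean() - treated.mean():5.2f} mmHg,  p = {p:.3g}")
```

Statistical significance here mainly measures the size of the sample, not the size, let alone the importance, of the effect.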

Statistical significance doesn’t say that something is important or true. Although Häggström has a point in his last remark, I still think – since far better and more relevant tests can already be done (see e.g. my posts here and here) – it is high time to reconsider the proper function of what has now really become a statistical fetish.

The Latvian experience shows why austerity sucks

8 March 2015 at 17:16 | Published in Economics | Comments off on The Latvian experience shows why austerity sucks

 

