Brad DeLong is wrong on realism and inference to the best explanation

31 August, 2015 at 13:43 | Posted in Theory of Science & Methodology | 4 Comments

Brad DeLong has a new post up where he gets critical about scientific realism and inference to the best explanation:

Daniel Little: The Case for Realism in the Social Realm:

“The case for scientific realism in the case of physics is a strong one…

The theories… postulate unobservable entities, forces, and properties. These hypotheses… are not individually testable, because we cannot directly observe or measure the properties of the hypothetical entities. But the theories as wholes have a great deal of predictive and descriptive power, and they permit us to explain and predict a wide range of physical phenomena. And the best explanation of the success of these theories is that they are true: that the world consists of entities and forces approximately similar to those hypothesized in physical theory. So realism is an inference to the best explanation…”

“WTF?!” is the only reaction I can have when I read Daniel Little.

Ptolemy’s epicycles are a very good model of planetary motion–albeit not as good as General Relativity. Nobody believes that epicycles are real …

There is something there. But just because your theory is good does not mean that the entities in your theory are “really there”, whatever that might mean…

Although Brad sounds upset, I can’t really see any good reasons why.

In a time when scientific relativism is expanding, it is important to keep up the claim that science should not be reduced to a purely discursive level. We have to maintain the Enlightenment tradition of thinking of reality as principally independent of our views of it, and of the main task of science as studying the structure of this reality. Perhaps the most important contribution a researcher can make is to reveal what this reality that is the object of science actually looks like.

Science is made possible by the fact that there are structures that are durable and are independent of our knowledge or beliefs about them. There exists a reality beyond our theories and concepts of it. It is this independent reality that our theories in some way deal with. Contrary to positivism, I would as a critical realist argue that the main task of science is not to detect event-regularities between observed facts. Rather, that task must be conceived as identifying the underlying structure and forces that produce the observed events.

In a truly wonderful essay – chapter three of Error and Inference (Cambridge University Press, 2010, eds. Deborah Mayo and Aris Spanos) – Alan Musgrave gives strong arguments why scientific realism and inference to the best explanation are the best alternatives for explaining what’s going on in the world we live in:

For realists, the name of the scientific game is explaining phenomena, not just saving them. Realists typically invoke ‘inference to the best explanation’ or IBE …

IBE is a pattern of argument that is ubiquitous in science and in everyday life as well. van Fraassen has a homely example:
“I hear scratching in the wall, the patter of little feet at midnight, my cheese disappears – and I infer that a mouse has come to live with me. Not merely that these apparent signs of mousely presence will continue, not merely that all the observable phenomena will be as if there is a mouse, but that there really is a mouse.” (1980: 19-20)
Here, the mouse hypothesis is supposed to be the best explanation of the phenomena, the scratching in the wall, the patter of little feet, and the disappearing cheese.
What exactly is the inference in IBE, what are the premises, and what the conclusion? van Fraassen says “I infer that a mouse has come to live with me”. This suggests that the conclusion is “A mouse has come to live with me” and that the premises are statements about the scratching in the wall, etc. Generally, the premises are the things to be explained (the explanandum) and the conclusion is the thing that does the explaining (the explanans). But this suggestion is odd. Explanations are many and various, and it will be impossible to extract any general pattern of inference taking us from explanandum to explanans. Moreover, it is clear that inferences of this kind cannot be deductively valid ones, in which the truth of the premises guarantees the truth of the conclusion. For the conclusion, the explanans, goes beyond the premises, the explanandum. In the standard deductive model of explanation, we infer the explanandum from the explanans, not the other way around – we do not deduce the explanatory hypothesis from the phenomena, rather we deduce the phenomena from the explanatory hypothesis …

The intellectual ancestor of IBE is Peirce’s abduction, and here we find a different pattern:

The surprising fact, C, is observed.
But if A were true, C would be a matter of course.
Hence, … A is true.
(C. S. Peirce, 1931-58, Vol. 5: 189)

Here the second premise is a fancy way of saying “A explains C”. Notice that the explanatory hypothesis A figures in this second premise as well as in the conclusion. The argument as a whole does not generate the explanans out of the explanandum. Rather, it seeks to justify the explanatory hypothesis …

Abduction is deductively invalid … IBE attempts to improve upon abduction by requiring that the explanation is the best explanation that we have. It goes like this:

F is a fact.
Hypothesis H explains F.
No available competing hypothesis explains F as well as H does.
Therefore, H is true
(William Lycan, 1985: 138)

This is better than abduction, but not much better. It is also deductively invalid …

There is a way to rescue abduction and IBE. We can validate them without adding missing premises that are obviously false, so that we merely trade obvious invalidity for equally obvious unsoundness. Peirce provided the clue to this. Peirce’s original abductive scheme was not quite what we have considered so far. Peirce’s original scheme went like this:

The surprising fact, C, is observed.
But if A were true, C would be a matter of course.
Hence, there is reason to suspect that A is true.
(C. S. Peirce, 1931-58, Vol. 5: 189)

This is obviously invalid, but to repair it we need the missing premise “There is reason to suspect that any explanation of a surprising fact is true”. This missing premise is, I suggest, true. After all, the epistemic modifier “There is reason to suspect that …” weakens the claims considerably. In particular, “There is reason to suspect that A is true” can be true even though A is false. If the missing premise is true, then instances of the abductive scheme may be both deductively valid and sound.

IBE can be rescued in a similar way. I even suggest a stronger epistemic modifier, not “There is reason to suspect that …” but rather “There is reason to believe (tentatively) that …” or, equivalently, “It is reasonable to believe (tentatively) that …” What results, with the missing premise spelled out, is:

It is reasonable to believe that the best available explanation of any fact is true.
F is a fact.
Hypothesis H explains F.
No available competing hypothesis explains F as well as H does.
Therefore, it is reasonable to believe that H is true.

This scheme is valid and instances of it might well be sound. Inferences of this kind are employed in the common affairs of life, in detective stories, and in the sciences.

Of course, to establish that any such inference is sound, the ‘explanationist’ owes us an account of when a hypothesis explains a fact, and of when one hypothesis explains a fact better than another hypothesis does. If one hypothesis yields only a circular explanation and another does not, the latter is better than the former. If one hypothesis has been tested and refuted and another has not, the latter is better than the former. These are controversial issues, to which I shall return. But they are not the most controversial issue – that concerns the major premise. Most philosophers think that the scheme is unsound because this major premise is false, whatever account we can give of explanation and of when one explanation is better than another. So let me assume that the explanationist can deliver on the promises just mentioned, and focus on this major objection.

People object that the best available explanation might be false. Quite so – and so what? It goes without saying that any explanation might be false, in the sense that it is not necessarily true. It is absurd to suppose that the only things we can reasonably believe are necessary truths.

What if the best explanation not only might be false, but actually is false? Can it ever be reasonable to believe a falsehood? Of course it can. Suppose van Fraassen’s mouse explanation is false, that a mouse is not responsible for the scratching, the patter of little feet, and the disappearing cheese. Still, it is reasonable to believe it, given that it is our best explanation of those phenomena. Of course, if we find out that the mouse explanation is false, it is no longer reasonable to believe it. But what we find out is that what we believed was wrong, not that it was wrong or unreasonable for us to have believed it.

People object that being the best available explanation of a fact does not prove something to be true or even probable. Quite so – and again, so what? The explanationist principle – “It is reasonable to believe that the best available explanation of any fact is true” – means that it is reasonable to believe or think true things that have not been shown to be true or probable, more likely true than not.
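
Musgrave’s point that the repaired scheme is deductively valid can be spelled out formally. Here is a minimal sketch of my own (not Musgrave’s own formalization), written in Lean, where the major premise is supplied as an explicit hypothesis and the conclusion follows by a single application of it:

```lean
-- A minimal formal sketch of Musgrave's repaired IBE scheme (my own rendering, not his).
-- With the major premise supplied, the conclusion follows deductively from the premises.
example (Fact Hyp : Type)
    (explains : Hyp → Fact → Prop)        -- "hypothesis H explains fact F"
    (bestAvailable : Hyp → Fact → Prop)   -- "no available competitor explains F as well as H"
    (reasonable : Hyp → Prop)             -- "it is reasonable to believe (tentatively) that H is true"
    -- Major premise: it is reasonable to believe that the best available
    -- explanation of any fact is true.
    (major : ∀ (H : Hyp) (F : Fact), explains H F → bestAvailable H F → reasonable H)
    -- Minor premises: F is a fact, H explains F, no available competitor explains F as well.
    (F : Fact) (H : Hyp)
    (h1 : explains H F) (h2 : bestAvailable H F) :
    reasonable H :=
  major H F h1 h2
```

The validity is trivial once the major premise is on the table; the philosophical work, as Musgrave says, lies in defending that premise and in spelling out when one explanation is better than another.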

I do appreciate when mainstream economists like Brad make an effort at doing some methodological-ontological-epistemological reflection. On this issue, unfortunately — although it’s always interesting and thought-provoking to read what Brad has to say — his arguments are too weak to warrant the negative stance on scientific realism and inference to the best explanation.

Unbiased econometric estimates? Forget it!

30 August, 2015 at 20:42 | Posted in Statistics & Econometrics | Leave a comment

Following our recent post on econometricians’ traditional privileging of unbiased estimates, there were a bunch of comments echoing the challenge of teaching this topic, since students as well as practitioners often seem to want the comfort of an absolute standard such as the best linear unbiased estimate or whatever. Commenters also discussed the tradeoff between bias and variance, and the idea that unbiased estimates can overfit the data.

I agree with all these things but I just wanted to raise one more point: In realistic settings, unbiased estimates simply don’t exist. In the real world we have nonrandom samples, measurement error, nonadditivity, nonlinearity, etc etc etc.

So forget about it. We’re living in the real world …


It’s my impression that many practitioners in applied econometrics and statistics think of their estimation choice kinda like this:

1. The unbiased estimate. It’s the safe choice, maybe a bit boring and maybe not the most efficient use of the data, but you can trust it and it gets the job done.

2. A biased estimate. Something flashy, maybe Bayesian, maybe not, it might do better but it’s risky. In using the biased estimate, you’re stepping off base—the more the bias, the larger your lead—and you might well get picked off …

If you take the choice above and combine it with the unofficial rule that statistical significance is taken as proof of correctness (in econ, this would also require demonstrating that the result holds under some alternative model specifications, but “p less than .05” is still key), then you get the following decision rule:

A. Go with the safe, unbiased estimate. If it’s statistically significant, run some robustness checks and, if the result doesn’t go away, stop.

B. If you don’t succeed with A, you can try something fancier. But . . . if you do that, everyone will know that you tried plan A and it didn’t work, so people won’t trust your finding.

So, in a sort of Gresham’s Law, all that remains is the unbiased estimate. But, hey, it’s safe, conservative, etc, right?

And that’s where the present post comes in. My point is that the unbiased estimate does not exist! There is no safe harbor. Just as we can never get our personal risks in life down to zero … there is no such thing as unbiasedness. And it’s a good thing, too: recognition of this point frees us to do better things with our data right away.

Andrew Gelman
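
Gelman’s bias–variance point is easy to see in a small simulation. The sketch below uses numbers of my own choosing (it is not Gelman’s code): with a noisy, small sample, a deliberately biased shrinkage estimator of a mean beats the unbiased sample mean on mean squared error.

```python
# A minimal simulation sketch (my own illustration, not Gelman's code) of the point that a
# deliberately biased (shrinkage) estimator can beat the unbiased one on mean squared error.
import numpy as np

rng = np.random.default_rng(0)

true_mean = 0.5          # assumed "true" parameter
sigma = 3.0              # noisy measurements
n = 10                   # small sample, as in many applied settings
reps = 20_000

unbiased_mse = 0.0
shrunk_mse = 0.0
for _ in range(reps):
    x = rng.normal(true_mean, sigma, size=n)
    xbar = x.mean()                  # the unbiased estimate of the mean
    shrunk = 0.5 * xbar              # biased: shrink halfway towards zero
    unbiased_mse += (xbar - true_mean) ** 2
    shrunk_mse += (shrunk - true_mean) ** 2

print("MSE, unbiased estimator:", unbiased_mse / reps)
print("MSE, shrunken estimator:", shrunk_mse / reps)
# With these assumed numbers the shrunken estimator's extra bias is more than paid for
# by its lower variance.
```

None of this makes shrinkage a free lunch, but it does show that “unbiased” is not the same thing as “safe”.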

Kvinde min (private)

30 August, 2015 at 13:11 | Posted in Varia | Leave a comment


As always, for you, Jeanette Meyer

‘New Keynesian’ models are not too simple. They are just wrong.

30 August, 2015 at 11:30 | Posted in Economics | 3 Comments

Simon Wren-Lewis has a nice post discussing Paul Romer’s critique of macro. In Simon’s words:

“It is hard to get academic macroeconomists trained since the 1980s to address [large scale Keynesian models], because they have been taught that these models and techniques are fatally flawed because of the Lucas critique and identification problems … But DSGE models as a guide for policy are also fatally flawed because they are too simple. The unique property that DSGE models have is internal consistency … Take a DSGE model, and alter a few equations so that they fit the data much better, and you have what could be called a structural econometric model. It is internally inconsistent, but because it fits the data better it may be a better guide for policy.”

Nope! Not too simple. Just wrong!

I disagree with Simon. NK models are not too simple. They are simply wrong. There are no ‘frictions’. There is no Calvo Fairy. There are simply persistent nominal beliefs.

Period.

Roger Farmer

Yes indeed. There really is something about the way macroeconomists construct their models nowadays that obviously doesn’t sit right.

Empirical evidence only plays a minor role in neoclassical mainstream economic theory, where models largely function as a substitute for empirical evidence. One might have hoped that, humbled by the manifest failure of its theoretical pretences during the latest economic-financial crisis, economics would let its one-sided, almost religious insistence on axiomatic-deductivist modeling as the only scientific activity worth pursuing give way to methodological pluralism based on ontological considerations rather than formalistic tractability. That has, so far, not happened.

Fortunately — when you’ve got tired of the kind of macroeconomic apologetics produced by “New Keynesian” macroeconomists and other DSGE modellers — there still are some real Keynesian macroeconomists to read. One of them — Axel Leijonhufvud — writes:

For many years now, the main alternative to Real Business Cycle Theory has been a somewhat loose cluster of models given the label of New Keynesian theory. New Keynesians adhere on the whole to the same DSGE modeling technology as RBC macroeconomists but differ in the extent to which they emphasise inflexibilities of prices or other contract terms as sources of short-term adjustment problems in the economy. The “New Keynesian” label refers back to the “rigid wages” brand of Keynesian theory of 40 or 50 years ago. Except for this stress on inflexibilities this brand of contemporary macroeconomic theory has basically nothing Keynesian about it …

I conclude that dynamic stochastic general equilibrium theory has shown itself an intellectually bankrupt enterprise. But this does not mean that we should revert to the old Keynesian theory that preceded it (or adopt the New Keynesian theory that has tried to compete with it). What we need to learn from Keynes … is how to view our responsibilities and how to approach our subject.

If macroeconomic models — no matter of what ilk — build on microfoundational assumptions of representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, then the warrant for bridging the models’ conclusions and hypotheses about causally relevant mechanisms or regularities over to the real world is obviously lacking. Incompatibility between actual behaviour and the behaviour in macroeconomic models building on representative actors and rational-expectations microfoundations is not a symptom of “irrationality”. It rather shows the futility of trying to represent real-world target systems with models flagrantly at odds with reality.

A gadget is just a gadget — and no matter how brilliant the silly DSGE models you come up with, they do not help us work on the fundamental issues of modern economies. Using DSGE models only confirms Robert Gordon’s dictum that today

rigor competes with relevance in macroeconomic and monetary theory, and in some lines of development macro and monetary theorists, like many of their colleagues in micro theory, seem to consider relevance to be more or less irrelevant.

Funeral Ikos

30 August, 2015 at 09:31 | Posted in Varia | Leave a comment

 

If thou hast shown mercy
unto man, o man,
that same mercy
shall be shown thee there;
and if on an orphan
thou hast shown compassion,
that same shall there
deliver thee from want.
If in this life
the naked thou hast clothed,
the same shall give thee
shelter there,
and sing the psalm:
Alleluia.

A life without the music of people like John Tavener and Arvo Pärt would be unimaginable.

What can economists know?

29 August, 2015 at 12:15 | Posted in Economics | 4 Comments

The early concerns voiced by such critics as Keynes and Hayek, while they may indeed have been exaggerated, were not misplaced. I believe that much of the difficulty economists have encountered over the past fifty years can be traced to the fact that the economic environments we seek to model are sometimes too messy to be fitted into the mold of a well-behaved, complete model of the standard kind. It is not generally the case that some sharp dividing line separates a set of important systematic influences that we can measure, proxy, or control for, from the many small unsystematic influences that we can bundle into a ‘noise’ term. So when we set out to test economic theories in the framework of the standard paradigm, we face quite serious and deep-seated difficulties. The problem of model selection may be such that the embedded test ends up being inconclusive, or unpersuasive.

Sutton’s Gaston Eyskens Lectures forcefully show what a gross misapprehension it is to — as most mainstream economists do today — hold the view that criticisms of econometrics are the conclusions of sadly misinformed and misguided people who dislike and do not understand much of it. To be careful and cautious is not the same as to dislike. As any perusal of the mathematical-statistical and philosophical works of people like David Freedman, Rudolf Kalman, John Maynard Keynes, and Tony Lawson shows, the critique is put forward by respected authorities. I would argue, against “common knowledge” and in line with Sutton, that they do not misunderstand the crucial issues at stake in the development of econometrics. Quite the contrary. They know them all too well — and are not satisfied with the validity and philosophical underpinning of the assumptions made for applying its methods.

Although advances have been made using a modern empiricist approach in modern econom(etr)ics, there are still some unsolved “problematics” with its epistemological and ontological presuppositions. There is, e.g., an implicit assumption that the data generating process (DGP) fundamentally has an invariant property, and that models that are structurally unstable just have not been able to get hold of that invariance. But, as Keynes already maintained, one cannot just presuppose or take for granted that kind of invariance. It has to be argued and justified. Grounds have to be given for viewing reality as satisfying conditions of model-closure. It is as if the lack of closure that shows up in the form of structurally unstable models somehow could be solved by searching for more autonomous and invariable “atomic uniformity”. But whether reality is “congruent” with this analytical prerequisite has to be argued for, and not simply taken for granted.
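
What a non-invariant data generating process does to an estimated “structural” parameter can be illustrated with a toy example (the numbers and the regime shift below are invented purely for illustration):

```python
# A small sketch, using invented numbers, of the invariance problem: if the data generating
# process itself shifts, a single estimated 'structural' parameter is an artefact of the sample.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)

beta_early, beta_late = 0.2, 1.5          # assumed regime shift halfway through the sample
y = np.where(np.arange(n) < n // 2, beta_early * x, beta_late * x) + rng.normal(scale=0.5, size=n)

b_first = np.polyfit(x[: n // 2], y[: n // 2], 1)[0]
b_second = np.polyfit(x[n // 2 :], y[n // 2 :], 1)[0]
b_pooled = np.polyfit(x, y, 1)[0]

print(f"slope, first half : {b_first:.2f}")    # close to 0.2
print(f"slope, second half: {b_second:.2f}")   # close to 1.5
print(f"slope, pooled     : {b_pooled:.2f}")   # a blend that describes neither regime
```

The pooled slope is not a structural constant of anything; it reflects how much of each regime happens to be in the sample, which is exactly the kind of closure that has to be argued for rather than assumed.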

Even granted that closures come in degrees, we should not compromise on ontology. Some methods simply introduce improper closures, closures that make the disjuncture between models and real world target systems inappropriately large. Garbage in, garbage out.

Underlying the search for these immutable “fundamentals” lies the implicit view of the world as consisting of material entities with their own separate and invariable effects. These entities are thought of as being able to be treated as separate and additive causes, thereby making it possible to infer complex interaction from knowledge of individual constituents with limited independent variety. But whether this is a justified analytical procedure cannot be answered without confronting it with the nature of the objects the models are supposed to describe, explain or predict. Keynes himself thought it generally inappropriate to apply the “atomic hypothesis” to such an open and “organic entity” as the real world. As far as I can see these are still appropriate strictures all econometric approaches have to face. Grounds for believing otherwise have to be provided by the econometricians.

Trygve Haavelmo, the “father” of modern probabilistic econometrics, wrote that he and other econometricians could not “build a complete bridge between our models and reality” by logical operations alone, but finally had to make “a non-logical jump.” Part of that jump consisted in the fact that econometricians “like to believe … that the various a priori possible sequences would somehow cluster around some typical time shapes, which if we knew them, could be used for prediction.” But why the “logically conceivable” really should turn out to be the case is difficult to see. At least if we are not satisfied by sheer hope. Keynes, as already noted, reacted against using unargued-for and unjustified assumptions of complex structures in an open system being reducible to those of individuals. In real economies it is unlikely that we find many “autonomous” relations and events.

Has macroeconomics — really — progressed?

25 August, 2015 at 09:32 | Posted in Economics | 3 Comments

A typical DSGE model has a key property that from my work seems wrong. A good example is the model in Galí and Gertler (2007). In this model a positive price shock—a ‘‘cost push” shock — is explosive unless the Fed raises the nominal interest rate more than the increase in the inflation rate. In other words, positive price shocks with the nominal interest rate held constant are expansionary (because the real interest rate falls). In my work, however, they are contractionary. If there is a positive price shock like an oil price increase, nominal wages lag output prices, and so the real wage initially falls. This has a negative effect on consumption. In addition, household real wealth falls because nominal asset prices don’t initially rise as much as the price level. This has a negative effect on consumption through a wealth effect. There is little if any offset from lower real interest rates because households appear to respond more to nominal rates than to real rates. Positive price shocks are thus contractionary even if the Fed keeps the nominal interest rate unchanged. This property is important for a monetary authority in deciding how to respond to a positive price shock. If the authority used the Galí and Gertler (2007) model, it would likely raise the nominal interest rate too much thinking that the price shock is otherwise expansionary. Typical DSGE models are thus likely to be misleading for guiding monetary policy if this key property of the models is wrong.

Ray C. Fair
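
The real-rate channel that Fair contrasts his results with can be put in numbers. A toy calculation of my own (the figures are purely illustrative, not Fair’s) using the Fisher approximation r ≈ i − π:

```python
# A toy calculation (numbers are mine, not Fair's) of the mechanism at issue: with the nominal
# rate held constant, a positive price shock lowers the real rate, which is what makes the shock
# expansionary in a Gali-Gertler type model.
nominal_rate = 0.04
inflation_before = 0.02
inflation_after = 0.05        # assumed cost-push shock

real_before = nominal_rate - inflation_before   # Fisher approximation: r = i - pi
real_after = nominal_rate - inflation_after

print(f"real rate before shock: {real_before:+.2%}")   # +2.00%
print(f"real rate after shock : {real_after:+.2%}")    # -1.00%
# Fair's point is that the wage-lag and wealth effects he estimates dominate this
# real-rate channel, so the shock is contractionary even with the nominal rate unchanged.
```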

Krugman is right — public debt is good!

22 August, 2015 at 11:13 | Posted in Economics | 9 Comments

The U.S. economy has, on the whole, done pretty well these past 180 years, suggesting that having the government owe the private sector money might not be all that bad a thing. The British government, by the way, has been in debt for more than three centuries, an era spanning the Industrial Revolution, victory over Napoleon, and more.

But is the point simply that public debt isn’t as bad as legend has it? Or can government debt actually be a good thing?

Believe it or not, many economists argue that the economy needs a sufficient amount of public debt out there to function well. And how much is sufficient? Maybe more than we currently have. That is, there’s a reasonable argument to be made that part of what ails the world economy right now is that governments aren’t deep enough in debt.

Paul Krugman

Indeed.

Krugman is absolutely right.

Why?

Through history public debts have gone up and down, often expanding in periods of war or large changes in basic infrastructure and technologies, and then going down in periods when things have settled down.

The pros and cons of public debt have been put forward for as long as the phenomenon itself has existed, but it has, notwithstanding that, not been possible to reach anything close to consensus on the issue — at least not in a long time-horizon perspective. As a rule, there has not even been agreement on whether public debt is a problem, and if so — when it is one and how best to tackle it. Some of the more prominent reasons for this non-consensus are the complexity of the issue, the mingling of vested interests, ideology, psychological fears, the uncertainty of calculating and estimating inter-generational effects, etc., etc.

In classical economics — following in the footsteps of David Hume – especially Adam Smith, David Ricardo, and Jean-Baptiste Say put forward views on public debt that were as a rule negative. The good budget was a balanced budget. If government borrowed money to finance its activities, it would only “crowd out” private enterprise and investment. The state was generally considered incapable of paying its debts, and the real burden would therefore essentially fall on the taxpayers, who ultimately had to pay for the irresponsibility of government. The moral character of the argumentation was a salient feature — according to Hume, “either the nation must destroy public credit, or the public credit will destroy the nation.”

Later on in the 20th century, economists like John Maynard Keynes, Abba Lerner and Alvin Hansen would hold a more positive view on public debt. Public debt was normally nothing to fear, especially if it was financed within the country itself (but even foreign loans could be beneficial for the economy if invested in the right way). Some members of society would hold bonds and earn interest on them, while others would have to pay the taxes that ultimately paid the interest on the debt. But the debt was not considered a net burden for society as a whole, since the debt cancelled itself out between the two groups. If the state could issue bonds at a low interest rate, unemployment could be reduced without necessarily resulting in strong inflationary pressure. And the inter-generational burden was no real burden according to this group of economists, since — if the debt was used in a suitable way — future generations would, through its effects on investments and employment, actually be net winners. There could, of course, be unwanted negative distributional side effects for future generations, but that was mostly considered a minor problem since, as Lerner put it, “if our children or grandchildren repay some of the national debt these payments will be made to our children and grandchildren and to nobody else.”

Central to the Keynesian-influenced view is the fundamental difference between private and public debt. Conflating the one with the other is an example of the atomistic fallacy, which is basically a variation on Keynes’ savings paradox. If an individual tries to save and cut down on debts, that may be fine and rational, but if everyone tries to do it, the result would be lower aggregate demand and increasing unemployment for the economy as a whole.
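
The savings paradox can be illustrated with a textbook Keynesian-cross calculation (the parameter values below are mine, chosen purely for illustration):

```python
# A minimal Keynesian-cross sketch (illustrative parameters only) of the savings paradox:
# if everyone tries to save more, income falls and aggregate saving need not rise.
def equilibrium_income(c0, c1, investment):
    """Goods-market equilibrium Y = c0 + c1*Y + I  =>  Y = (c0 + I) / (1 - c1)."""
    return (c0 + investment) / (1 - c1)

c1 = 0.8            # marginal propensity to consume
investment = 50.0   # assumed exogenous investment

for c0, label in [(20.0, "before: autonomous consumption 20"),
                  (10.0, "after : everyone tries to save 10 more")]:
    y = equilibrium_income(c0, c1, investment)
    saving = y - (c0 + c1 * y)          # private saving = income - consumption
    print(f"{label:45s} income = {y:6.1f}, saving = {saving:5.1f}")
# Income drops from 350 to 300 while aggregate saving stays at 50 (= investment):
# the attempt to save more shows up as lower demand and income, not higher saving.
```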

An individual always has to pay his debts. But a government can always pay back old debts with new ones, through the issue of new bonds. The state is not like an individual. Public debt is not like private debt. Government debt is essentially a debt to itself, to its citizens. Interest paid on the debt is paid by the taxpayers on the one hand, but on the other hand, interest on the bonds that finance the debt goes to those who lend out the money.

To both Keynes and Lerner it was evident that the state had the ability to promote full employment and a stable price level – and that it should use its powers to do so. If that meant that it had to take on debt and (more or less temporarily) underbalance its budget – so let it be! Public debt is neither good nor bad. It is a means to achieving two over-arching macroeconomic goals – full employment and price stability. What is sacred is not to have a balanced budget or to run down public debt per se, regardless of the effects on the macroeconomic goals. If “sound finance”, austerity and balanced budgets mean increased unemployment and destabilized prices, they have to be abandoned.

Now, against this reasoning, exponents of the thesis of Ricardian equivalence have maintained that whether the public sector finances its expenditures through taxes or by issuing bonds is inconsequential, since bonds must sooner or later be repaid by raising taxes in the future.

In the 1970s Robert Barro attempted to give the proposition a firm theoretical foundation, arguing that the substitution of a budget deficit for current taxes has no impact on aggregate demand and so budget deficits and taxation have equivalent effects on the economy.
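
The logic of the Barro argument can be seen in a stylized two-period calculation (my own illustrative numbers, not Barro’s): under the model’s assumptions, tax finance now and bond finance now with taxes later leave the household’s lifetime budget unchanged, so consumption and aggregate demand are unaffected.

```python
# A two-period sketch (stylized numbers of my own) of the Ricardo-Barro logic: under the model's
# assumptions, tax-financing now and bond-financing now with taxes later leave the household's
# lifetime budget - and hence its consumption plan - unchanged.
r = 0.03                      # assumed interest rate, the same for government and households
income = (100.0, 100.0)       # household income in periods 1 and 2
g_spending = 20.0             # government spending to be financed in period 1

# Case A: tax now.
taxes_a = (g_spending, 0.0)
# Case B: borrow now, repay with interest via a period-2 tax.
taxes_b = (0.0, g_spending * (1 + r))

def lifetime_resources(taxes):
    """Present value of after-tax income over the two periods."""
    return (income[0] - taxes[0]) + (income[1] - taxes[1]) / (1 + r)

print("lifetime resources, tax finance :", lifetime_resources(taxes_a))
print("lifetime resources, bond finance:", lifetime_resources(taxes_b))
# The two are identical - but only given perfect capital markets, a common interest rate,
# full rationality and an infinite (or dynastic) horizon.
```

The equality holds only because the household discounts at the same rate at which the government borrows, lives long enough (or cares enough about its heirs) to pay the future tax, and faces perfect capital markets, which is exactly where the objections that follow come in.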

The Ricardo-Barro hypothesis, with its view of public debt incurring a burden for future generations, is the dominant view among mainstream economists and politicians today. The rational people making up the actors in the model are assumed to know that today’s debts are tomorrow’s taxes. But one of the main problems with this standard neoclassical theory is that it doesn’t fit the facts.

From a more theoretical point of view, one may also strongly criticize the Ricardo-Barro model and its concomitant crowding-out assumption, since perfect capital markets do not exist, repayments of public debt can take place far into the future, and it is dubious whether we really care about generations 300 years from now.

Today there seems to be a rather widespread consensus that public debt is acceptable as long as it doesn’t increase too much and too fast. If the public debt-to-GDP ratio becomes higher than X %, the likelihood of a debt crisis and/or lower growth increases.

But in discussing within which margins public debt is feasible, the focus is solely on the upper limit of indebtedness, and very few ask whether there might also be a problem if public debt becomes too low.

The government’s ability to conduct an “optimal” public debt policy may be negatively affected if public debt becomes too small. A well-functioning secondary market in bonds requires a sufficient supply of bonds to trade. If turnover and liquidity in the secondary market become too small, increased volatility and uncertainty will in the long run lead to an increase in borrowing costs. Ultimately there is even a risk that market makers would disappear, leaving bond market trading to be operated solely through brokered deals. As a kind of precautionary measure against this eventuality it may be argued – especially in times of financial turmoil and crises — that it is necessary to increase government borrowing and debt to ensure – in a longer run – good borrowing preparedness and a sustained (government) bond market.

The question whether public debt is good, and whether we may actually have too little of it, is one of our time’s biggest questions. Giving the wrong answer to it — as Krugman notes — will be costly:

The great debt panic that warped the U.S. political scene from 2010 to 2012, and still dominates economic discussion in Britain and the eurozone, was even more wrongheaded than those of us in the anti-austerity camp realized.

Not only were governments that listened to the fiscal scolds kicking the economy when it was down, prolonging the slump; not only were they slashing public investment at the very moment bond investors were practically pleading with them to spend more; they may have been setting us up for future crises.

And the ironic thing is that these foolish policies, and all the human suffering they created, were sold with appeals to prudence and fiscal responsibility.

How evidence is treated in macroeconomics

21 August, 2015 at 12:14 | Posted in Economics | 10 Comments

“New Keynesian” macroeconomist Simon Wren-Lewis has a post up on his blog, discussing how evidence is treated in modern macroeconomics (emphasis added):

It is hard to get academic macroeconomists trained since the 1980s to address this question, because they have been taught that these models and techniques are fatally flawed because of the Lucas critique and identification problems. But DSGE models as a guide for policy are also fatally flawed because they are too simple. The unique property that DSGE models have is internal consistency. Take a DSGE model, and alter a few equations so that they fit the data much better, and you have what could be called a structural econometric model. It is internally inconsistent, but because it fits the data better it may be a better guide for policy.

Being able to model a credible world, a world that somehow could be considered real or similar to the real world, is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still possibly serve our pursuit of truth. But then they cannot be unrealistic or false in any way. The falsehood or unrealisticness has to be qualified (in terms of resemblance, relevance, etc.). At the very least, the minimalist demand on models in terms of credibility has to give way to a stronger epistemic demand of appropriate similarity and plausibility. One could of course also ask for a sensitivity or robustness analysis, but the credible world, even after having been tested for sensitivity and robustness, can still be far away from reality – and unfortunately often in ways we know are important. Robustness of claims in a model does not per se give a warrant for exporting the claims to real-world target systems.

Questions of external validity are important more specifically also when it comes to microfounded DSGE macromodels. It can never be enough that these models somehow are regarded as internally consistent. One always also has to pose questions of consistency with the data. Internal consistency without external validity is worth nothing.

Yours truly and people like Tony Lawson have for many years been urging economists to pay attention to the ontological foundations of their assumptions and models. Sad to say, economists have not paid much attention — and so modern economics has become increasingly irrelevant to the understanding of the real world.

Within mainstream economics internal validity is still everything and external validity nothing. Why anyone should be interested in that kind of theory and model is beyond imagination. As long as mainstream economists do not come up with any export-licenses for their theories and models to the real world in which we live, they really should not be surprised if people say that this is not science, but autism!

Since fully-fledged experiments on a societal scale as a rule are prohibitively expensive, ethically indefensible or unmanageable, economic theorists have to substitute experimenting with something else. To understand and explain relations between different entities in the real economy the predominant strategy is to build models and make things happen in these “analogue-economy models” rather than engineering things happening in real economies.

Formalistic deductive “Glasperlenspiel” can be very impressive and seductive. But in the realm of science it ought to be considered of little or no value to simply make claims about the model and lose sight of reality.

Neoclassical economics has long since given up on the real world and contents itself with proving things about thought-up worlds. Empirical evidence only plays a minor role in economic theory, where models largely function as a substitute for empirical evidence. One may hope that, humbled by the manifest failure of its theoretical pretences, economics will let the one-sided, almost religious insistence on axiomatic-deductivist modeling as the only scientific activity worth pursuing give way to methodological pluralism based on ontological considerations rather than formalistic tractability.

To have valid evidence is not enough. What economics needs is sound evidence. Why? Simply because the premises of a valid argument do not have to be true, whereas a sound argument is not only valid but also builds on premises that are true. Aiming only for validity, without soundness, sets the aspiration level of economics too low for developing a realist and relevant science.

Lucas’ caricature of economic science

20 August, 2015 at 19:39 | Posted in Economics | 4 Comments

As Lucas himself wrote in an autobiographical sketch for Lives of the Laureates, he was bewitched by the beauty and power of Samuelson’s Foundations of Economic Analysis when he read it the summer before starting his training as a graduate student at Chicago in 1960. Although it did not have the transformative effect on me that it had on Lucas, I greatly admire the Foundations, but regardless of whether Samuelson himself meant to suggest such an idea (which I doubt), it is absurd to draw this conclusion from it:

“I loved the Foundations. Like so many others in my cohort, I internalized its view that if I couldn’t formulate a problem in economic theory mathematically, I didn’t know what I was doing. I came to the position that mathematical analysis is not one of many ways of doing economic theory: It is the only way. Economic theory is mathematical analysis. Everything else is just pictures and talk.”

Oh, come on. Would anyone ever think that unless you can formulate the problem of whether the earth revolves around the sun or the sun around the earth mathematically, you don’t know what you are doing? …

Lucas … internalized the caricature he extracted from Samuelson’s Foundations: that mathematical analysis is the only legitimate way of doing economic theory, and that, in particular, the essence of macroeconomics consists in a combination of axiomatic formalism and philosophical reductionism (microfoundationalism). For Lucas, the only scientifically legitimate macroeconomic models are those that can be deduced from the axiomatized Arrow-Debreu-McKenzie general equilibrium model, with solutions that can be computed and simulated in such a way that the simulations can be matched up against the available macroeconomics time series on output, investment and consumption.

This was both bad methodology and bad science, restricting the formulation of economic problems to those for which mathematical techniques are available to be deployed in finding solutions. On the one hand, the rational-expectations assumption made finding solutions to certain intertemporal models tractable; on the other, the assumption was justified as being required by the rationality assumptions of neoclassical price theory.

David Glasner

Lucas’ hope of being able to mathematically model the economy as “a FORTRAN program” and “gain some confidence that the component parts of the program are in some sense reliable prior to running it” seems – from an ontological point of view – totally misdirected. The failure of the attempt to anchor the analysis in pure mathematics shows that if you neglect ontological considerations pertaining to the target system, reality ultimately returns with a vengeance when questions of bridging and exporting the mathematical model exercises are at last laid on the table. No matter how precise and rigorous the analysis is, and no matter how hard one tries to cast the argument in “modern mathematical form”, such model exercises do not push science forward one millimetre if they do not stand the acid test of relevance to the target. No matter how rigorous the inferences delivered inside these models are, they do not per se say anything about their external validity.
