Model validation and significance testing

17 Apr, 2015 at 10:28 | Posted in Economics | Comments Off on Model validation and significance testing

In its standard form, a significance test is not the kind of “severe test” we are looking for when trying to confirm or disconfirm empirical scientific hypotheses. This is problematic for many reasons, one being a strong tendency to accept null hypotheses simply because they cannot be rejected at the standard 5% significance level. In their standard form, significance tests thus bias against new hypotheses by making it hard to disconfirm the null.

And as shown over and over again when it is applied, people have a tendency to read “not disconfirmed” as “probably confirmed.” Standard scientific methodology tells us that when there is only, say, a 10% probability that pure sampling error could account for the observed difference between the data and the null hypothesis, it would be more “reasonable” to conclude that we have a case of disconfirmation. Especially if we perform many independent tests of our hypothesis and they all give about the same 10% result as our reported one, I guess most researchers would count the hypothesis as even more disconfirmed.
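That intuition — several independent tests each near the 10% level jointly amounting to strong disconfirmation — can be made precise with Fisher’s method for combining p-values. A minimal sketch (the five identical p-values are my own hypothetical illustration):

```python
import math

def fisher_combined_p(pvalues):
    """Fisher's method: under the null, -2 * sum(log p_i) follows a
    chi-squared distribution with 2k degrees of freedom for k
    independent tests. For even degrees of freedom the chi-squared
    survival function has a closed form, so no stats library is needed."""
    k = len(pvalues)
    stat = -2.0 * sum(math.log(p) for p in pvalues)
    x = stat / 2.0
    # P(chi2 with 2k df > stat) = exp(-x) * sum_{i=0}^{k-1} x^i / i!
    return math.exp(-x) * sum(x**i / math.factorial(i) for i in range(k))

# Five independent tests, each individually "not significant" at 5%:
p_combined = fisher_combined_p([0.10] * 5)
print(f"combined p-value: {p_combined:.4f}")  # about 0.011 -- jointly significant
```

Five results that each fall just short of the conventional threshold combine, on Fisher’s reckoning, to a p-value near 0.01 — which is exactly the sense in which repeated 10% results make the hypothesis “even more disconfirmed.”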

Most importantly — we should never forget that the underlying parameters we use when performing significance tests are model constructions. Our p-values mean next to nothing if the model is wrong. As the eminent mathematical statistician David Freedman writes:

I believe model validation to be a central issue. Of course, many of my colleagues will be found to disagree. For them, fitting models to data, computing standard errors, and performing significance tests is “informative,” even though the basic statistical assumptions (linearity, independence of errors, etc.) cannot be validated. This position seems indefensible, nor are the consequences trivial. Perhaps it is time to reconsider.
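Freedman’s point is easy to demonstrate. The sketch below (my own construction, not Freedman’s) runs a regression whose independence-of-errors assumption is violated: the true slope is exactly zero, yet the nominal 5% significance test rejects far too often, so the reported p-values are close to meaningless:

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1(n, rho):
    """Generate an autocorrelated AR(1) series with unit innovations."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.standard_normal()
    return x

n, rho, n_sims = 100, 0.9, 2000
rejections = 0
for _ in range(n_sims):
    x = ar1(n, rho)
    y = ar1(n, rho)              # generated independently: true slope is zero
    xc = x - x.mean()
    b = (xc @ y) / (xc @ xc)                 # OLS slope estimate
    resid = (y - y.mean()) - b * xc
    s2 = (resid @ resid) / (n - 2)
    se = np.sqrt(s2 / (xc @ xc))             # textbook standard error
    if abs(b / se) > 1.96:                   # nominal 5% two-sided test
        rejections += 1

rate = rejections / n_sims
print(f"false-rejection rate: {rate:.2f} (nominal: 0.05)")
```

Because both series are autocorrelated, the textbook standard error is badly wrong and the test rejects a true null a large fraction of the time instead of 5%. The basic statistical assumptions cannot simply be taken for granted.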

Invariance assumptions and economic theory (wonkish)

17 Apr, 2015 at 09:04 | Posted in Economics | Comments Off on Invariance assumptions and economic theory (wonkish)

Invariance assumptions need to be made in order to draw causal conclusions from non-experimental data: parameters are invariant to interventions, and so are errors or their distributions. Exogeneity is another concern. In a real example, as opposed to a hypothetical, real questions would have to be asked about these assumptions. Why are the equations “structural,” in the sense that the required invariance assumptions hold true? Applied papers seldom address such assumptions, or the narrower statistical assumptions: for instance, why are errors IID?

The tension here is worth considering. We want to use regression to draw causal inferences from non-experimental data. To do that, we need to know that certain parameters and certain distributions would remain invariant if we were to intervene. Invariance can seldom be demonstrated experimentally. If it could, we probably wouldn’t be discussing invariance assumptions. What then is the source of the knowledge?

“Economic theory” seems like a natural answer, but an incomplete one. Theory has to be anchored in reality. Sooner or later, invariance needs empirical demonstration, which is easier said than done.

The Coase Theorem

16 Apr, 2015 at 12:29 | Posted in Economics | 3 Comments

Examining the Coase Theorem relies on a critical analysis of economic theory. The fundamental shortcomings of the most developed theory of the market, general equilibrium theory, as well as the restrictions imposed by the use of partial equilibrium and cases of a bilateral monopoly, undermine the assertions of the Coase Theorem. In the case of a bilateral monopoly, this construct involves serious distributional problems, and the invariance component of the theorem is seriously called into question. In addition, it is possible that the negotiations process may stop when mutually beneficial transactions take place outside of the contract curve. In those cases, social efficiency in the restricted Pareto-optimum sense will not be the outcome.

Faith in the idea that markets allocate resources efficiently is severely shaken by the set of difficulties in general equilibrium theory discussed in this article. The shortcomings of general equilibrium theory in stability theory should alert anyone tempted by the Law and Economics (L&E) movement and its
applicability to fields of legal practice. The bottom line is that we do not have a theory showing how, if at all, markets reach equilibrium allocations. Because efficiency, in terms of Pareto-optimality, is an attribute only of equilibrium allocations, very serious negative implications exist for anyone claiming that markets allocate resources efficiently.

We have concentrated our critique of L&E based on the fact that economic theory is in a very sad state. Proponents of L&E seem to ignore this, appearing instead to believe that there exists somewhere a robust theoretical construct that satisfactorily explains how markets allocate resources efficiently — this article has shown such faith to be groundless. This should be enough to dismiss L&E as another example of the triumph of ideology over science. In addition, the extreme version of L&E transforms justice into a commodity and represents a disturbing backward movement in social thought. The critiques raised in this article should also suffice to call into question the idea that the main objective of legal systems is efficiency, and that efficiency is attained through the market system. There are no grounds to believe in the efficiency of the market system.

One final thought on the role of mathematics is important. In its development, economics as a discipline has been obsessed with the use of mathematical models to build a theory of competitive markets. The only function for the very awkward assumptions … was to allow the theoretician to have access to certain mathematical theorems. Functioning in this manner, economic theory has sacrificed the construction of relevant economic concepts for the sake of using mathematical tools. This is not how scientific discourse should advance, and the followers of L&E are probably not aware of this. In fact, they may have fallen victim to the illusion of scientific rigor conferred by the use, and abuse, of mathematics.

Alejandro Nadal

[For yours truly’s own take on the Coase Theorem and Law & Economics — in Swedish only, sorry — see here or my “Dr Pangloss, Coase och välfärdsteorins senare öden,” Zenit, 4/1996]

Confidence — the fairy that turned out to be a witch

16 Apr, 2015 at 10:28 | Posted in Economics | Comments Off on Confidence — the fairy that turned out to be a witch

Remember the old times? Here is a quote from ECB President Jean-Claude Trichet … on September 3rd, 2010:

“We encourage all countries to be absolutely determined to go back to a sustainable mode for their fiscal policies,” Trichet said, speaking after the ECB rate decision on Thursday. “Our message is the same for all, and we trust that it is absolutely decisive not only for each country individually, but for prosperity of all.”

“Not because it is an elementary recommendation to care for your sons and daughter and not overburden them, but because it is good for confidence, consumption and investment today”.

Well, think again. Here is the abstract of ECB Working Paper no 1770, March 2015:

“We explore how fiscal consolidations affect private sector confidence, a possible channel for the fiscal transmission that has received particular attention recently as a result of governments embarking on austerity trajectories in the aftermath of the crisis … The effects are stronger for revenue-based measures and when institutional arrangements, such as fiscal rules, are weak … Consumer confidence falls around announcements of consolidation measures, an effect driven by revenue-based measures. Moreover, the effects are most relevant for European countries with weak institutional arrangements, as measured by the tightness of fiscal rules or budgetary transparency.”

The confidence fairy seems to have turned into a confidence witch. One more victim of the crisis. But this one will not be missed.

Francesco Saraceno

Economists — arrogant and self-congratulatory autists

15 Apr, 2015 at 15:49 | Posted in Economics | 2 Comments

Ten years ago, a survey published in the Journal of Economic Perspectives found that 77 percent of the doctoral candidates in the leading American economics programs agreed or strongly agreed with the statement “economics is the most scientific of the social sciences.”

In the intervening decade, a massive economic crisis rocked the global economy, and most economists never saw it coming. Nevertheless, little has changed: A new paper from the same publication reveals how economists continue to believe that their science is superior to all other social sciences, such as political science, sociology, anthropology, etc. While there may be budding intentions to appeal to other disciplines in order to enrich their theories (especially psychology and neuroscience), the reality is that economists almost exclusively study—and cite—each other …

The world is still living with the effects of the most recent economic crisis, and the inability of economists to offer solutions with a significant degree of agreement shows how urgently their discipline needs to be disrupted by an injection of new ideas, methods, and assumptions about human behavior. Unfortunately, there are powerful obstacles to this disruption: elite control and lack of gender diversity …

Ten years ago, I suggested that economists would “be well advised to trade in their intellectual haughtiness for a more humble disposition.” That’s advice that has yet to be heeded.

Moisés Naím/The Atlantic

My new book is out

14 Apr, 2015 at 19:37 | Posted in Economics | 2 Comments

“A wonderful set of clearly written and highly informative essays by a scholar who is knowledgeable, critical and sharp enough to see how things really are in the discipline, and honest and brave enough to say how things are. A must read especially for those truly concerned and/or puzzled about the state of modern economics.”

Tony Lawson

Table of Contents
What is (wrong with) economic theory?
Capturing causality in economics and the limits of statistical inference
Microfoundations – spectacularly useless and positively harmful
Economics textbooks – anomalies and transmogrification of truth
Rational expectations – a fallacious foundation for macroeconomics
Neoliberalism and neoclassical economics
The limits of marginal productivity theory

About the author
Lars Pålsson Syll received a PhD in economic history in 1991 and a PhD in economics in 1997, both at Lund University, Sweden. Since 2004 he has been professor of social science at Malmö University, Sweden. His primary research areas have been in the philosophy and methodology of economics, theories of distributive justice, and critical realist social science. As philosopher of science and methodologist he is a critical realist and an outspoken opponent of all kinds of social constructivism and postmodern relativism. As social scientist and economist he is strongly influenced by John Maynard Keynes and Hyman Minsky. He is the author of Social Choice, Value and Exploitation: an Economic-Philosophical Critique (in Swedish, 1991), Utility Theory and Structural Analysis (1997), Economic Theory and Method: A Critical Realist Perspective (in Swedish, 2001), The Dismal Science (in Swedish, 2001), The History of Economic Theories (in Swedish, 4th ed., 2007), John Maynard Keynes (in Swedish, 2007), An Outline of the History of Economics (in Swedish, 2011), as well as numerous articles in scientific journals.

World Economics Association Books

Is there anything worth keeping in mainstream microeconomics?

14 Apr, 2015 at 12:48 | Posted in Economics | 1 Comment

The main reason why the teaching of microeconomics (or of “microfoundations” of macroeconomics) has been called “autistic” is because it is increasingly impossible to discuss real-world economic questions with microeconomists – and with almost all neoclassical theorists. They are trapped in their system, and don’t in fact care about the outside world any more. If you consult any microeconomic textbook, it is full of maths (e.g. Kreps or Mas-Colell, Whinston and Green) or of “tales” (e.g. Varian or Schotter), without real data (occasionally you find “examples”, or “applications”, with numerical examples – but they are purely fictitious, invented by the authors).

At first, French students got quite a lot of support from teachers and professors: hundreds of teachers signed petitions backing their movement – especially pleading for “pluralism” in teaching the different ways of approaching economics. But when the students proposed a precise program of studies … almost all teachers refused, considering that it was “too much” because “students must learn all these things, even with some mathematical details”. When you ask them “why?”, the answer usually goes something like this: “Well, even if we, personally, never use the kind of ‘theory’ or ‘tools’ taught in microeconomics courses … surely there are people who do ‘use’ and ‘apply’ them, even if it is in an ‘unrealistic’, or ‘excessive’ way”.

But when you ask those scholars who do “use these tools”, especially those who do a lot of econometrics with “representative agent” models, they answer (if you insist quite a bit): “OK, I agree with you that it is nonsense to represent the whole economy by the (intertemporal) choice of one agent – consumer and producer – or by a unique household that owns a unique firm; but if you don’t do that, you don’t do anything!”

Bernard Guerrien

Yes indeed — “you don’t do anything!”

Twenty years ago Phil Mirowski was invited to give a speech on themes from his book More Heat than Light at my economics department in Lund, Sweden. All the neoclassical professors were there. Their theories were totally mangled and no one — absolutely no one — had anything to say even remotely reminiscent of a defense. At a loss, one of them finally asked in total desperation: “But what shall we do then?”

Yes indeed — what shall they do? The emperor turned out to be naked.

[h/t Edward Fullbrook]

Does big government help or hurt?

14 Apr, 2015 at 09:44 | Posted in Economics | Comments Off on Does big government help or hurt?


Mastering ‘metrics

13 Apr, 2015 at 14:41 | Posted in Economics | 4 Comments

In their new book, Mastering ‘Metrics: The Path from Cause to Effect, Joshua D. Angrist and Jörn-Steffen Pischke write:

Our first line of attack on the causality problem is a randomized experiment, often called a randomized trial. In a randomized trial, researchers change the causal variables of interest … for a group selected using something like a coin toss. By changing circumstances randomly, we make it highly likely that the variable of interest is unrelated to the many other factors determining the outcomes we want to study. Random assignment isn’t the same as holding everything else fixed, but it has the same effect. Random manipulation makes other things equal hold on average across the groups that did and did not experience manipulation. As we explain … ‘on average’ is usually good enough.

Angrist and Pischke may “dream of the trials we’d like to do” and consider “the notion of an ideal experiment” something that “disciplines our approach to econometric research,” but to maintain that ‘on average’ is “usually good enough” is an allegation that in my view is rather unwarranted, and for many reasons.

First of all, it amounts to nothing but hand-waving to assume simpliciter, without argument, that it is tenable to treat social agents and relations as homogeneous and interchangeable entities.

Randomization basically allows the econometrician to treat the population as consisting of interchangeable and homogeneous groups (‘treatment’ and ‘control’). The regression models one arrives at by using randomized trials tell us the average effect that variations in variable X have on the outcome variable Y, without having to explicitly control for the effects of other explanatory variables R, S, T, etc. Everything is assumed to be essentially equal except the values taken by variable X.

In a usual regression context one would apply an ordinary least squares estimator (OLS) in trying to get an unbiased and consistent estimate:

Y = α + βX + ε,

where α is a constant intercept, β a constant “structural” causal effect and ε an error term.

The problem here is that although we may get an estimate of the “true” average causal effect, this may “mask” important heterogeneous effects of a causal nature. Although we get the right answer — an average causal effect of 0 — those who are “treated” (X = 1) may have causal effects equal to −100 and those “not treated” (X = 0) may have causal effects equal to 100. Contemplating being treated or not, most people would probably be interested in knowing about this underlying heterogeneity and would not consider the OLS average effect particularly enlightening.
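A small simulation (my own illustrative numbers) makes this concrete: individual causal effects of +100 and −100 average out to zero, and zero is exactly what the regression on the treatment dummy reports:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Two equally large subgroups with opposite individual causal effects.
effect = np.where(np.arange(n) < n // 2, 100.0, -100.0)
x = rng.integers(0, 2, size=n)               # randomized treatment dummy
y = effect * x + rng.standard_normal(n)      # outcome

def diff_in_means(y, x):
    """OLS slope of y on a binary x is just the treated/control mean gap."""
    return y[x == 1].mean() - y[x == 0].mean()

ate_hat = diff_in_means(y, x)                       # close to 0
ate_pos = diff_in_means(y[: n // 2], x[: n // 2])   # close to +100
ate_neg = diff_in_means(y[n // 2 :], x[n // 2 :])   # close to -100
print(f"overall: {ate_hat:+.1f}  subgroups: {ate_pos:+.1f}, {ate_neg:+.1f}")
```

The overall regression is not wrong about the average, but the average is the least interesting thing to know here: the same zero is compatible with a treatment that helps half the population enormously and harms the other half just as much.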

Limiting model assumptions in economic science always have to be closely examined. If the mechanisms or causes that we isolate and handle in our models are to be stable in the sense that they do not change when we “export” them to our “target systems”, we have to be able to show that they do not hold only under ceteris paribus conditions — for then they would a fortiori be of limited value for understanding, explaining or predicting real economic systems.

Real world social systems are not governed by stable causal mechanisms or capacities. The kinds of “laws” and relations that econometrics has established are laws and relations about entities in models that presuppose causal mechanisms being atomistic and additive. When causal mechanisms operate in real world social target systems, they do so only in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so (as a rule) only because we engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately that also makes most of the achievements of econometrics – as most of contemporary endeavours of mainstream economic theoretical modeling – rather useless.

Remember that a model is not the truth. It is a lie to help you get your point across. And in the case of modeling economic risk, your model is a lie about others, who are probably lying themselves. And what’s worse than a simple lie? A complicated lie.

Sam L. Savage The Flaw of Averages

When Joshua Angrist and Jörn-Steffen Pischke in an earlier article of theirs [“The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics,” Journal of Economic Perspectives, 2010] say that

anyone who makes a living out of data analysis probably believes that heterogeneity is limited enough that the well-understood past can be informative about the future

I really think they underestimate the heterogeneity problem. It does not just turn up as an external validity problem when trying to “export” regression results to different times or different target populations. It is also often an internal problem to the millions of regression estimates that economists produce every year.

But when the randomization is purposeful, a whole new set of issues arises — experimental contamination — which is much more serious with human subjects in a social system than with chemicals mixed in beakers … Anyone who designs an experiment in economics would do well to anticipate the inevitable barrage of questions regarding the valid transference of things learned in the lab (one value of z) into the real world (a different value of z) …

Absent observation of the interactive compounding effects z, what is estimated is some kind of average treatment effect which is called by Imbens and Angrist (1994) a “Local Average Treatment Effect,” which is a little like the lawyer who explained that when he was a young man he lost many cases he should have won but as he grew older he won many that he should have lost, so that on the average justice was done. In other words, if you act as if the treatment effect is a random variable by substituting βt for β0 + β′zt, the notation inappropriately relieves you of the heavy burden of considering what are the interactive confounders and finding some way to measure them …

If little thought has gone into identifying these possible confounders, it seems probable that little thought will be given to the limited applicability of the results in other settings.

Ed Leamer

Evidence-based theories and policies are highly valued nowadays. Randomization is supposed to control for bias from unknown confounders. The received opinion is that evidence based on randomized experiments therefore is the best.

More and more economists have also lately come to advocate randomization as the principal method for ensuring valid causal inferences.

I would however rather argue that randomization, just as econometrics, promises more than it can deliver, basically because it requires assumptions that in practice are not possible to maintain.

Especially when it comes to questions of causality, randomization is nowadays considered some kind of “gold standard”. Everything has to be evidence-based, and the evidence has to come from randomized experiments.

But just as econometrics, randomization is basically a deductive method. Given the assumptions (such as manipulability, transitivity, separability, additivity, linearity, etc.) these methods deliver deductive inferences. The problem, of course, is that we will never completely know when the assumptions are right. And although randomization may contribute to controlling for confounding, it does not guarantee it, since genuine randomness presupposes infinite experimentation and we know all real experimentation is finite. And even if randomization may help to establish average causal effects, it says nothing of individual effects unless homogeneity is added to the list of assumptions. Real target systems are seldom epistemically isomorphic to our axiomatic-deductive models/systems, and even if they were, we still have to argue for the external validity of the conclusions reached from within these epistemically convenient models/systems. Causal evidence generated by randomization procedures may be valid in “closed” models, but what we usually are interested in is causal evidence in the real target system we happen to live in.
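That randomization controls confounding only ‘on average’, and not in any single finite trial, can be checked directly. In this sketch (the trial size and the half-standard-deviation threshold are my own illustrative choices), a sizable share of small randomized trials end up with treatment arms that differ badly on a baseline covariate:

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_trials = 20, 5000          # one small trial, re-randomized many times

big_imbalance = 0
for _ in range(n_trials):
    w = rng.standard_normal(n)          # a baseline covariate (confounder)
    arm = rng.permutation(n) < n // 2   # random assignment, 10 per arm
    gap = abs(w[arm].mean() - w[~arm].mean())
    if gap > 0.5:                       # arms more than half an SD apart
        big_imbalance += 1

share = big_imbalance / n_trials
print(f"trials with a badly imbalanced covariate: {share:.0%}")
```

In roughly a quarter of these twenty-subject trials the two arms differ by more than half a standard deviation on the covariate before any treatment is given. Balance is a property of the infinite sequence of hypothetical replications, not of the one finite experiment actually run.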

When does a conclusion established in population X hold for target population Y? Only under very restrictive conditions!

Angrist’s and Pischke’s “ideally controlled experiments” tell us with certainty what causes what effects — but only given the right “closures”. Making appropriate extrapolations from (ideal, accidental, natural or quasi) experiments to different settings, populations or target systems, is not easy. “It works there” is no evidence for “it will work here”. Causes deduced in an experimental setting still have to show that they come with an export-warrant to the target population/system. The causal background assumptions made have to be justified, and without licenses to export, the value of “rigorous” and “precise” methods — and ‘on-average-knowledge’ — is despairingly small.

The cleavage that counts

13 Apr, 2015 at 12:55 | Posted in Economics | 3 Comments

On the one side were those who believed that the existing economic system is in the long run self-adjusting, though with creaks and groans and jerks, and interrupted by time-lags, outside interference and mistakes … These economists did not, of course, believe that the system is automatic or immediately self-adjusting, but they did maintain that it has an inherent tendency towards self-adjustment, if it is not interfered with, and if the action of change and chance is not too rapid.

John Maynard KeynesThose on the other side of the gulf, however, rejected the idea that the existing economic system is, in any significant sense, self-adjusting. They believed that the failure of effective demand to reach the full potentialities of supply, in spite of human psychological demand being immensely far from satisfied for the vast majority of individuals, is due to much more fundamental causes …

The gulf between these two schools of thought is deeper, I believe, than most of those on either side of it realize. On which side does the essential truth lie?

The strength of the self-adjusting school depends on its having behind it almost the whole body of organized economic thinking and doctrine of the last hundred years. This is a formidable power … It has vast prestige and a more far-reaching influence than is obvious. For it lies behind the education and the habitual modes of thought, not only of economists but of bankers and business men and civil servants and politicians of all parties …

Now I range myself with the heretics. I believe their flair and their instinct move them towards the right conclusion. But I was brought up in the citadel and I recognize its power and might … For me, therefore, it is impossible to rest satisfied until I can put my finger on the flaw in the part of the orthodox reasoning that leads to the conclusions that for various reasons seem to me to be inacceptable. I believe that I am on my way to do so. There is, I am convinced, a fatal flaw in that part of the orthodox reasoning that deals with the theory of what determines the level of effective demand and the volume of aggregate employment …

John Maynard Keynes (1934)

Balance sheet recessions — a massive case of fallacy of composition problems

11 Apr, 2015 at 17:43 | Posted in Economics | 3 Comments


The way I understand Richard Koo, he maintains that interest rates and monetary policy don’t really matter when we’re in a balance sheet recession where, following a nationwide collapse in asset prices, more or less every company and household finds itself carrying excess debt and having to pay it down. The number of willing private borrowers is strongly reduced – even when interest rates are at zero – and as a result of this “debt minimization” monetary policy by itself loses all power. To get things going, the government has to run a fiscal deficit; the increased borrowing produces an increase in the money supply and thereby makes monetary policy work.

Paul Krugman had a post up earlier this year, basically maintaining that this argument can’t be right, since if there are some people – debtors – in the balance sheet recession that pay down their debt, there also have to be other people – creditors – that a fortiori strengthen their balance sheets, and who are susceptible to being influenced by what happens to interest rates and inflation.

To be honest, I have some problems seeing the great gulf between them – at least on the level of general principles – that one is led to believe ought to be there, considering all the heated discussion there has been on this issue between them for a couple of years now.

It’s true, as Koo says, that for firms trying to minimize debt, no injections whatsoever by the central bank will generate inflationary impulses. But for others – and probably not even in the worst balance sheet recession imaginable are all firms debt-constrained – there might be room for some (limited) inflation generation by monetary means. So ultimately it looks like a difference in degree rather than in kind. To Koo, monetary policy by itself has no power, and we instead have to put our trust in fiscal policy. Krugman, on the other hand, says that some private actors might not be balance sheet-constrained and therefore are susceptible to (inflationary) monetary policy, and that fiscal policy can work anyway. And more importantly – both definitely agree that increased liquidity will not always and everywhere get the economy out of a slump, and that neither fiscal nor monetary policy by itself is capable of solving the problems created in a balance sheet recession.

Market fundamentalist ideologies

11 Apr, 2015 at 17:19 | Posted in Economics | Comments Off on Market fundamentalist ideologies


On the irrelevance of general equilibrium theory

11 Apr, 2015 at 11:18 | Posted in Economics | 1 Comment

The general equilibrium approach starts with individual decisions. It assumes that trades are voluntary and that there exist mutually advantageous opportunities of exchange. Up to here, everyone can agree. The problem lies in the next step. At this point, let us follow David Kreps’s (1990) reasoning in his A Course in Microeconomic Theory. Kreps asks the reader to “imagine consumers wandering around a large market square” with different kinds of food in their bags. When two of them meet, “they examine what each has to offer, to see if they can arrange a mutually agreeable trade. To be precise, we might imagine that at every chance meeting of this sort, the two flip a coin and depending on the outcome, one is allowed to propose an exchange, which the other may either accept or reject. The rule is that you can’t eat until you leave the market square, so consumers wait until they are satisfied with what they possess” (196).

Kreps “imagines” other models of this kind. In each of them by the word “market” he means a “market square,” and he introduces rules (“flip a coin,” “nobody can leave before the end of the process”). He is aware that “exploration of more realistic models of markets is in relative infancy.” And when he speaks of “more realistic” models, he means more realistic with respect to perfect competition.

But the problem with perfect competition is not its “lack” of realism; it is its “irrelevancy” as it surreptitiously assumes an entity that gives prices (present and future) to price taking agents, that collects information about supplies and demands, adds these up, moves prices up and down until it finds their equilibrium value. Textbooks do not tell this story; they assume that a deus ex machina called the “market” does the job.

Sorry, but we do not want to teach these absurdities. In the real world, people trade with each other, not with “the market.” And some of them, at least, are price makers. To make things worse, textbooks generally allude to some mysterious “invisible hand” that allocates goods optimally. They wrongly attribute this idea to Adam Smith and make use of his authority so that students accept this magical way of thinking as a kind of proof.

Perfect competition in the general equilibrium mode is perhaps an interesting model for describing a central planner who is trying to find an efficient allocation of resources using prices as signals that guide price taker households and firms. But students should be told that the course they follow—on “general competitive analysis”—is irrelevant for understanding market economies.

Emmanuelle Benicourt & Bernard Guerrien

I can’t but agree with these two eminent French mathematical economists. You could, of course, as Brad DeLong has asserted, consider modern neoclassical economics to be in fine shape “as long as it is understood as the ideological and substantive legitimating doctrine of the political theory of possessive individualism” and turn a blind eye to all the caveats to its general equilibrium models — markets must be in equilibrium and competitive, the goods traded must be excludable and non-rival, etc., etc. The list of caveats soon becomes impressively large — and not very much value is left of “modern neoclassical economics” if you ask me …

Still — almost a century and a half after Léon Walras founded neoclassical general equilibrium theory — "modern neoclassical economics" hasn't been able to show that markets move economies to equilibria.

We do know that — under very restrictive assumptions — equilibria do exist, are unique, and are Pareto-efficient. One has to ask oneself, however: what good does that do?

As long as we cannot show, except under exceedingly special assumptions, that there are convincing reasons to suppose there are forces which lead economies to equilibria, the value of general equilibrium theory is negligible. As long as we cannot really demonstrate that there are forces operating — under reasonable, relevant and at least mildly realistic conditions — to move markets towards equilibria, there cannot really be any sustainable reason for anyone to take any interest in or pay any attention to this theory.

A stability that can only be proved by assuming “Santa Claus” conditions is of no avail. Most people do not believe in Santa Claus anymore. And for good reasons. Santa Claus is for kids, and general equilibrium economists ought to grow up.
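To see what such a "Santa Claus" stability result looks like, a few lines of tâtonnement suffice. Everything in the sketch below is an illustrative assumption of mine: the excess-demand function is deliberately well-behaved and downward-sloping, i.e. exactly the kind of special condition under which the Walrasian auctioneer's price adjustment can be proved to converge. For general excess-demand functions (Sonnenschein-Mantel-Debreu) no such convergence is guaranteed.

```python
# A Walrasian auctioneer's tatonnement for one relative price
# (good 2 is the numeraire).  The excess-demand function is an
# illustrative assumption: linear and downward-sloping, so the
# adjustment process is a contraction and must converge.

def excess_demand(p):
    return 10.0 - 2.0 * p   # assumed; the market clears at p* = 5

p = 1.0                          # arbitrary starting price
for _ in range(1000):
    p += 0.1 * excess_demand(p)  # raise p when demand exceeds supply

print(round(p, 6))  # 5.0
```

The convergence here is a property of the assumed excess-demand function, not of markets: change the function and the auctioneer may cycle or diverge, which is precisely the point about "Santa Claus" conditions.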

Continuing to model a world full of agents behaving as economists — “often wrong, but never uncertain” — and still not being able to show that the system under reasonable assumptions converges to equilibrium (or simply assume the problem away) is a gross misallocation of intellectual resources and time.

The Bernanke-Summers imbroglio

10 Apr, 2015 at 18:24 | Posted in Economics | 6 Comments

As no one interested in macroeconomics has failed to notice, Ben Bernanke is having a debate with Larry Summers on what’s behind the slow recovery of growth rates since the financial crisis of 2007.

To Bernanke it’s basically a question of a savings glut.

To Summers it’s basically a question of a secular decline in the level of investment.

To me the debate is actually a non-starter, since they both rely on a loanable funds theory and a Wicksellian notion of a “natural” rate of interest — ideas that have been known to be dead wrong for at least 80 years …

Let’s start with the Wicksellian connection and consider what Keynes wrote in General Theory:

In my Treatise on Money I defined what purported to be a unique rate of interest, which I called the natural rate of interest, namely, the rate of interest which, in the terminology of my Treatise, preserved equality between the rate of saving (as there defined) and the rate of investment. I believed this to be a development and clarification of Wicksell’s ‘natural rate of interest’, which was, according to him, the rate which would preserve the stability of some, not quite clearly specified, price-level.

I had, however, overlooked the fact that in any given society there is, on this definition, a different natural rate of interest for each hypothetical level of employment. And, similarly, for every rate of interest there is a level of employment for which that rate is the ‘natural’ rate, in the sense that the system will be in equilibrium with that rate of interest and that level of employment. Thus it was a mistake to speak of the natural rate of interest or to suggest that the above definition would yield a unique value for the rate of interest irrespective of the level of employment. I had not then understood that, in certain conditions, the system could be in equilibrium with less than full employment.

I am now no longer of the opinion that the [Wicksellian] concept of a ‘natural’ rate of interest, which previously seemed to me a most promising idea, has anything very useful or significant to contribute to our analysis. It is merely the rate of interest which will preserve the status quo; and, in general, we have no predominant interest in the status quo as such.

And when it comes to the loanable funds theory, this is really in many regards nothing but an approach where the ruling rate of interest in society is — pure and simple — conceived as nothing else than the price of loans or credit, determined by supply and demand — as Bertil Ohlin put it — “in the same way as the price of eggs and strawberries on a village market.”

In the traditional loanable funds theory — as presented in mainstream macroeconomics textbooks — the amount of loans and credit available for financing investment is constrained by how much saving is available. Saving is the supply of loanable funds, investment is the demand for loanable funds, and the latter is assumed to be negatively related to the interest rate. Lowering households' consumption means increasing saving, which — via a lower interest rate — is supposed to raise investment.
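The textbook mechanism is simple enough to sketch as a toy model. Everything here, the linear schedules and the numbers alike, is an illustrative assumption of mine, not anything drawn from a particular textbook: saving rises with the interest rate, investment falls with it, and the "market for loanable funds" is assumed to clear where the two schedules cross.

```python
# Toy sketch of the textbook loanable-funds story (numbers illustrative):
# saving S(r) rises with the interest rate r, investment I(r) falls with
# it, and the market is assumed to clear where the schedules intersect.

def saving(r):
    """Supply of loanable funds (assumed linear, upward-sloping)."""
    return 100 + 400 * r

def investment(r):
    """Demand for loanable funds (assumed linear, downward-sloping)."""
    return 180 - 600 * r

# Bisect on r in [0, 1]: excess demand I(r) - S(r) falls as r rises.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if investment(mid) > saving(mid):  # excess demand: raise the rate
        lo = mid
    else:
        hi = mid
r_star = (lo + hi) / 2

print(round(r_star, 3), round(saving(r_star), 1))  # 0.08 132.0
```

With these numbers the bisection settles at r* = 0.08, where saving and investment both equal 132: frictionless market-clearing of precisely the kind the rest of this post disputes.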

From a more Post-Keynesian-Minskyite point of view the problems with the standard presentation and formalization of the loanable funds theory are quite obvious.

As James Meade noted already decades ago, the causal story told to explicate the accounting identities used gives the picture of "a dog called saving wagged its tail labelled investment." In Keynes's view — later confirmed over and over again by empirical research — it is not so much the interest rate at which firms can borrow that causally determines the amount of investment undertaken, but rather their internal funds, profit expectations and capacity utilization.

As is typical of most mainstream macroeconomic formalizations and models, there is precious little mention of real-world phenomena — e.g. real money, credit rationing and the existence of multiple interest rates — in the loanable funds theory. Loanable funds theory essentially reduces modern monetary economies to something akin to barter systems — something they definitely are not. As emphasized especially by Minsky, to understand and explain how much investment/lending/crediting is going on in an economy, it is much more important to focus on the workings of financial markets than to stare at accounting identities like S = Y – C – G. The problems we meet in modern markets today have more to do with inadequate financial institutions than with the size of loanable-funds savings.
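To see why such an identity cannot carry causal weight, it helps to check it with numbers. The figures below are purely illustrative; the point is that S = Y – C – G (and hence S = I in a closed economy) holds ex post by construction, at any interest rate whatsoever.

```python
# Ex-post national accounts for a closed economy (numbers purely
# illustrative).  The identity S = Y - C - G holds by construction,
# whatever the interest rate or the state of financial institutions,
# which is why it cannot by itself carry a causal story about how
# investment gets financed.

Y = 1000   # income/output
C = 650    # household consumption
G = 200    # government purchases
I = 150    # investment actually undertaken (so that Y = C + I + G)

S = Y - C - G   # national saving, by definition

assert S == I   # true ex post, at ANY interest rate
print(S, I)     # 150 150
```

The assert never fires: the equality is bookkeeping, which is exactly why staring at it says nothing about the workings of the financial markets that actually determine lending.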

In the loanable funds theory the interest rate is endogenized by assuming that Central Banks can (try to) adjust it in response to an eventual output gap. This, of course, is essentially nothing but an assumption of Walras' law being valid and applicable, and that, a fortiori, the attainment of equilibrium is secured by the Central Banks' interest rate adjustments. From a realist Keynes-Minsky point of view, this can't be considered anything other than a belief resting on nothing but sheer hope. [Not to mention that more and more Central Banks actually choose not to follow Taylor-like policy rules.] The age-old belief that Central Banks control the money supply has more and more come to be questioned and replaced by an "endogenous" money view, and I think the same will happen to the view that Central Banks determine "the" rate of interest.

A further problem in the traditional loanable funds theory is that it assumes that saving and investment can be treated as independent entities. To Keynes this was seriously wrong:

The classical theory of the rate of interest [the loanable funds theory] seems to suppose that, if the demand curve for capital shifts or if the curve relating the rate of interest to the amounts saved out of a given income shifts or if both these curves shift, the new rate of interest will be given by the point of intersection of the new positions of the two curves. But this is a nonsense theory. For the assumption that income is constant is inconsistent with the assumption that these two curves can shift independently of one another. If either of them shift, then, in general, income will change; with the result that the whole schematism based on the assumption of a given income breaks down … In truth, the classical theory has not been alive to the relevance of changes in the level of income or to the possibility of the level of income being actually a function of the rate of investment.

There are always (at least) two parties to an economic transaction. Savers and investors have different liquidity preferences and face different choices — and their interactions usually take place only as intermediated by financial institutions. This, importantly, also means that there is no "direct and immediate" automatic interest-rate mechanism at work in modern monetary economies. What this ultimately boils down to is — again — that what happens at the microeconomic level, both in and out of equilibrium, is not always compatible with the macroeconomic outcome. The fallacy of composition (the "atomistic fallacy" of Keynes) has many faces — loanable funds is one of them.

Contrary to the loanable funds theory, finance in the world of Keynes and Minsky precedes investment and saving. Highlighting the loanable funds fallacy, Keynes wrote in “The Process of Capital Formation” (1939):

Increased investment will always be accompanied by increased saving, but it can never be preceded by it. Dishoarding and credit expansion provides not an alternative to increased saving, but a necessary preparation for it. It is the parent, not the twin, of increased saving.

So, by way of conclusion: what I think both Bernanke and Summers "forget" when they hold to the loanable funds theory and the Wicksellian concept of a "natural" rate of interest is the Keynes-Minsky wisdom of truly acknowledging that finance — in all its different shapes — has its own dimension, and, if taken seriously, its effects must modify the whole theoretical system and not just be added as an unsystematic appendage. Finance is fundamental to our understanding of modern economies, and acting like the baker's apprentice who, having forgotten to add yeast to the dough, throws it into the oven afterwards, simply isn't enough.

I may be too bold, but I'm willing to take the risk, and so recommend that both Bernanke and Summers make the following addition to their reading lists …

It should be emphasized that the equality between savings and investment … will be valid under all circumstances. In particular, it will be independent of the level of the rate of interest which was customarily considered in economic theory to be the factor equilibrating the demand for and supply of new capital. In the present conception investment, once carried out, automatically provides the savings necessary to finance it. Indeed, in our simplified model, profits in a given period are the direct outcome of capitalists' consumption and investment in that period. If investment increases by a certain amount, savings out of profits are pro tanto higher …

One important consequence of the above is that the rate of interest cannot be determined by the demand for and supply of new capital because investment ‘finances itself.’
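Kalecki's argument in the passage above can be put in simple accounting terms. In his simplified model (closed economy, no government, workers spending all their wages), profits equal capitalists' consumption plus investment, so an extra unit of investment shows up one-for-one as extra saving out of profits. The numbers below are mine, chosen only for illustration.

```python
# Kalecki's profit identity in the simplified model quoted above:
# profits P = capitalists' consumption Cp + investment I.

def profits(cap_consumption, investment):
    return cap_consumption + investment

Cp = 40            # capitalists' consumption (held fixed)
I1, I2 = 100, 120  # investment rises by 20

# Saving out of profits = profits not consumed by capitalists.
S1 = profits(Cp, I1) - Cp
S2 = profits(Cp, I2) - Cp

print(S1, S2, S2 - S1)  # 100 120 20
```

If investment rises by 20, saving out of profits rises by exactly 20: the "investment finances itself" result of the quote, with no interest rate anywhere in sight.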

Nicholas Kaldor on putting the cart before the horse fallacy

10 Apr, 2015 at 15:15 | Posted in Economics | 2 Comments

Foreseeing the future is difficult. But sometimes it seems as though some people get it terribly right …

Some day the nations of Europe may be ready to merge their national identities and create a new European Union – the United States of Europe. If and when they do, a European Government will take over all the functions which the Federal government now provides in the U.S., or in Canada or Australia. This will involve the creation of a “full economic and monetary union”. But it is a dangerous error to believe that monetary and economic union can precede a political union or that it will act (in the words of the Werner report) “as a leaven for the evolvement of a political union which in the long run it will in any case be unable to do without”. For if the creation of a monetary union and Community control over national budgets generates pressures which lead to a breakdown of the whole system it will prevent the development of a political union, not promote it.

Nicholas Kaldor (1971)
