Friedman’s response to Romer & Romer

29 February, 2016 at 18:27 | Posted in Economics | 3 Comments

As yours truly wrote the other day, reading the different reactions, critiques and ‘analyses’ of Gerald Friedman’s calculations on the long-term effects of implementing the Sanders program, the whole issue seems basically to boil down to whether the Verdoorn law is operative or not.

In Friedman’s response to Romer & Romer today this is made even clearer than in the original Friedman analysis:

The Romers … would acknowledge that following a negative shock, government stimulus spending may accelerate the recovery somewhat … They deny, however, that stimulus spending could change the permanent level of output … Like mosquitos on an otherwise delightful summer afternoon, slow growth is unfortunate but there is little that can safely be done about it.

Or maybe we can find safe pesticides. Here I agree with John Maynard Keynes that the economy can have a low-employment equilibrium because of a lack of effective demand, and I agree with Nicholas Kaldor and Petrus Verdoorn that productivity and the growth rate of capacity can be increased by policies that push the economy to a higher level of employment … I see an economy at low-employment equilibrium where discouraged workers have abandoned the labor market and firms have had little incentive to innovate or to raise productivity. In this situation, additional stimulus can not only temporarily raise output but, by priming the pump and encouraging additional private spending and investment, it can push the economy upwards towards capacity. And, beyond that, at higher levels of employment more people will look for work, more businesses will invest, and employment will grow faster and productivity will rise, pushing up the growth rate in capacity. That is why I see lasting effects from a government stimulus when, as now, the economy is in a low-employment equilibrium.


Is 0.999 … = 1? (wonkish)

29 February, 2016 at 12:52 | Posted in Statistics & Econometrics | 8 Comments

What is 0.999 …, really? Is it 1? Or is it some number infinitesimally less than 1?

The right answer is to unmask the question. What is 0.999 …, really? It appears to refer to a kind of sum:

0.9 + 0.09 + 0.009 + 0.0009 + …

But what does that mean? That pesky ellipsis is the real problem. There can be no controversy about what it means to add up two, or three, or a hundred numbers. But infinitely many? That’s a different story. In the real world, you can never have infinitely many heaps. What’s the numerical value of an infinite sum? It doesn’t have one — until we give it one. That was the great innovation of Augustin-Louis Cauchy, who introduced the notion of limit into calculus in the 1820s.

The British number theorist G. H. Hardy … explains it best: “It is broadly true to say that mathematicians before Cauchy asked not, ‘How shall we define 1 – 1 + 1 – 1 + …?’ but ‘What is 1 – 1 + 1 – 1 + …?’”

No matter how tight a cordon we draw around the number 1, the sum will eventually, after some finite number of steps, penetrate it, and never leave. Under those circumstances, Cauchy said, we should simply define the value of the infinite sum to be 1.
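Cauchy’s definition can be made concrete in a few lines. The sketch below (purely illustrative) shows that for any tolerance you pick, the partial sums of 0.9 + 0.09 + 0.009 + … eventually come within that tolerance of 1 — and stay there:

```python
# Partial sums of 0.9 + 0.09 + 0.009 + ... approach 1: for any tolerance
# eps > 0, from some finite point on every partial sum lies within eps of 1.

def partial_sum(n):
    """Sum of the first n terms of the series 9/10 + 9/100 + 9/1000 + ..."""
    return sum(9 / 10 ** k for k in range(1, n + 1))

for eps in (1e-3, 1e-6, 1e-9):
    n = 1
    while abs(1 - partial_sum(n)) >= eps:
        n += 1
    print(f"within {eps} of 1 after {n} terms")
```

No finite partial sum ever equals 1; the value 1 is assigned to the infinite sum by Cauchy’s limit definition, exactly as the passage describes.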

I have no problem with solving problems in mathematics by ‘defining’ them away. But how about the real world? Maybe that ought to be a question to ponder even for economists all too fond of uncritically following the mathematical way when applying their mathematical models to the real world, where indeed “you can never have infinitely many heaps” …

In econometrics we often run into the ‘Cauchy logic’ — the data is treated as if it were from a larger population, a ‘superpopulation’ where repeated realizations of the data are imagined. Just imagine there could be more worlds than the one we live in and the problem is fixed …

Accepting Haavelmo’s domain of probability theory and sample space of infinite populations – just as Fisher’s “hypothetical infinite population, of which the actual data are regarded as constituting a random sample”, von Mises’s “collective” or Gibbs’s “ensemble” – also implies that judgments are made on the basis of observations that are actually never made!

Infinitely repeated trials or samplings never take place in the real world. So that cannot be a sound inductive basis for a science with aspirations of explaining real-world socio-economic processes, structures or events. It’s — just as the Cauchy mathematical logic of ‘defining’ away problems — not tenable.

In social sciences — including economics — it’s always wise to ponder C. S. Peirce’s remark that universes are not as common as peanuts …

Transitivity — just another questionable assumption

29 February, 2016 at 10:33 | Posted in Economics | 1 Comment

My doctor once recommended I take niacin for the sake of my heart. Yours probably has too, unless you’re a teenager or a marathon runner or a member of some other metabolically privileged caste. Here’s the argument: Consumption of niacin is correlated with higher levels of HDL, or “good cholesterol,” and high HDL is correlated with lower risk of “cardiovascular events.” If you’re not a native speaker of medicalese, that means people with plenty of good cholesterol are less likely on average to clutch their hearts and keel over dead.

But a large-scale trial carried out by the National Heart, Lung, and Blood Institute was halted in 2011, a year and a half before the scheduled finish, because the results were so weak it didn’t seem worth it to continue. Patients who got niacin did indeed have higher HDL levels, but they had just as many heart attacks and strokes as everybody else.

How can this be? Because correlation isn’t transitive. That is: Just because niacin is correlated with HDL, and high HDL is correlated with low risk of heart disease, you can’t conclude that niacin is correlated with low risk of heart disease.

Transitive relations are ones like “weighs more than.” If I weigh more than my son and my son weighs more than my daughter, it’s an absolute certainty that I weigh more than my daughter. “Lives in the same city as” is transitive, too—if I live in the same city as Bill, who lives in the same city as Bob, then I live in the same city as Bob.

But many of the most interesting relations we find in the world of data aren’t transitive. Correlation, for instance, is more like “blood relation.” I’m related to my son, who’s related to my wife, but my wife and I aren’t blood relatives. In fact, it’s not a terrible idea to think of correlated variables as “sharing part of their DNA.” Suppose I run a boutique money management firm with just three investors, Laura, Sara, and Tim. Their stock positions are pretty simple: Laura’s fund is split 50–50 between Facebook and Google, Tim’s is one-half General Motors and one-half Honda, and Sara, poised between old economy and new, goes one-half Honda, one-half Facebook. It’s pretty obvious that Laura’s returns will be positively correlated with Sara’s; they have half their portfolio in common. And the correlation between Sara’s returns and Tim’s will be equally strong. But there’s no reason (except insofar as the whole stock market tends to move in concert) to think Tim’s performance has to be correlated with Laura’s. Those two funds are like the parents, each contributing one-half of their “genetic material” to form Sara’s hybrid fund.

Jordan Ellenberg
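Ellenberg’s portfolio arithmetic can be checked directly. The sketch below simulates four independent, unit-variance return series (purely hypothetical numbers, ignoring any market-wide factor) and computes the pairwise correlations of the three funds: the two overlapping pairs correlate strongly, the non-overlapping pair does not.

```python
import random

random.seed(1)

# Hypothetical independent daily returns for the four stocks in the example.
n = 10_000
fb   = [random.gauss(0, 1) for _ in range(n)]
goog = [random.gauss(0, 1) for _ in range(n)]
gm   = [random.gauss(0, 1) for _ in range(n)]
hon  = [random.gauss(0, 1) for _ in range(n)]

def corr(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# The three portfolios: each is a 50-50 split of two stocks.
laura = [(f + g) / 2 for f, g in zip(fb, goog)]   # Facebook / Google
tim   = [(m + h) / 2 for m, h in zip(gm, hon)]    # GM / Honda
sara  = [(h + f) / 2 for h, f in zip(hon, fb)]    # Honda / Facebook

print(corr(laura, sara))  # about 0.5 — half the holdings overlap
print(corr(sara, tim))    # about 0.5 — half the holdings overlap
print(corr(laura, tim))   # about 0 — no shared "genetic material"
```

So correlation with a common third variable (Sara’s fund) tells us nothing about the correlation between Laura’s and Tim’s funds — the non-transitivity in miniature.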

Statistics — a question of life and death

29 February, 2016 at 10:19 | Posted in Statistics & Econometrics | 1 Comment

In 1997, Christopher, the eleven-week-old child of a young lawyer named Sally Clark, died in his sleep: an apparent case of Sudden Infant Death Syndrome (SIDS) … One year later, Sally’s second child, Harry, also died, aged just eight weeks. Sally was arrested and accused of killing the children. She was convicted of murdering them, and in 1999 was given a life sentence …

Now … I want to show how a simple mistaken assumption led to incorrect probabilities.

In this case the mistaken evidence came from Sir Roy Meadow, a paediatrician. Despite not being an expert statistician or probabilist, he felt able to make a statement about probabilities … He asserted that the probability of two SIDS deaths in a family like Sally Clark’s was 1 in 73 million. A probability as small as this suggests we might apply Borel’s law: we shouldn’t expect to see an improbable event …

Unfortunately, however, Meadow’s 1 in 73 million probability is based on a crucial assumption: that the deaths are independent; that one such death in a family does not make it more or less likely that there will be another …

Now … that assumption does seem unjustified: data show that if one SIDS death has occurred, then a subsequent child is about ten times more likely to die of SIDS … To arrive at a valid conclusion, we would have to compare the probability that the two children had been murdered with the probability that they had both died from SIDS … There is a factor-of-ten difference between Meadow’s estimate and the estimate based on recognizing that SIDS events in the same family are not independent, and that difference shifts the probability from favouring homicide to favouring SIDS deaths …

Following widespread criticism of the misuse and indeed misunderstanding of statistical evidence, Sally Clark’s conviction was overturned, and she was released in 2003.
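The arithmetic behind the two estimates is short. The sketch below uses the 1-in-8,543 per-family SIDS figure that, squared, underlies Meadow’s 1 in 73 million; the factor-of-ten adjustment for dependence follows the passage above (the numbers are the commonly cited ones, used here only to illustrate the independence error):

```python
# Meadow's calculation treated the two deaths as independent events.
p_sids = 1 / 8543            # cited probability of one SIDS death in a family like the Clarks'
p_indep = p_sids ** 2        # squared under the (unjustified) independence assumption
print(f"independence assumed: 1 in {1 / p_indep:,.0f}")   # roughly 1 in 73 million

# Data suggest a second SIDS death is about ten times more likely
# once one has occurred, so the deaths are not independent.
p_second_given_first = 10 * p_sids
p_dependent = p_sids * p_second_given_first
print(f"dependence recognized: 1 in {1 / p_dependent:,.0f}")  # roughly 1 in 7.3 million
```

A single wrong multiplication rule — P(A and B) = P(A)P(B) without independence — inflated the improbability by a factor of ten, and that factor was enough to tip the comparison against the double-murder hypothesis.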


Tina

28 February, 2016 at 23:52 | Posted in Varia | Comments Off on Tina


A guide to econometrics

28 February, 2016 at 10:37 | Posted in Statistics & Econometrics | Comments Off on A guide to econometrics

1. Thou shalt use common sense and economic theory.
2. Thou shalt ask the right question.
3. Thou shalt know the context.
4. Thou shalt inspect the data.
5. Thou shalt not worship complexity.
6. Thou shalt look long and hard at thy results.
7. Thou shalt beware the costs of data mining.
8. Thou shalt be willing to compromise.
9. Thou shalt not confuse statistical significance with substance.
10. Thou shalt confess in the presence of sensitivity.

Bernie Sanders and the Verdoorn law

27 February, 2016 at 16:54 | Posted in Economics | 12 Comments

Reading the different reactions, critiques and ‘analyses’ of Gerald Friedman’s calculations on the long-term effects of implementing the Sanders program, it seems to me that it basically boils down to whether the Verdoorn law is operative or not.

Estimating the impact of the Sanders program, Friedman writes (p. 13):

Higher demand for labor is also associated with an increase in labor productivity and this accounts for about half of the increase in economic growth under the Sanders program.

Obviously, that’s a view that Christina Romer and David Romer (p. 8) don’t share:

Friedman … argues that as demand expansion raised output, endogenous productivity growth … would raise productive capacity by enough to prevent it from constraining output … The evidence that productivity growth would surge as a result of a demand-driven boom is weak. The fact that there is a correlation between output growth and productivity growth is not surprising. Periods of rapid productivity growth, such as the 1990s, are naturally also periods of rapid output growth. But this does not tell us that an extended period of rapid output growth resulting from demand stimulus would cause sustained high productivity growth …

In the standard mainstream economic analysis, a demand expansion may very well raise measured productivity — in the short run. But in the long run, expansionary demand policy measures cannot lead to sustained higher productivity and output levels.

In some non-standard heterodox analyses, however, labour productivity growth is often described as a function of output growth. The rate of technical progress varies directly with the rate of growth according to the Verdoorn law. Growth and productivity are, in this view, highly demand-determined not only in the short run but also in the long run.

Given that the Verdoorn law is operative, Sanders’ policy could actually lead to increases in productivity and growth. Living in a world permeated by genuine Keynes-type uncertainty, we cannot, of course, forecast with any great precision how large those effects would be.

So, the nodal point is — has the Verdoorn Law been validated or not in empirical studies?

There have been hundreds of studies that have tried to answer that question, and, as one might imagine, the answers differ. The law has been investigated with different econometric methods (time series, IV, OLS, ECM, cointegration, etc.). The statistical and econometric problems are enormous (especially when it comes to the question, highlighted by Romer & Romer, of the direction of causality). Even so, most studies at the country level do confirm that the Verdoorn law holds — the United States included. Most of the studies cover the period before the subprime crisis of 2006/2007, but if anything the evidence is more in line with Friedman than with Romer & Romer.
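To see what the simplest such econometric exercise amounts to, here is a sketch that generates synthetic growth data with a built-in Verdoorn coefficient of 0.5 and recovers it by OLS. All numbers are made up for illustration, and — as Romer & Romer stress — a fitted slope like this says nothing by itself about the direction of causality:

```python
import random

random.seed(7)

# Synthetic annual data, generated WITH a Verdoorn coefficient of 0.5
# (illustrative only; real studies use country panels and far more care).
n = 60
output_growth = [random.gauss(3.0, 2.0) for _ in range(n)]       # % per year
productivity_growth = [0.5 * q + random.gauss(1.0, 0.8)          # p = a + b*q + noise
                       for q in output_growth]

# Ordinary least squares for the Verdoorn relation p = a + b*q.
mq = sum(output_growth) / n
mp = sum(productivity_growth) / n
b = (sum((q - mq) * (p - mp) for q, p in zip(output_growth, productivity_growth))
     / sum((q - mq) ** 2 for q in output_growth))
a = mp - b * mq
print(f"estimated Verdoorn coefficient: {b:.2f}")  # close to the built-in 0.5
```

The estimate recovers the coefficient here only because the data were constructed that way; with real data, simultaneity between output and productivity growth is exactly the problem the IV and cointegration studies try to address.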

Oh, dear.

Bayesianism — an unacceptable scientific reasoning

26 February, 2016 at 16:14 | Posted in Theory of Science & Methodology | 5 Comments

A major, and notorious, problem with this approach, at least in the domain of science, concerns how to ascribe objective prior probabilities to hypotheses. What seems to be necessary is that we list all the possible hypotheses in some domain and distribute probabilities among them, perhaps ascribing the same probability to each employing the principle of indifference. But where is such a list to come from? It might well be thought that the number of possible hypotheses in any domain is infinite, which would yield zero for the probability of each and the Bayesian game cannot get started. All theories have zero probability and Popper wins the day. How is some finite list of hypotheses enabling some objective distribution of nonzero prior probabilities to be arrived at? My own view is that this problem is insuperable, and I also get the impression from the current literature that most Bayesians are themselves coming around to this point of view.

Chalmers is absolutely right here in his critique of ‘objective’ Bayesianism, but I think it could actually be extended to also encompass its ‘subjective’ variety.

A classic example — borrowed from Bertrand Russell — may perhaps be allowed to illustrate the main point of the critique:

Assume you’re a Bayesian turkey and hold a nonzero probability belief in the hypothesis H that “people are nice vegetarians that do not eat turkeys and that every day I see the sun rise confirms my belief.” For every day you survive, you update your belief according to Bayes’ theorem

P(H|e) = [P(e|H)P(H)]/P(e),

where evidence e stands for “not being eaten” and P(e|H) = 1. Given that there do exist other hypotheses than H, P(e) is less than 1 and a fortiori P(H|e) is greater than P(H). Every day you survive increases your probability belief that you will not be eaten. This is totally rational according to the Bayesian definition of rationality. Unfortunately, for every day that goes by, the traditional Christmas dinner also gets closer and closer …
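The turkey’s updating can be spelled out numerically. In the sketch below the prior and the likelihood of survival under the rival hypotheses are hypothetical numbers; the point is only that each day of survival mechanically pushes P(H|e) upwards:

```python
# The turkey's daily update. H = "people are nice vegetarians",
# e = "not eaten today". P(e|H) = 1 by assumption; under the rival
# hypotheses taken together, P(e|not H) < 1 (hypothetical value).

p_h = 0.5              # prior belief in H (hypothetical)
p_e_given_h = 1.0      # vegetarians never eat turkeys
p_e_given_not_h = 0.9  # hypothetical: even non-vegetarians usually wait

for day in range(300):
    p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
    p_h = p_e_given_h * p_h / p_e        # Bayes' theorem: P(H|e)

print(f"belief in H after 300 days: {p_h:.6f}")  # creeping towards 1
```

The update is impeccably Bayesian at every step, and the belief approaches certainty — right up until Christmas Eve, which is precisely the point of the example.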

The nodal point here is — of course — that although Bayes’ theorem is mathematically unquestionable, that doesn’t qualify it as indisputably applicable to scientific questions.

Bayesian probability calculus is far from the automatic inference engine that its protagonists maintain it is. Where do the priors come from? Wouldn’t it be better in science, when we are uncertain, to do some scientific experimentation and observation, rather than to start making calculations based on people’s often vague and subjective personal beliefs? Is it, from an epistemological point of view, really credible to think that the Bayesian probability calculus makes it possible to somehow fully assess people’s subjective beliefs? And are — as Bayesians maintain — all scientific controversies and disagreements really possible to explain in terms of differences in prior probabilities? I’ll be dipped!

Making sense of data — categorical models

26 February, 2016 at 13:24 | Posted in Economics, Statistics & Econometrics | Comments Off on Making sense of data — categorical models

Great lecture by one of my favourite lecturers — Scott Page.

Olof Palme In Memoriam

24 February, 2016 at 15:27 | Posted in Politics & Society | 1 Comment

Olof Palme.

Born in January 1927.

Murdered in February 1986.

30 years on, and a loss my country — Sweden — is still suffering from.

Macroeconomics on a walk down a blind alley

24 February, 2016 at 10:17 | Posted in Economics | Comments Off on Macroeconomics on a walk down a blind alley

I would say that people like Kydland and Prescott, and so forth, people like that … changed the way that people do macroeconomics. But in my view it was not a positive change … One predominant idea is that of external shocks—and in particular the idea that the shocks that happen to the economy should essentially be the technological shocks. As Joe Stiglitz said, what could we mean by a negative technological shock? That people forget what they could do before?

So we have this idea that we have a system which is in equilibrium and that every now and then it gets knocked off the equilibrium by ‘a shock’. But shocks are part of the system! We have gone down a track that actually does not allow us to say much about the real, major movements in the macro-economy … We should be studying non-normal periods, instead of normal ones, because that is what causes real problems. And we do not do that.

So my vision of the state of macroeconomics is that it somehow has the wrong view: an equilibrium view and a stationary state view. But what is important and interesting about macroeconomics is precisely when those two things do not hold. How can you talk of equilibrium when we move from 5% unemployment to 10% unemployment? If you are in Chicago, you say “Well, those extra 5% have made the calculation that it was better for them to be out of work”. But look at the reality; that is not what happens. People do not want to be out of work … Millions of people are out of work, and we are not worried about that?

That is the major failure in macroeconomics. It does not address the serious problems that we face when we get out of equilibrium. And we are out of equilibrium most of the time.

Alan Kirman

Yours truly is extremely fond of economists like Alan Kirman. With razor-sharp intellects they immediately go for the essentials. They have no time for bullshit. And neither should we.

Economists — dangerous arrogants

23 February, 2016 at 12:39 | Posted in Economics | 1 Comment

In advanced economics the question would be: ‘What besides mathematics should be in an economics lecture?’ In physics the familiar spirit is Archimedes the experimenter. But in economics, as in mathematics itself, it is theorem-proving Euclid who paces the halls …

Economics … has become a mathematical game. The science has been drained out of economics, replaced by a Nintendo game of assumption-making …

Most thoughtful economists think that the games on the blackboard and the computer have gone too far, absurdly too far. It is time to bring economic observation, economic history, economic literature, back into the teaching of economics.

Economists would be less arrogant, and less dangerous as experts, if they had to face up to the facts of the world. Perhaps they would even become as modest as the physicists.

D. McCloskey

Krugman owes Friedman an apology!

22 February, 2016 at 19:19 | Posted in Politics & Society | 1 Comment


February 20, 2016

Dear Paul,
Your suggestion that “personal ambition” in any way influenced my analysis of the Sanders economic plan is as insulting as it is wrong and you owe me an apology.

You don’t know me. We did not quite overlap in graduate school and our paths have diverged since. We have never met or spoken. The closest we came was when my department attempted to bring you to Amherst to give a guest lecture. Never happened because we could not afford your rate.

While you don’t know me, you seem to feel free to speculate about my values and interests. You assume that an outsider economist like myself must be considered not particularly “insightful or even technically competent.” And, elaborating this theory, you conclude that envy would lead me to jump on an opportunity for self-advancement by shilling for an outsider politician. Now this theory might be tested empirically. You could have called me and asked. Or you could have read any of the news stories where I explained how I stumbled on this research project, and where I explained my (lack of) connection to the Sanders Campaign …

Since you did not bother to do the empirical work: let me do it for you. I undertook this study from simple scholarly curiosity; I did it without any connection to the Sanders campaign; and I have no expectation of reward. I have no desire to be involved in a Sanders Administration. I am completely happy teaching at UMass-Amherst and have no wish for anything more in the world than to do my work where I am.

Finally, if I may point out another flaw in your envy-ambition model: why would the Sanders camp ever appoint someone who has publicly acknowledged that he donates to the Hillary Clinton campaign and is undecided about for whom to vote in the upcoming Massachusetts primary?

In its lack of empirical grounding, your column is like the CEA-chairs’ letter: substituting attack language and ad hominem argument for reasoned discourse … Rather than jumping on my conclusion, a more constructive discussion would focus on identifying possible errors in my method that may have led to conclusions that may seem implausible. Certainly, we can agree that it is illogical to reject conclusions without finding fault with method …

Best wishes,
Gerald Friedman
Professor of Economics at the University of Massachusetts at Amherst

[Naked Capitalism]


‘Mathematical’ economists and real mathematicians

22 February, 2016 at 17:19 | Posted in Economics | Comments Off on ‘Mathematical’ economists and real mathematicians

Years ago, I was involved in organising conferences with Christopher Zeeman … and we organized one between economists and mathematicians. We had some great mathematicians—John Milnor, Steve Smale, Rene Thom, and others—wonderful mathematicians. And on the other side, we had Gérard Debreu, Hugo Sonnenschein, Werner Hildenbrand, and a whole group of very distinguished mathematical economists.


After the first two, three hours, I think it was Milnor who said: “We all know that you guys can do mathematics, you do not have to show us. Everybody does his own thing. You want to show us that you are good at doing certain sorts of mathematics; that is fine. But we are interested in the economic problems. We thought that you were going to tell us about economic problems and we were going to use our mathematical tools to help you. But all you are telling us is the mathematical tools that you use and how you are doing well with them. But that is not going to create much”. I think that was absolutely right. After that, the economists were rather silenced and started shifting in their seats uncomfortably. Debreu never said very much anyway, but it was clear he was very insulted, because basically he liked to think of himself as a mathematician.

Alan Kirman

The limits of game theory

22 February, 2016 at 11:45 | Posted in Economics | 4 Comments

If you read Binmore’s Essays on the foundations of game theory (1990) you will find a section where he says that, unfortunately, we get into a kind of impasse. We get this infinite regress linked to the common knowledge problem. For example, I drive frequently from Aix to Marseille. You have the autoroute and parallel to it is the route nationale. Say there is, one day, congestion on the autoroute and nobody on the nationale. I think: “Tomorrow I will take the nationale. But, wait a minute, these other drivers are intelligent too, so they will take the nationale tomorrow, I would do better to stay over here. But, wait a minute, these drivers are pretty intelligent so they can make that step too…” It is actually not logically possible to reason to the solution of these kinds of problems that people are supposed to be solving in game theory.

You can surely define an equilibrium, and say that if we were there nobody would want to move. But then you get to the problem of how we get to this equilibrium — the exact same problem that we have with general equilibrium …

For certain specific, local problems, game theory is a very nice way of thinking about how people might try to solve them, but as soon as you are dealing with a general problem like an economy or a market, I think it is difficult to believe that there is full strategic interaction going on. It is just asking too much of people. Game theory imposes a huge amount of abstract reasoning on the part of people …

That is why I think game theory, as an approach to large scale interaction, is probably not the right way to go.

Alan Kirman

How could ‘testing axioms’ be controversial?

21 February, 2016 at 14:21 | Posted in Economics | 1 Comment

Of course the more immediate target of Davidson in his formulation of the argument in the early 1980s was not Samuelson, but Lucas and Sargent and their rational expectations hypothesis … This was indeed the period when new classical economics was riding at its highest point of prestige, with Lucas and Sargent and their rational expectations assumption apparently sweeping the boards of any sort of Keynesian theories. Curiously, they did not seem to care whether the assumption was actually true, because it was “an axiom,” something that is assumed and cannot be tested …

This matter of “testing axioms” is controversial. Davidson is right that Keynes was partly inspired by Einstein’s Theory of General Relativity that was based on a relaxation of the parallel axiom of Euclid. So, Davidson argued not unreasonably that he would also be inclined to wish to relax any ergodic axiom. However, of course, the rejection of the parallel postulate (or axiom) did come from empirical tests showing that it does not hold in space-time in general due to gravity curving it. So, the empirical testing of axioms is relevant, and the failure of the rational expectations axiom to hold empirically is certainly reasonable grounds for rejecting it.

J. Barkley Rosser Jr

On this Einstein and Keynes are of course absolutely right. Economics — in contradistinction to logic and mathematics — is an empirical science, and empirical testing of ‘axioms’ ought to be self-evidently relevant for such a discipline. For although the economist himself (implicitly) claims that his axiom is universally accepted as true and in no need of proof, that is in no way a justified reason for the rest of us to simpliciter accept the claim.

When applying deductivist thinking to economics, neoclassical economists usually set up “as if” models based on a set of tight axiomatic assumptions from which consistent and precise inferences are made. The beauty of this procedure is of course that if the axiomatic premises are true, the conclusions necessarily follow. The snag is that if the models are to be relevant, we also have to argue that their precision and rigour still hold when they are applied to real-world situations. They often don’t. When addressing real economies, the idealizations and abstractions necessary for the deductivist machinery to work simply don’t hold.

The logic of idealization is a marvellous tool in mathematics and axiomatic-deductivist systems, but a poor guide for real-world systems. As Hans Albert has it on the neoclassical style of thought:

Science progresses through the gradual elimination of errors from a large offering of rivalling ideas, the truth of which no one can know from the outset. The question of which of the many theoretical schemes will finally prove to be especially productive and will be maintained after empirical investigation cannot be decided a priori. Yet to be useful at all, it is necessary that they are initially formulated so as to be subject to the risk of being revealed as errors. Thus one cannot attempt to preserve them from failure at every price. A theory is scientifically relevant first of all because of its possible explanatory power, its performance, which is coupled with its informational content …

Clearly, it is possible to interpret the ‘presuppositions’ of a theoretical system … not as hypotheses, but simply as limitations to the area of application of the system in question. Since a relationship to reality is usually ensured by the language used in economic statements, in this case the impression is generated that a content-laden statement about reality is being made, although the system is fully immunized and thus without content. In my view that is often a source of self-deception in pure economic thought …

Most mainstream economic models are abstract, unrealistic, and present mostly non-testable hypotheses. How then are they supposed to tell us anything about the world we live in?

Confronted with the massive empirical failures of their models and theories, mainstream economists often retreat into looking upon their models and theories as some kind of “conceptual exploration,” and give up any hopes/pretenses whatsoever of relating their theories and models to the real world. Instead of trying to bridge the gap between models and the world, one decides to look the other way.

To me this kind of scientific defeatism is equivalent to surrendering our search for understanding the world we live in. It can’t be enough to prove or deduce things in a model world. If theories and models do not directly or indirectly tell us anything of the world we live in – then why should we waste any of our precious time on them?

On the non-existence of economic laws

21 February, 2016 at 09:57 | Posted in Economics | Comments Off on On the non-existence of economic laws

In methodological terms, the economic research programme undoubtedly owes an important component to the influence of classical physics: the idea that social phenomena are governed by laws in the same way as natural phenomena are, and that it is therefore appropriate to search for such laws and to codify them theoretically in a manner similar to what Newton accomplished for the laws of mechanics in his system.


The crux of these laws — and regularities — that allegedly do exist in economics, is that they only hold ceteris paribus. That fundamentally means — as repeatedly and strongly argued by e.g. Nancy Cartwright — that these laws/regularities only hold when the right conditions are at hand for giving rise to them. Unfortunately, from an empirical point of view, those conditions are only at hand in artificially closed nomological models purposely designed to give rise to the kind of regular associations that economists want to explain. But since these laws/regularities do not exist outside these ‘socio-economic machines,’ what is the point in constructing them? When the almost endless list of narrow and specific assumptions necessary to allow the ‘rigorous’ deductions is known to be at odds with reality, what good do these models do?

Take The Law of Demand.

Although it may (perhaps) be said that neoclassical economics had succeeded in establishing The Law – when the price of a commodity falls, the demand for it will increase — for single individuals, it soon turned out, in the Sonnenschein-Mantel-Debreu theorem, that it wasn’t possible to extend The Law to apply at the market level, unless one made ridiculously unrealistic assumptions such as individuals all having homothetic preferences – which actually implies that all individuals have identical preferences.

This could only be conceivable if there were in essence only one actor — the (in)famous representative actor. So, yes, it was possible to generalize The Law of Demand — as long as we assumed that on the aggregate level there was only one commodity and one actor. What a generalization! Does this sound reasonable? Of course not. This is pure nonsense!

How has neoclassical economics reacted to this devastating finding? Basically by looking the other way, ignoring it, and hoping that no one sees that the emperor is naked.

Modern mainstream neoclassical textbooks try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality. And — worse still — something that is not even amenable to the kind of general equilibrium analysis it is thought to give a foundation for, since Hugo Sonnenschein (1972), Rolf Mantel (1976) and Gerard Debreu (1974) unequivocally showed that there exists no condition by which assumptions on individuals would guarantee either the stability or the uniqueness of the equilibrium solution.
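The non-uniqueness point can be illustrated numerically. The aggregate excess demand function below is purely hypothetical (it appears in none of the cited papers), but the Sonnenschein-Mantel-Debreu theorem guarantees that essentially any continuous function satisfying Walras’ law and homogeneity can arise from perfectly well-behaved individual preferences, so nothing at the individual level rules out a shape like this, with several market-clearing prices:

```python
# Hypothetical aggregate excess demand z(p) in one relative price p.
# Nothing about individual rationality prevents z from crossing zero
# several times, i.e. from having multiple equilibria.
def z(p):
    return -(p - 0.5) * (p - 1.0) * (p - 2.0)

# Simple bisection: find a root of f on an interval where f changes sign.
def bisect(f, lo, hi, tol=1e-10):
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Three sign-changing brackets, hence three distinct equilibrium prices.
equilibria = [bisect(z, a, b) for a, b in [(0.1, 0.7), (0.7, 1.5), (1.5, 3.0)]]
print([round(p, 6) for p in equilibria])  # prints [0.5, 1.0, 2.0]
```

With three equilibria, comparative-statics and stability claims built on “the” equilibrium lose their footing, which is exactly why the SMD results matter for the representative-agent programme.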

Of course one could say that it is too difficult at the undergraduate level to show why the procedure is right, and to defer it to master’s and doctoral courses. That could justifiably be argued — if what you teach your students were true, if The Law of Demand were generalizable to the market level and the representative actor were a valid modelling abstraction! But in this case it is demonstrably known to be false, and therefore this is nothing but a case of scandalous intellectual dishonesty. It’s like telling your students that 2 + 2 = 5 and hoping that they will never run into Peano’s axioms of arithmetic.

As Hans Albert has it:

The neoclassical style of thought – with its emphasis on thought experiments, reflection on the basis of illustrative examples and logically possible extreme cases, its use of model construction as the basis of plausible assumptions, as well as its tendency to decrease the level of abstraction, and similar procedures – appears to have had such a strong influence on economic methodology that even theoreticians who strongly value experience can only free themselves from this methodology with difficulty …

Clearly, it is possible to interpret the ‘presuppositions’ of a theoretical system … not as hypotheses, but simply as limitations to the area of application of the system in question. Since a relationship to reality is usually ensured by the language used in economic statements, in this case the impression is generated that a content-laden statement about reality is being made, although the system is fully immunized and thus without content. In my view that is often a source of self-deception in pure economic thought …

Arrow on microfoundational reductionism

21 February, 2016 at 09:20 | Posted in Economics | Comments Off on Arrow on microfoundational reductionism

The economy is irreducible … in the sense that no matter how the households are divided into two groups, an increase in the initial assets held by the members of one group can be used to make feasible an allocation which will make no one worse off and at least one individual in the second group better off.

It is perhaps interesting to observe that “atomistic” assumptions concerning individual households and firms are not sufficient to establish the existence of equilibrium; “global” assumptions … are also needed (though they are surely unexceptionable). Thus, a limit is set to the tendency implicit in price theory, particularly in its mathematical versions, to deduce all properties of aggregate behavior from assumptions about individual economic agents.

Kenneth Arrow


20 February, 2016 at 20:32 | Posted in Varia | Comments Off on Jerusalem


Some films grip you more than others. Often you don’t know why. But not always. Sometimes you know that the result can hardly be anything other than magnificent, moving and endlessly beautiful. For if you take a literary genius (Selma Lagerlöf), a master director (Bille August) and a divinely gifted film composer (Stefan Nilsson) – and add to that actors like Maria Bonnevie, Ulf Friberg and Lena Endre – then it cannot be anything but a masterpiece.

The film Jerusalem leaves no one untouched. It is a film as big as life itself.

Unbroken and unconquerable

19 February, 2016 at 09:13 | Posted in Varia | Comments Off on Unbroken and unconquerable


This one is for you — Edward Snowden.
Bravest of the brave.
Never give in.
Never give up.
