Top 20 Heterodox Economics Books

28 Feb, 2014 at 12:04 | Posted in Economics | 29 Comments


  • Karl Marx, Das Kapital (1867)
  • Thorstein Veblen, The Theory of the Leisure Class (1899)
  • Joseph Schumpeter, The Theory of Economic Development (1911)
  • Nikolai Kondratiev, The Major Economic Cycles (1925)
  • Gunnar Myrdal, The Political Element in the Development of Economic Theory (1930)
  • John Maynard Keynes, The General Theory (1936)
  • Paul Sweezy, The Theory of Capitalist Development (1942)
  • Joan Robinson, Accumulation of Capital (1956)
  • John Kenneth Galbraith, The Affluent Society (1958)
  • Piero Sraffa, Production of Commodities by Means of Commodities (1960)
  • Johan Åkerman, Theory of Industrialism (1961)
  • Axel Leijonhufvud, Keynes and the Classics (1969)
  • Nicholas Georgescu-Roegen, The Entropy Law and the Economic Process (1971)
  • Michal Kalecki, Selected Essays on the Dynamics of the Capitalist Economy (1971)
  • Paul Davidson, Money and the Real World (1972)
  • Hyman Minsky, John Maynard Keynes (1975)
  • Philip Mirowski, More Heat than Light (1989)
  • Tony Lawson, Economics and Reality (1997)
  • Steve Keen, Debunking Economics (2001)
  • John Quiggin, Zombie Economics (2010)

Teaching of economics — captured by a small and dangerous sect

28 Feb, 2014 at 10:56 | Posted in Economics | 4 Comments

The fallacy of composition basically consists in the false belief that the whole is nothing but the sum of its parts. In society and in the economy this is arguably not the case. An adequate analysis of society and the economy a fortiori can’t proceed by just adding up the acts and decisions of individuals. The whole is more than the sum of its parts.

This fact shows up when orthodox/mainstream/neoclassical economics tries to argue for the existence of The Law of Demand – when the price of a commodity falls, the demand for it will increase – at the aggregate level. Although The Law may be established for single individuals, it turned out – with the Sonnenschein-Mantel-Debreu theorem, firmly established by the mid-1970s – that it wasn’t possible to extend The Law of Demand to the market level, unless one made ridiculously unrealistic assumptions, such as all individuals having identical homothetic preferences – which in effect means that there is only one kind of consumer in the whole economy.

This could only be conceivable if all agents were identical (i.e. there is in essence only one actor) — the (in)famous representative actor. So, yes, it was possible to generalize The Law of Demand – as long as we assumed that on the aggregate level there was only one commodity and one actor. What a generalization! Does this sound reasonable? Of course not. This is pure nonsense!
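To see what is doing the work here, consider a minimal sketch (standard micro notation, my own illustration, not from the original post). Homothetic preferences make each individual’s Marshallian demand proportional to income, $x_i(p, m_i) = \beta_i(p)\, m_i$, and identical preferences force $\beta_i(p) = \beta(p)$ for everyone, so that

$$X(p) = \sum_i x_i(p, m_i) = \beta(p) \sum_i m_i = \beta(p)\, M .$$

Aggregate demand then depends only on total income $M$ and behaves exactly like the demand of one single consumer; no redistribution of income across individuals can affect it. Drop either identity or homotheticity and $X(p)$ depends on the entire income distribution — and the Sonnenschein-Mantel-Debreu results show it can then take more or less any shape consistent with continuity, homogeneity and Walras’ law.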

How has neoclassical economics reacted to this devastating finding? Basically by looking the other way, ignoring it, and hoping that no one notices that the emperor is naked.

Having gone through a handful of the most frequently used undergraduate economics textbooks of today, I can only conclude that the models presented in these modern neoclassical textbooks try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent.

That is, with something that has absolutely nothing to do with reality. And — worse still — something that is not even amenable to the kind of general equilibrium analysis it is supposed to give a foundation for, since Hugo Sonnenschein (1972), Rolf Mantel (1976) and Gerard Debreu (1974) unequivocally showed that there do not exist any assumptions on individuals that would guarantee either stability or uniqueness of the equilibrium solution.

So what modern economics textbooks present to students are really models built on the assumption that an entire economy can be modeled as a representative actor, and that this is a valid procedure. But it isn’t — as the Sonnenschein-Mantel-Debreu theorem has irrevocably shown.

Of course one could say that it is too difficult at the undergraduate level to show why the procedure is justified, and to defer the proof to masters and doctoral courses. One could justifiably reason that way – if what you teach your students were true, if The Law of Demand were generalizable to the market level and the representative actor were a valid modeling abstraction! But in this case it is demonstrably known to be false, and therefore this is nothing but a case of scandalous intellectual dishonesty. It’s like telling your students that 2 + 2 = 5 and hoping that they will never run into Peano’s axioms of arithmetic.

Once the dust has settled, there is a strong case for an inquiry into whether the teaching of economics has been captured by a small but dangerous sect.

Larry Elliott/The Guardian

28 February 1986 — a date which will live in infamy

28 Feb, 2014 at 09:06 | Posted in Politics & Society | Comments Off on 28 February 1986 — a date which will live in infamy

Olof Palme. Born 30 January 1927. Murdered 28 February 1986.
 

Robert “The Keynesian” Lucas

27 Feb, 2014 at 16:28 | Posted in Economics | 1 Comment

In his Keynote Address to the 2003 History of Political Economy Conference, Nobel laureate Robert Lucas said:

Well, I’m not here to tell people in this group about the history of monetary thought. I guess I’m here as a kind of witness from a vanished culture, the heyday of Keynesian economics. It’s like historians rushing to interview the last former slaves before they died, or the last of the people who remembered growing up in a Polish shtetl. I am going to tell you what it was like growing up in a day when Keynesian economics was taught as a solid basis on which macroeconomics could proceed.

My credentials? Was I a Keynesian myself? Absolutely. And does my Chicago training disqualify me for that? No, not at all. David Laidler [who was present at the conference] will agree with me on this, and I will explain in some detail when I talk about my education. Our Keynesian credentials, if we wanted to claim them, were as good as could be obtained in any graduate school in the country in 1963.

I thought when I was trying to prepare some notes for this talk that people attending the conference might be arguing about Axel Leijonhufvud’s thesis that IS-LM was a distortion of Keynes, but I didn’t really hear any of this in the discussions this afternoon. So I’m going to think about IS-LM and Keynesian economics as being synonyms. I remember when Leijonhufvud’s book came out and I asked my colleague Gary Becker if he thought Hicks had got the General Theory right with his IS-LM diagram. Gary said, “Well, I don’t know, but I hope he did, because if it wasn’t for Hicks I never would have made any sense out of that damn book.” That’s kind of the way I feel, too, so I’m hoping Hicks got it right.

Mirabile dictu! I’m a Keynesian – although I haven’t understood anything of what Keynes wrote. But I’ve read another guy who said he had read Keynes’s book, so I hope for the best and assume he got it right (which Hicks actually didn’t, and was intellectually honest enough to admit in at least three scientific publications, published about twenty years before Lucas’s statement). In truth, a very scientific attitude. No wonder the guy – after having deluded himself into believing (?) he was a Keynesian, although actually only elaborating upon a model developed and later disowned by John Hicks – got the “Nobel prize” in economics …

On chance, probability, randomness, uncertainty and all that

26 Feb, 2014 at 23:04 | Posted in Statistics & Econometrics | 2 Comments

 

How to escape Mankiwian brainwash

25 Feb, 2014 at 16:56 | Posted in Economics | 1 Comment

 

[h/t Phil Pilkington]

Taking rational expectations seriously? You’ve got to be joking!

25 Feb, 2014 at 10:53 | Posted in Economics | 1 Comment

If at some time my skeleton should come to be used by a teacher of osteology to illustrate his lectures, will his students seek to infer my capacities for thinking, feeling, and deciding from a study of my bones? If they do, and any report of their proceedings should reach the Elysian Fields, I shall be much distressed, for they will be using a model which entirely ignores the greater number of relevant variables, and all of the important ones. Yet this is what ‘rational expectations’ does to economics.

G. L. S. Shackle

Since I have already put forward a rather detailed theoretical-methodological critique of the rational expectations hypothesis in Rational expectations – a fallacious foundation for macroeconomics in a non-ergodic world (real-world economics review, issue 62, 2012), I will limit myself here to elaborating on a couple of the rather unwarranted allegations that defenders of rational expectations put forward in their attempts at rescuing the hypothesis from the critique.

In a laboratory experiment run by James Andreoni and Tymofiy Mylovanov, the researchers induced common probability priors, and then told all participants of the actions taken by the others. Their findings are very interesting, and say something rather profound about the value of the rational expectations hypothesis in standard neoclassical economic models:

We look at choices in round 1, when individuals should still maintain common priors, being indifferent about the true state. Nonetheless, we see that about 20% of the sample erroneously disagrees and favors one point of view. Moreover, while other errors tend to diminish as the experiment progresses, the fraction making this type of error is nearly constant. One may interpret disagreement in this case as evidence of erroneous or nonrational choices.

Next, we look at the final round where information about disagreement is made public and, under common knowledge of rationality, should be sufficient to eliminate disagreement. Here we find that individuals weigh their own information more than twice that of the five others in their group. When we look separately at those who err by disagreeing in round 1, we find that these people weigh their own information more than 10 times that of others, putting virtually no stock in public information. This indicates a different type of error, that is, a failure of some individuals to learn from each other. This error is quite large and for a nontrivial minority of the population.

Setting aside the subjects who make systematic errors, we find that individuals still put 50% more weight on their own information than they do on the information revealed through the actions of others, although this difference is not statistically significant.

So in this experiment there seem to be some irrational idiots who don’t understand that they are exactly that — idiots. When told that the earth is flat they still adhere to their own belief in a round earth. It is as if people thought that the probability that all others are idiots — with irrational beliefs — is higher than the probability that the earth is round.
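For comparison, here is what the rational benchmark in this kind of setting looks like, in a minimal sketch of my own (it assumes Gaussian signals of equal precision and a common diffuse prior, an idealization of, not a reconstruction of, the authors’ actual design):

```python
import numpy as np

# Six group members each observe one noisy signal of the same unknown state.
# With a common prior and equal signal precisions, Bayes' rule says the
# posterior mean weights every signal equally: the weight on one's own
# signal should be 1/6 (about 0.17).
rng = np.random.default_rng(0)
true_state = 1.0
signals = true_state + rng.normal(0.0, 1.0, size=6)

rational_estimate = signals.mean()  # equal weights, 1/6 each

# Subjects in the experiment behaved roughly as if their own signal carried
# twice the weight of each of the five others' signals:
w_own = 2 / (2 + 5)  # about 0.29 instead of 0.17
biased_estimate = w_own * signals[0] + (1 - w_own) * signals[1:].mean()

print(rational_estimate, biased_estimate)
```

Under common knowledge of rationality the two estimates should coincide; the persistent gap between them is exactly the kind of disagreement the experiment documents.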

Now compare these experimental results with rational expectations models, in which the world evolves in accordance with fully predetermined models where uncertainty has been reduced to stochastic risk describable by some probability distribution.

The tiny little problem that there is no hard empirical evidence verifying these models doesn’t usually bother their protagonists too much. Rational expectations überpriest Thomas Sargent has the following to say on the epistemological status of the rational expectations hypothesis (emphasis added):

Partly because it focuses on outcomes and does not pretend to have behavioral content, the hypothesis of rational expectations has proved to be a powerful tool for making precise statements about complicated dynamic economic systems.

Precise, yes, but relevant and realistic? I’ll be dipped!

And a few years later, when asked in an interview in Macroeconomic Dynamics — in 2005 — if he thought “that differences among people’s models are important aspects of macroeconomic policy debates”, Sargent replied (emphasis added):

The fact is you simply cannot talk about their differences within the typical rational expectations model. There is a communism of models. All agents within the model, the econometricians, and God share the same model.

One might perhaps find it odd to juxtapose God and people, but I think Leonard Rapping – himself a former rational expectationist – was on the right track when, interviewed in 1984 by Arjo Klamer for The New Classical Macroeconomics, he said:

Frankly, I do not think that the rational expectations theorists are in the real world. Their approach is much too abstract.

Building models on rational expectations means we are either Gods or Idiots. Most of us know we are neither. So, God may share Sargent’s and Wren-Lewis’s models, but they certainly aren’t my models.

In their attempted rescue operations, rational expectationists try to give the picture that only heterodox economists like yours truly are critical of the rational expectations hypothesis. But, on this, they are, simply, eh, wrong. Let’s listen to Nobel laureate Edmund Phelps — hardly a heterodox economist — and what he has to say (emphasis added):

Question: In a new volume with Roman Frydman, “Rethinking Expectations: The Way Forward for Macroeconomics,” you say the vast majority of macroeconomic models over the last four decades derailed your “microfoundations” approach. Can you explain what that is and how it differs from the approach that became widely accepted by the profession?

Answer: In the expectations-based framework that I put forward around 1968, we didn’t pretend we had a correct and complete understanding of how firms or employees formed expectations about prices or wages elsewhere. We turned to what we thought was a plausible and convenient hypothesis. For example, if the prices of a company’s competitors were last reported to be higher than in the past, it might be supposed that the company will expect their prices to be higher this time, too, but not that much. This is called “adaptive expectations:” You adapt your expectations to new observations but don’t throw out the past. If inflation went up last month, it might be supposed that inflation will again be high but not that high.

Q: So how did adaptive expectations morph into rational expectations?

A: The “scientists” from Chicago and MIT came along to say, we have a well-established theory of how prices and wages work. Before, we used a rule of thumb to explain or predict expectations: Such a rule is picked out of the air. They said, let’s be scientific. In their mind, the scientific way is to suppose price and wage setters form their expectations with every bit as much understanding of markets as the expert economist seeking to model, or predict, their behavior. The rational expectations approach is to suppose that the people in the market form their expectations in the very same way that the economist studying their behavior forms her expectations: on the basis of her theoretical model.

Q: And what’s the consequence of this putsch?

A: Craziness for one thing. You’re not supposed to ask what to do if one economist has one model of the market and another economist a different model. The people in the market cannot follow both economists at the same time. One, if not both, of the economists must be wrong. Another thing: It’s an important feature of capitalist economies that they permit speculation by people who have idiosyncratic views and an important feature of a modern capitalist economy that innovators conceive their new products and methods with little knowledge of whether the new things will be adopted — thus innovations. Speculators and innovators have to roll their own expectations. They can’t ring up the local professor to learn how. The professors should be ringing up the speculators and aspiring innovators. In short, expectations are causal variables in the sense that they are the drivers. They are not effects to be explained in terms of some trumped-up causes.

Q: So rather than live with variability, write a formula in stone!

A: What led to rational expectations was a fear of the uncertainty and, worse, the lack of understanding of how modern economies work. The rational expectationists wanted to bottle all that up and replace it with deterministic models of prices, wages, even share prices, so that the math looked like the math in rocket science. The rocket’s course can be modeled while a living modern economy’s course cannot be modeled to such an extreme. It yields up a formula for expectations that looks scientific because it has all our incomplete and not altogether correct understanding of how economies work inside of it, but it cannot have the incorrect and incomplete understanding of economies that the speculators and would-be innovators have.

Q: One of the issues I have with rational expectations is the assumption that we have perfect information, that there is no cost in acquiring that information. Yet the economics profession, including Federal Reserve policy makers, appears to have been hijacked by Robert Lucas.

A: You’re right that people are grossly uninformed, which is a far cry from what the rational expectations models suppose. Why are they misinformed? I think they don’t pay much attention to the vast information out there because they wouldn’t know what to do with it if they had it. The fundamental fallacy on which rational expectations models are based is that everyone knows how to process the information they receive according to the one and only right theory of the world. The problem is that we don’t have a “right” model that could be certified as such by the National Academy of Sciences. And as long as we operate in a modern economy, there can never be such a model.

Bloomberg

And this is what another non-heterodox economist, Willem Buiter, has to say about the state of the standard macroeconomic theory that builds on the twin assumptions of rational expectations and efficient markets:

In both the New Classical and New Keynesian approaches to monetary theory (and to aggregative macroeconomics in general), the strongest version of the efficient markets hypothesis (EMH) was maintained. This is the hypothesis that asset prices aggregate and fully reflect all relevant fundamental information, and thus provide the proper signals for resource allocation. Even during the seventies, eighties, nineties and noughties before 2007, the manifest failure of the EMH in many key asset markets was obvious to virtually all those whose cognitive abilities had not been warped by a modern Anglo-American Ph.D. education. But most of the profession continued to swallow the EMH hook, line and sinker, although there were influential advocates of reason throughout, including James Tobin, Robert Shiller, George Akerlof, Hyman Minsky, Joseph Stiglitz and behaviourist approaches to finance. The influence of the heterodox approaches from within macroeconomics and from other fields of economics on mainstream macroeconomics – the New Classical and New Keynesian approaches – was, however, strictly limited.

But let’s see how rational expectations fares as an empirical assumption. Efforts at testing the correctness of the hypothesis empirically have resulted in a series of studies that have more or less concluded that it is not consistent with the facts. In one of the more well-known and highly respected evaluation reviews, Michael Lovell (1986) concluded:

it seems to me that the weight of empirical evidence is sufficiently strong to compel us to suspend belief in the hypothesis of rational expectations, pending the accumulation of additional empirical evidence.

And this is how Nikolay Gertchev summarizes studies on the empirical correctness of the hypothesis:

More recently, it even has been argued that the very conclusions of dynamic models assuming rational expectations are contrary to reality: “the dynamic implications of many of the specifications that assume rational expectations and optimizing behavior are often seriously at odds with the data” (Estrella and Fuhrer 2002, p. 1013). It is hence clear that if taken as an empirical behavioral assumption, the RE hypothesis is plainly false; if considered only as a theoretical tool, it is unfounded and self-contradictory.

But how about the large mainstream literature on learning? Let me briefly address the issue.

The rational expectations hypothesis presupposes – basically for reasons of consistency – that agents have complete knowledge of all of the relevant probability distribution functions. And when one tries to incorporate learning in these models – in order to deflect some of the criticism launched against the hypothesis to date – it is always a very restricted kind of learning that is considered: a learning in which truly unanticipated, surprising, new things never take place, only rather mechanical updatings – increasing the precision of already existing information sets – of existing probability functions.

Nothing really new happens in these ergodic models, where the statistical representation of learning and information is nothing more than a caricature of what takes place in the real-world target system. This follows from taking for granted that people’s decisions can be portrayed as based on an existing probability distribution, which by definition implies knowledge of every possible event that can be thought of as taking place (otherwise it is, in a strict mathematical-statistical sense, not really a probability distribution).

But in the real world it is – as shown again and again by behavioural and experimental economics – common to mistake a conditional distribution for a probability distribution. Such mistakes are impossible to make in the kinds of economic analysis – built on the rational expectations hypothesis – that Levine is such an adamant propagator for. On average, rational expectations agents are always correct. But truly new information will not only reduce the estimation error; it will actually change the entire estimation, and hence possibly the decisions made. To be truly new, information has to be unexpected. If not, it would simply be inferred from the already existing information set.

In rational expectations models new information is typically presented as something that only reduces the variance of the parameter estimated. But if new information means truly new information, it could actually increase our uncertainty and variance (information set (A, B) => (A, B, C)).
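A trivial numerical illustration of the point (my own, with made-up payoffs and probabilities): let the information set expand from outcomes {A, B} to {A, B, C}.

```python
import numpy as np

# Before: the agent only knows of outcomes A and B, judged equally likely.
before_vals = np.array([0.0, 1.0])        # payoffs of A, B (hypothetical)
before_probs = np.array([0.5, 0.5])

# After: a genuinely new outcome C turns up and probability mass is revised.
after_vals = np.array([0.0, 1.0, 5.0])    # payoffs of A, B, C (hypothetical)
after_probs = np.array([0.4, 0.4, 0.2])

def variance(vals, probs):
    mean = np.dot(probs, vals)
    return np.dot(probs, (vals - mean) ** 2)

print(variance(before_vals, before_probs))  # 0.25
print(variance(after_vals, after_probs))    # about 3.44: uncertainty went UP
```

With these numbers the variance rises from 0.25 to about 3.4: learning that C exists makes the agent less certain, not more.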

Truly new information gives birth to new probabilities, revised plans and decisions – something the rational expectations hypothesis cannot account for with its finite sampling representation of incomplete information.

In the world of rational expectations, learning is like getting better and better at reciting the complete works of Shakespeare by heart – or at hitting the bull’s eye when playing darts. It presupposes that we have a complete list of the possible states of the world and that, by definition, mistakes are non-systematic (which, strictly seen, follows from the assumption that “subjective” probability distributions equal the “objective” probability distribution). This is a rather uninteresting and trivial kind of learning. It is a closed-world learning, synonymous with improving one’s adaptation to a world which is fundamentally unchanging. But in real, open-world situations, learning is more often about adapting to and trying to cope with genuinely new phenomena.

The rational expectations hypothesis presumes consistent behaviour, where expectations do not display any persistent errors. In the world of rational expectations we are always, on average, hitting the bull’s eye. In the more realistic, open systems view, there is always the possibility (danger) of making mistakes that may turn out to be systematic. It is because of this, presumably, that we put so much emphasis on learning in our modern knowledge societies.

So, where does all this leave us? I think John Kay sums it up pretty well:

Prof Sargent and colleagues appropriated the term “rational expectations” for their answer. Suppose the economic world evolves according to some predetermined model, in which uncertainties are “known unknowns” that can be described by probability distributions. Then economists could gradually deduce the properties of this model, and businesses and individuals would naturally form expectations in that light. If they did not, they would be missing obvious opportunities for advantage.

This approach, which postulates a universal explanation into which economists have privileged insight, was as influential as it was superficially attractive. But a scientific idea is not seminal because it influences the research agenda of PhD students. An important scientific advance yields conclusions that differ from those derived from other theories, and establishes that these divergent conclusions are supported by observation. Yet as Prof Sargent disarmingly observed, “such empirical tests were rejecting too many good models” in the programme he had established with fellow Nobel laureates Bob Lucas and Ed Prescott. In their world, the validity of a theory is demonstrated if, after the event, and often with torturing of data and ad hoc adjustments that are usually called “imperfections”, it can be reconciled with already known facts – “calibrated”. Since almost everything can be “explained” in this way, the theory is indeed universal; no other approach is necessary, or even admissible …

Rational expectations consequently fail for the same reason communism failed – the arrogance and ignorance of the monopolist.

David Ricardo

25 Feb, 2014 at 08:56 | Posted in Economics | 1 Comment

Tomorrow Sveriges Radio P1 broadcasts — at 21:03-21:37 — the second programme in a series on the history of economics.

Yours truly takes part.
 

It is said that the stockbroker David Ricardo became an economist because he was bored on holiday. That may be a tall tale, but what is certainly true is that his way of reasoning logically and constructing models of reality set the pattern for those who came after.

Hicks on the inapplicability of probability calculus

24 Feb, 2014 at 18:48 | Posted in Economics, Statistics & Econometrics | 11 Comments

To understand real world “non-routine” decisions and unforeseeable changes in behaviour, ergodic probability distributions are of no avail. In a world full of genuine uncertainty — where real historical time rules the roost — the probabilities that ruled the past are not necessarily those that will rule the future.

When we cannot accept that the observations, along the time-series available to us, are independent … we have, in strict logic, no more than one observation, all of the separate items having to be taken together. For the analysis of that the probability calculus is useless; it does not apply … I am bold enough to conclude, from these considerations that the usefulness of ‘statistical’ or ‘stochastic’ methods in economics is a good deal less than is now conventionally supposed … We should always ask ourselves, before we apply them, whether they are appropriate to the problem in hand. Very often they are not … The probability calculus is no excuse for forgetfulness.

John Hicks, Causality in Economics, 1979:121

To simply assume that economic processes are ergodic — and a fortiori in any relevant sense timeless — is not a sensible way of dealing with the kind of genuine uncertainty that permeates open systems such as economies.
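A standard toy example makes the point concrete (a sketch of my own, not Hicks’s): a multiplicative gamble whose cross-sectional (ensemble) expectation grows every period, while the time average of almost every individual trajectory shrinks.

```python
import numpy as np

rng = np.random.default_rng(42)

# Each period wealth is multiplied by 1.5 (heads) or 0.6 (tails).
# Ensemble expectation per period: 0.5*1.5 + 0.5*0.6 = 1.05  (grows 5%)
# Time-average growth per period:  sqrt(1.5*0.6) ~= 0.949    (shrinks 5%)
T = 10_000
factors = rng.choice([1.5, 0.6], size=T)

ensemble_growth = 0.5 * 1.5 + 0.5 * 0.6           # exact: 1.05
time_avg_growth = np.exp(np.log(factors).mean())  # ~0.949 on one long path

print(ensemble_growth, time_avg_growth)
```

The two growth numbers sit on opposite sides of 1: the ensemble average flourishes while almost every individual history decays, which is precisely why probabilities that rule a cross-section at one instant need not rule any single agent’s future.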

Added 25 February: Commenting on this article, Paul Davidson writes:

After reading my article on the fallacy of rational expectations, Hicks wrote to me in a letter dated 12 February 1983 in which he said “I have just been reading your RE [rational expectations] paper … I do like it very much … You have now rationalized my suspicions and shown me that I missed a chance of labeling my own point of view as nonergodic. One needs a name like that to ram a point home.”

On abstraction and idealization in neoclassical economics

23 Feb, 2014 at 20:55 | Posted in Economics | 2 Comments

Of course macroeconomists research many things, and only a minority are using New Keynesian models, and probably even some of those do not really need the New Keynesian bit. That is the great thing about abstraction. Working with what can be called ‘flex price’ models does not imply that you think price rigidity is unimportant, but instead that it can often be ignored if you want to focus on other processes.

Simon Wren-Lewis

It would, of course, be interesting to know on what reasoning Wren-Lewis based this rather unsubstantiated view.

When applying deductivist thinking to economics, neoclassical economists like Wren-Lewis usually set up “as if” models based on a set of tight axiomatic assumptions from which consistent and precise inferences are made. The beauty of this procedure is of course that if the axiomatic premises are true, the conclusions necessarily follow. The snag is that if the models are to be relevant, we also have to argue that their precision and rigour still hold when they are applied to real-world situations. They often don’t. When addressing real economies, the idealizations (what Wren-Lewis, wrongly, calls abstractions) necessary for the deductivist machinery to work — e.g. “flex price” models — simply don’t hold.

If the real world is fuzzy, vague and indeterminate, then why should our models build upon a desire to describe it as precise and predictable? The logic of idealization is a marvellous tool in mathematics and axiomatic-deductivist systems, but a poor guide for action in real-world systems, in which concepts and entities are without clear boundaries and continually interact and overlap.

Or as Hans Albert has it on the neoclassical style of thought:

In everyday situations, if, in answer to an inquiry about the weather forecast, one is told that the weather will remain the same as long as it does not change, then one does not normally go away with the impression of having been particularly well informed, although it cannot be denied that the answer refers to an interesting aspect of reality, and, beyond that, it is undoubtedly true …

We are not normally interested merely in the truth of a statement, nor merely in its relation to reality; we are fundamentally interested in what it says, that is, in the information that it contains …

Information can only be obtained by limiting logical possibilities; and this in principle entails the risk that the respective statement may be exposed as false. It is even possible to say that the risk of failure increases with the informational content, so that precisely those statements that are in some respects most interesting, the nomological statements of the theoretical hard sciences, are most subject to this risk. The certainty of statements is best obtained at the cost of informational content, for only an absolutely empty and thus uninformative statement can achieve the maximal logical probability …

The neoclassical style of thought – with its emphasis on thought experiments, reflection on the basis of illustrative examples and logically possible extreme cases, its use of model construction as the basis of plausible assumptions, as well as its tendency to decrease the level of abstraction, and similar procedures – appears to have had such a strong influence on economic methodology that even theoreticians who strongly value experience can only free themselves from this methodology with difficulty …

Science progresses through the gradual elimination of errors from a large offering of rivalling ideas, the truth of which no one can know from the outset. The question of which of the many theoretical schemes will finally prove to be especially productive and will be maintained after empirical investigation cannot be decided a priori. Yet to be useful at all, it is necessary that they are initially formulated so as to be subject to the risk of being revealed as errors. Thus one cannot attempt to preserve them from failure at every price. A theory is scientifically relevant first of all because of its possible explanatory power, its performance, which is coupled with its informational content …

The connections sketched out above are part of the general logic of the sciences and can thus be applied to the social sciences. Above all, with their help, it appears to be possible to illuminate a methodological peculiarity of neoclassical thought in economics, which probably stands in a certain relation to the isolation from sociological and social-psychological knowledge that has been cultivated in this discipline for some time: the model Platonism of pure economics, which comes to expression in attempts to immunize economic statements and sets of statements (models) from experience through the application of conventionalist strategies …

Clearly, it is possible to interpret the ‘presuppositions’ of a theoretical system … not as hypotheses, but simply as limitations to the area of application of the system in question. Since a relationship to reality is usually ensured by the language used in economic statements, in this case the impression is generated that a content-laden statement about reality is being made, although the system is fully immunized and thus without content. In my view that is often a source of self-deception in pure economic thought …

A further possibility for immunizing theories consists in simply leaving open the area of application of the constructed model so that it is impossible to refute it with counter examples. This of course is usually done without a complete knowledge of the fatal consequences of such methodological strategies for the usefulness of the theoretical conception in question, but with the view that this is a characteristic of especially highly developed economic procedures: the thinking in models, which, however, among those theoreticians who cultivate neoclassical thought, in essence amounts to a new form of Platonism.

Quite simply superb

23 Feb, 2014 at 14:56 | Posted in Varia | 1 Comment

 

The housing bubble just keeps growing and growing in Sweden

20 Feb, 2014 at 11:55 | Posted in Economics | 6 Comments

[Chart: index of real Swedish house and condominium prices, 1952-2013]

As Cornucopia shows in the chart above, real condominium prices have risen by almost 900 % over the past 30 years!

If this isn’t a bubble, I don’t know what could be.

But perhaps it’s best to ask L E O Svensson first …

Added: Martin Flodén and Flute also have some thought-provoking observations on the bubble.

[h/t Erik Hegelund]

Macroeconomic challenges

20 Feb, 2014 at 10:15 | Posted in Economics | 8 Comments

In discussing macroeconomics’ Faustian bargain, Simon [Wren-Lewis] asks:

“By putting all our macroeconomic model building eggs in one microfounded basket, have we significantly slowed down the pace at which macroeconomists can say something helpful about the rapidly changing real world?”

Let me deepen this question by pointing to five newish facts about the “real world” which any good, useful macro theory should be compatible with.

1. The unemployed are significantly less happy than those in work. This doesn’t merely provide the justification for an interest in macroeconomics. It also casts grave doubt upon RBC-style theories in which unemployment is voluntary …

2. Price and wage stickiness is over-rated … Price stickiness isn’t universal …

3. The failure of a handful of organizations can have massive macroeconomic consequences … We need models in which micro failures generate macro ones …

4. Supply shocks do happen. It’s improbable that all productivity fluctuations are due merely to labour hoarding in the face of demand shocks …

5. Interactions between agents can magnify fluctuations. We know there are expenditure cascades, which occur because consumers copy other consumers …

These facts are a challenge to both RBC and New Keynesian models. But they have something in common. They stress the heterogeneity of agents … This, I fear, means that the problem with conventional macro isn’t so much its microfoundations per se as the assumption that these microfoundations must consist in representative agents.

Chris Dillow

Yes indeed, the assumption of representative agents is a critical one in modern macroeconomics — as is the insistence on microfoundations.

The purported strength of New Classical and “New Keynesian” macroeconomics is that they have firm anchorage in preference-based microeconomics, and especially in the decisions taken by inter-temporal utility maximizing “forward-looking” individuals.

To some of us, however, this has come at too high a price. The almost quasi-religious insistence that macroeconomics has to have microfoundations – without ever presenting either ontological or epistemological justifications for this claim – has turned a blind eye to the weakness of the whole enterprise of trying to depict a complex economy based on an all-embracing representative actor equipped with superhuman knowledge, forecasting abilities and forward-looking rational expectations. It is as if – after having swallowed the sour grapes of the Sonnenschein-Mantel-Debreu theorem – these economists want to resurrect the omniscient Walrasian auctioneer in the form of all-knowing representative actors equipped with rational expectations and assumed to somehow know the true structure of our model of the world (how that could even be conceivable is beyond my imagination, given that the ongoing debate on microfoundations, if anything, shows that not even we, the economists, can come to agreement on a common model).

Microfoundations are thought to give macroeconomists the means to fully predetermine their models and come up with definitive, robust, stable answers. In reality we know that the forecasts and expectations of individuals often differ systematically from what materializes in the aggregate, since knowledge is imperfect and uncertainty – rather than risk – rules the roost.

And microfoundations allegedly get around the Lucas critique by focusing on “deep”, structural, invariant parameters of optimizing individuals’ preferences and tastes. This is an empty hope without solid empirical or methodological foundation.

The kind of microfoundations that “New Keynesian” and New Classical general equilibrium macroeconomists base their models on is not – at least from a realist point of view – plausible.

Without export certificates, models and theories should be considered unsold. Unfortunately this understanding has not informed modern neoclassical economics, as can be seen in the profuse use of so-called representative-agent models.

A common feature of modern neoclassical macroeconomics is the use of simple (dynamic stochastic) general equilibrium models in which representative actors are supposed to have complete knowledge, and in which transaction costs are zero and markets complete.

In these models, the actors are all identical. Of course, this has far-reaching analytical implications. Situations characterized by asymmetrical information – situations most of us consider to be innumerable – cannot arise in such models. If the aim is to build a macro-analysis from micro-foundations in this manner, the relevance of the procedure is highly questionable (Robert Solow has even considered the claims made by protagonists of rational agent models “generally phony”).

One obvious critique is that representative-agent models do not incorporate distributional effects – effects that often play a decisive role in macroeconomic contexts. Investigations into the operations of markets and institutions usually find that there are overwhelming problems of coordination. These are difficult, not to say impossible, to analyze with the kind of Robinson Crusoe models that, e.g., Real Business Cycle theorists employ, and which exclude precisely those differences between groups of actors that are the driving force in many non-neoclassical analyses.

The choices of different individuals have to be shown to be coordinated and consistent. This is obviously difficult if the macroeconomic models don’t leave room for heterogeneous individuals (this lack of appreciation of the importance of heterogeneity is perhaps especially problematic for the modeling of real business cycles in dynamic stochastic general equilibrium models). Representative-agent models are certainly more manageable; from a realist point of view, however, they are also less relevant and have a lower explanatory potential.
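The distributional point is easy to make concrete. In this minimal sketch (my own, with hypothetical marginal propensities to consume), redistributing a given total income between two households with different MPCs changes aggregate consumption, even though aggregate income, the only thing a representative agent ever sees, is unchanged:

```python
# Two households with different marginal propensities to consume (MPC).
# All numbers are hypothetical, purely for illustration.
def aggregate_consumption(incomes, mpcs):
    return sum(mpc * y for mpc, y in zip(mpcs, incomes))

mpcs = [0.9, 0.5]  # the poorer household spends more of each extra unit

print(aggregate_consumption([30, 70], mpcs))  # 62.0
print(aggregate_consumption([50, 50], mpcs))  # 70.0: same total income (100),
                                              # different aggregate consumption
```

A representative agent endowed with the aggregate income of 100 and any single MPC would, by construction, predict no change at all.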

Both the “Lucas critique” and Keynes’ critique of econometrics showed that it is inadmissible to project history onto the future. Consequently an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different “variables” was not enough. If they could not get at the causal structure that generated the data, they were not really “identified”. Lucas himself drew the conclusion that the solution to the problem of unstable relations was to construct models with clear microfoundations, where forward-looking optimizing individuals and robust, deep, behavioural parameters are seen to be stable even to changes in economic policies.

In microeconomics we know that aggregation really presupposes homothetic and identical preferences, something that almost never exists in real economies. The results given by these assumptions are therefore not robust and do not capture the underlying mechanisms at work in any real economy. And as if this were not enough, there are obvious problems also with the kind of microeconomic equilibrium one tries to reduce macroeconomics to. Decisions of consumption and production are described as choices made by a single agent. But then, who sets the prices on the market? And how do we justify the assumption of universal consistency between the choices?

Models that are critically based on particular and odd assumptions – and are neither robust nor congruent with real-world economies – are of questionable value.

And is it really possible to describe and analyze all the deliberations and choices made by individuals in an economy? Does not the choice of an individual presuppose knowledge and expectations about choices of other individuals? It probably does, and this presumably helps to explain why representative-agent models have become so popular in modern macroeconomic theory. They help to make the analysis more tractable.

One could justifiably argue that one might just as well accept that it is not possible to coherently reduce macro to micro, and accordingly that it is perhaps necessary to forswear microfoundations and the use of rational-agent models altogether. Microeconomic reasoning has to build on macroeconomic presuppositions. Real individuals do not base their choices on operational general equilibrium models, but rather use simpler models. If macroeconomics needs microfoundations, it is equally necessary that microeconomics needs macrofoundations.

The microeconomist Alan Kirman has maintained that the use of representative-agent models is unwarranted and leads to conclusions that are usually both misleading and false. It’s a fiction basically used by some macroeconomists to justify the use of equilibrium analysis and a kind of pseudo-microfoundations. Microeconomists are well aware that the conditions necessary to make aggregation to representative actors possible, are not met in actual economies. As economic models become increasingly complex, their use also becomes less credible.

Even if economies naturally presuppose individuals, it does not follow that we can infer or explain macroeconomic phenomena solely from knowledge of these individuals. Macroeconomics is to a large extent emergent and cannot be reduced to a simple summation of micro-phenomena. Moreover, as we have already argued, even these microfoundations aren’t immutable. Lucas and the New Classical economists’ deep parameters – “tastes” and “technology” – are not really the bedrock of constancy that they believe (pretend) them to be.

In microfounded-rational-expectations-representative-agent macroeconomics the economy is described “as if” consisting of one single agent – either by inflating the optimization problem of the individual to the scale of a whole economy, or by assuming that it’s possible to aggregate different individuals’ actions by a simple summation, since every type of actor is identical.

It would be better to just face the truth — it is impossible to describe interaction and cooperation when there is essentially only one actor.

Economics for Dummiez

20 Feb, 2014 at 09:23 | Posted in Economics | 1 Comment

 

Adam Smith and the “invisible hand”

20 Feb, 2014 at 09:21 | Posted in Economics | Comments Off on Adam Smith and the “invisible hand”

Yesterday Sveriges Radio P1 broadcast the first programme in a series on the history of economics:

Adam Smith made economics a modern science in 1776 with the book The Wealth of Nations.

His thesis is that self-interested action, the so-called “invisible hand”, in the long run benefits society as a whole. Professor Lars Pålsson Syll gives his view of this theory.
