Chicago economics — nothing but pseudo-scientific cheating

1 Apr, 2024 at 11:51 | Posted in Economics | Comments Off on Chicago economics — nothing but pseudo-scientific cheating

Unlike anthropologists … economists simply invent the primitive societies we study, a practice which frees us from limiting ourselves to societies which can be physically visited as sparing us the discomforts of long stays among savages. This method of society-invention is the source of the utopian character of economics; and of the mix of distrust and envy with which we are viewed by our fellow social scientists. The point of studying wholly fictional, rather than actual societies, is that it is relatively inexpensive to subject them to external forces of various types and observe the way they react. If, subjected to forces similar to those acting on actual societies, the artificial society reacts in a similar way, we gain confidence that there are useable connections between the invented society and the one we really care about.

Robert Lucas

Although neither yours truly nor anthropologists will recognise anything in Lucas’ description of economic theory even remotely reminiscent of practices actually used in real sciences, this quote still gives a very good picture of the methodology used by Lucas and other prominent Chicago economists.

All empirical sciences use simplifying or unrealistic assumptions in their modelling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

The implications that follow from the kind of models that people like Robert Lucas — according to Ed Prescott, ‘the master of methodology’ — construct are always conditional on the simplifying assumptions used — assumptions predominantly of a rather far-reaching and non-empirical character with little resemblance to features of the real world. From a descriptive point of view, there is a fortiori usually very little resemblance between the models used and the empirical world. ‘As if’ explanations built on such foundations are not really explanations at all, since they always build conditionally on hypothesized law-like theorems and situation-specific restrictive assumptions. The empirical-descriptive inaccuracy of the models makes it more or less miraculous if they should — in any substantive way — be able to be considered explanative at all. If the assumptions made are known to be descriptively totally unrealistic (think of e.g. ‘rational expectations’), they are of course likewise totally worthless for making empirical inductions. Assuming — as Lucas does — that people behave ‘as if’ they were rational FORTRAN-programmed computers doesn’t take us far when we know that the ‘if’ is false.

The obvious shortcoming of a basically epistemic — rather than ontological — approach such as ‘successive approximations’ and ‘as if’ modelling assumptions, is that ‘similarity’, ‘analogy’ or ‘resemblance’ tout court do not guarantee that the correspondence between model and target is interesting, relevant, revealing or somehow adequate in terms of mechanisms, causal powers, capacities or tendencies. No matter how many convoluted refinements of concepts are made in the model, if the successive ‘as if’ approximations do not result in models similar to reality in the appropriate respects (such as structure, isomorphism, etc.), they are nothing more than ‘substitute systems’ that do not bridge to the world but rather miss their target.

Economics building on the kind of modelling strategy that Lucas represents does not produce science.

It’s nothing but pseudo-scientific cheating.

Contrary to what some überimpressed macroeconomists seem to argue, I would say that the recent economic crises, and the fact that Chicago economics has had next to nothing to contribute to understanding them, show that Lucas and his New Classical economics — in Lakatosian terms — constitute a degenerative research program in dire need of replacement.

Mainstream economic theory has long been in the story-telling business whereby economic theorists create make-believe analogue models of the target system – usually conceived as the real economic system. This modelling activity is considered useful and essential. Since fully-fledged experiments on a societal scale as a rule are prohibitively expensive, ethically indefensible or unmanageable, economic theorists have to substitute experimenting with something else. To understand and explain relations between different entities in the real economy the predominant strategy is to build models and make things happen in these “analogue-economy models” rather than engineering things happening in real economies.

But how do we bridge the gulf between the model and the ‘target system’? According to Lucas, we have to be willing to “argue by analogy from what we know about one situation to what we would like to know about another, quite different situation” [Lucas 1988:5]. Progress lies in the pursuit of the ambition to “tell better and better stories” [Lucas 1988:5], simply because that is what economists do.

We are storytellers, operating much of the time in worlds of make believe. We do not find that the realm of imagination and ideas is an alternative to, or retreat from, practical reality. On the contrary, it is the only way we have found to think seriously about reality. In a way, there is nothing more to this method than maintaining the conviction … that imagination and ideas matter … there is no practical alternative [Lucas 1988:6].

Lucas applies this mode of theorizing by constructing “make-believe economic systems” to the age-old question of what causes and constitutes business cycles. According to Lucas the standard for what that means is that one “exhibits understanding of business cycles by constructing a model in the most literal sense: a fully articulated artificial economy, which behaves through time so as to imitate closely the time series behavior of actual economies” [Lucas 1981:219].

The development of macro-econometrics has according to Lucas supplied economists with “detailed, quantitatively accurate replicas of the actual economy” thereby enabling us to treat policy recommendations “as though they had been experimentally tested” [Lucas 1981:220]. But if the goal of theory is to be able to make accurate forecasts this “ability of a model to imitate actual behavior” does not give much leverage. What is required is “invariance of the structure of the model under policy variations”. Parametric invariance in an economic model cannot be taken for granted, “but it seems reasonable to hope that neither tastes nor technology vary systematically” [Lucas 1981:220].

The model should enable us to pose counterfactual questions about what would happen if some variable were to change in a specific way. Hence the assumption of structural invariance, which purportedly enables the theoretical economist to do just that. But does it? Lucas appeals to “reasonable hope”, a rather weak justification for a modeller to apply such a far-reaching assumption. To warrant it one would expect an argumentation that this assumption – whether we conceive of it as part of a strategy of “isolation”, “idealization” or “successive approximation” – really establishes a useful relation that we can export or bridge to the target system, the “actual economy.” That argumentation is found neither in Lucas nor – to my knowledge – in the succeeding neoclassical refinements of his “necessarily artificial, abstract, patently ‘unreal’” analogue economies [Lucas 1981:271]. At most, we get what Lucas himself calls “inappropriately maligned” casual empiricism in the form of “the method of keeping one’s eyes open.” That is far from sufficient to warrant any credibility in a model pretending to explain the complex and difficult recurrent phenomena we call business cycles. Providing an empirical “illustration” or a “story” to back up your model does not suffice. There are simply too many competing illustrations and stories that could be exhibited or told.

Applying a “Lucas critique” to Lucas’s own model, it is obvious that it too fails. Changing “policy rules” cannot just be presumed not to influence investment and consumption behaviour and, a fortiori, technology, thereby contradicting the invariance assumption. Technology and tastes cannot live up to the status of an economy’s deep and structurally stable Holy Grail. They too are part and parcel of an ever-changing and open economy. Lucas’ hope of being able to model the economy as “a FORTRAN program” and “gain some confidence that the component parts of the program are in some sense reliable prior to running it” [Lucas 1981:288] therefore seems – from an ontological point of view – totally misdirected. The failure in the attempt to anchor the analysis in the alleged stable deep parameters ‘tastes’ and ‘technology’ shows that if you neglect ontological considerations pertaining to the target system, ultimately reality kicks back when at last questions of bridging and exportation of model exercises are laid on the table. No matter how precise and rigorous the analysis is, and no matter how hard one tries to cast the argument in “modern mathematical form” [Lucas 1981:7], it does not push science forward one single millimetre if it does not stand the acid test of relevance to the target.

References

Lucas, Robert (1981), Studies in Business-Cycle Theory. Oxford: Basil Blackwell.

– (1988), What Economists Do.

Why the euro has to be abandoned

31 Mar, 2024 at 10:16 | Posted in Economics | Comments Off on Why the euro has to be abandoned

The euro has taken away the possibility for national governments to manage their economies in a meaningful way — and the people have had to pay the true costs of its concomitant misguided austerity policies.

The unfolding of the repeated economic crises in Euroland during the last decade has shown beyond any doubt that the euro is not only an economic project but just as much a political one. What the neoliberal revolution during the 1980s and 1990s didn’t manage to accomplish, the euro shall now force on us.

But do the peoples of Europe really want to deprive themselves of economic autonomy, enforce lower wages, and slash social welfare at the slightest sign of economic distress? Are increasing income inequality and a federal überstate really the stuff that our dreams are made of? Yours truly doubts it.

History ought to act as a deterrent. During the 1930s our economies didn’t come out of the depression until the folly of that time — the gold standard — was thrown on the dustbin of history. The euro will hopefully soon join it.

Economists have a tendency to get enthralled by their theories and models and forget that behind the figures and abstractions, there is a real world with real people. Real people that have to pay dearly for fundamentally flawed doctrines and recommendations.

Now more than ever there is a grotesque gap between capitalism’s intensifying reproduction problems and the collective energy needed to resolve them … This may mean that there is no guarantee that the people who have been so kind as to present us with the euro will be able to protect us from its consequences, or will even make a serious attempt to do so. The sorcerer’s apprentices will be unable to let go of the broom with which they aimed to cleanse Europe of its pre-modern social and anti-capitalist foibles, for the sake of a neoliberal transformation of its capitalism. The most plausible scenario for the Europe of the near and not-so-near future is one of growing economic disparities—and of increasing political and cultural hostility between its peoples, as they find themselves flanked by technocratic attempts to undermine democracy on the one side, and the rise of new nationalist parties on the other. These will seize the opportunity to declare themselves the authentic champions of the growing number of so-called losers of modernization, who feel they have been abandoned by a social democracy that has embraced the market and globalization.

Wolfgang Streeck

What’s the use of economics?

26 Mar, 2024 at 13:17 | Posted in Economics | 5 Comments

The simple question that was raised during a recent conference … was to what extent has — or should — the teaching of economics be modified … The simple answer is that the economics profession is unlikely to change. Why would economists be willing to give up much of their human capital, painstakingly nurtured for over two centuries? For macroeconomists in particular, the reaction has been to suggest that modifications of existing models to take account of ‘frictions’ or ‘imperfections’ will be enough to account for the current evolution of the world economy. The idea is that once students have understood the basics, they can be introduced to these modifications …

I would go further; rather than making steady progress towards explaining economic phenomena professional economists have been locked into a narrow vision of the economy. We constantly make more and more sophisticated models within that vision until, as Bob Solow put it, “the uninitiated peasant is left wondering what planet he or she is on” …

Every student in economics is faced with the model of the isolated optimising individual who makes his choices within the constraints imposed by the market. Somehow, the axioms of rationality imposed on this individual are not very convincing, particularly to first-time students. But the student is told that the aim of the exercise is to show that there is an equilibrium, there can be prices that will clear all markets simultaneously. And, furthermore, the student is taught that such an equilibrium has desirable welfare properties. Importantly, the student is told that since the 1970s it has been known that whilst such a system of equilibrium prices may exist, we cannot show that the economy would ever reach an equilibrium nor that such an equilibrium is unique.

The student then moves on to macroeconomics and is told that the aggregate economy or market behaves just like the average individual she has just studied. She is not told that these general models in fact poorly reflect reality. For the macroeconomist, this is a boon since he can now analyse the aggregate allocations in an economy as though they were the result of the rational choices made by one individual. The student may find this even more difficult to swallow when she is aware that peoples’ preferences, choices and forecasts are often influenced by those of the other participants in the economy. Students take a long time to accept the idea that the economy’s choices can be assimilated to those of one individual.

Alan Kirman What’s the use of economics?

An economic theory that does not go beyond proving theorems and conditional ‘if-then’ statements — and does not make assertions and put forward hypotheses about real-world individuals and institutions — is of little consequence for anyone wanting to use theories to better understand, explain or predict real-world phenomena.

Building theories and models on patently ridiculous assumptions we know people never conform to does not deliver real science. Real and reasonable people have no reason to believe in ‘as-if’ models of ‘rational’ robot imitations acting and deciding in a Walt Disney world characterised by ‘common knowledge,’ ‘full information,’ ‘rational expectations,’ zero transaction costs, given stochastic probability distributions, risk-reduced genuine uncertainty, and other laughable nonsense assumptions of the same ilk. Science fiction is not science.

For decades now, economics students have been complaining about the way economics is taught. Their complaints are justified. Force-feeding young and open-minded people with unverified and useless theories and models cannot be the right way to develop a relevant and realist economic science.

Much work done in mainstream theoretical economics is devoid of any explanatory interest. And not only that. Seen from a strictly scientific point of view, it has no value at all. It is a waste of time. And as so many have been experiencing in modern times of austerity policies and market fundamentalism — a very harmful waste of time.

Twenty fallacies of modern economics

23 Mar, 2024 at 16:56 | Posted in Economics | Comments Off on Twenty fallacies of modern economics

11) To criticise/oppose the current mathematical modelling emphasis is to adopt an antiscience stance.

It is not. Mathematics is not essential (or inessential) to science; science involves using tools that are appropriate to the given task. A science of economics is perfectly feasible, and the current emphasis on mathematical modelling in economics serves, given the nature of social reality, mostly to prevent that potential from being realised.

14) Methods of mathematical modelling are, even if unnecessary, used in a neutral fashion, serving as just another language or heuristic device.

They are not used in a neutral fashion. They are tools. And like all tools they are appropriate for some tasks and conditions and not others. In certain contexts tools used inappropriately can be positively harmful. This has been (and is usually) the case with the application of mathematical methods in economics. It has forced the discipline into irrelevancy at best, whilst diverting resources away from potentially insightful alternative projects and applications. The claim that the mathematical methods adopted by economists are, or might conceivably be, employed as useful heuristic devices, serves, in the main, merely as an apology for this unhappy affair.

15) Thought-to-be false assumptions and questionable modelling methods are justified and so useable if/where they generate agreeable conclusions, or anyway conclusions held to be true.

This is incorrect, though seemingly widely believed even, or perhaps especially, amongst heterodox economists critical of the mainstream. That is, heterodox economists frequently suppose that although their modelling assumptions are (necessarily) false, their models are better (than those of their opponents) because the conclusions generated are held to be true. It may be true that ‘all polar bears are white’. But if this apparent truth is deductively generated from the assumptions that ‘all polar bears eat snow’ and ‘all snow-eaters are white’, we have added nothing to our understanding of polar bears, snow or whiteness; and nor have we provided explanatory support for the proposition that ‘all polar bears are white’. All deductive exercises that are so based on known absurd fictions, and this inevitably includes almost all mathematical modelling exercises in modern economics, are just as pointless. Certainly they add little to our understanding of social reality.

Tony Lawson

The overarching flaw with the economic approach using methodological individualism and rational choice theory is basically that it reduces social explanations to purportedly individual characteristics. But many of the characteristics and actions of the individual originate in and are made possible only through society and its relations. Society is not a Wittgensteinian ‘Tractatus-world’ characterized by atomistic states of affairs. Society is not reducible to individuals, since the social characteristics, forces, and actions of the individual are determined by pre-existing social structures and positions. Even though society is not a volitional individual, and the individual is not an entity given outside of society, the individual (actor) and the society (structure) have to be kept analytically distinct. They are tied together through the individual’s reproduction and transformation of already given social structures.

Since at least the marginal revolution in economics in the 1870s, it has been an essential feature of economics to ‘analytically’ treat individuals as essentially independent and separate entities of action and decision. But, really, in such a complex, organic and evolutionary system as an economy, that kind of independence is a deeply unrealistic assumption to make. Simply assuming that there is strict independence between the variables we try to analyze doesn’t help us in the least if that hypothesis turns out to be unwarranted.

To be able to apply the ‘analytical’ approach, economists have to basically assume that the universe consists of ‘atoms’ that exercise their own separate and invariable effects in such a way that the whole consists of nothing but an addition of these separate atoms and their changes. These simplistic assumptions of isolation, atomicity, and additivity are, however, at odds with reality. In real-world settings, we know that the ever-changing contexts make it futile to search for knowledge by making such reductionist assumptions. Real-world individuals are not reducible to contentless atoms and so not susceptible to atomistic analysis. The world is not reducible to a set of atomistic ‘individuals’ and ‘states.’ How variable X works and influences real-world economies in situation A cannot simply be assumed to be understood or explained by looking at how X works in situation B. Knowledge of X probably does not tell us much if we do not take into consideration how it depends on Y and Z. It can never be legitimate just to assume that the world is ‘atomistic.’ Assuming real-world additivity cannot be the right thing to do if the things we have around us rather than being ‘atoms’ are ‘organic’ entities.

If we want to develop new and better economics we have to give up on the single-minded insistence on using a deductivist straitjacket methodology and the ‘analytical’ method. To focus scientific endeavours on proving things in models is a gross misapprehension of the purpose of economic theory. Deductivist models and ‘analytical’ methods disconnected from reality are not relevant to predicting, explaining or understanding real-world economies.

To have ‘consistent’ models and ‘valid’ evidence is not enough. What economics needs are real-world relevant models and sound evidence. Aiming only for ‘consistency’ and ‘validity’ is setting the aspiration level of economics too low for developing a realist and relevant science.

Economics is not mathematics or logic. It’s about society. The real world.

Models may help us think through problems. But we should never forget that the formalism we use in our models is not self-evidently transportable to a largely unknown and uncertain reality. The tragedy with mainstream economic theory is that it thinks that the logic and mathematics used are sufficient for dealing with our real-world problems. They are not! Model deductions based on questionable assumptions can never be anything but pure exercises in hypothetical reasoning.

The world in which we live is inherently uncertain and quantifiable probabilities are the exception rather than the rule. To every statement about it is attached a ‘weight of argument’ that makes it impossible to reduce our beliefs and expectations to a one-dimensional stochastic probability distribution. If “God does not play dice” as Einstein maintained, I would add “nor do people.” The world as we know it has limited scope for certainty and perfect knowledge. Its intrinsic and almost unlimited complexity and the interrelatedness of its organic parts prevent the possibility of treating it as constituted by ‘legal atoms’ with discretely distinct, separable and stable causal relations. Our knowledge accordingly has to be of a rather fallible kind.

If the real world is fuzzy, vague and indeterminate, then why should our models build upon a desire to describe it as precise and predictable? Even if there always has to be a trade-off between theory-internal validity and external validity, we have to ask ourselves if our models are relevant.

‘Human logic’ has to supplant the classical — formal — logic of deductivism if we want to have anything of interest to say of the real world we inhabit. Logic is a marvellous tool in mathematics and axiomatic-deductivist systems, but a poor guide for action in real-world systems, in which concepts and entities are without clear boundaries and continually interact and overlap. In this world, I would say we are better served with a methodology that takes into account that the more we know, the more we know we do not know.

Clara Mattei and austerity policy

22 Mar, 2024 at 14:58 | Posted in Economics | Comments Off on Clara Mattei and austerity policy

In this week’s episode of Starta Pressarna, Clara E. Mattei — author of the book Kapitalets ordning: Hur ekonomer skapade åtstramningsdoktrinen och banade väg för fascismen (Verbal förlag, 2023) — is interviewed, and her thesis that austerity policy has historically emerged as a way of holding back the working class is discussed. As always on this podcast — both interesting and thought-provoking!

For many conservative and neoliberal politicians and economists, there seems to be a spectre haunting the United States and Europe today — Keynesian ideas about governments pursuing policies to increase effective demand and support employment. Among the favourite arguments used by these Keynesophobes to fight this spectre are the ‘doctrine of sound finance’ and the need for austerity.

Is this fight against economic common sense anything new? Not at all. If today’s mainstream economists did not have such an almost non-existent knowledge of the history of economic thought, they would surely have come across Michal Kalecki’s classic article from 1943 (which basically gives the same answer to the questions posed by myself — and Clara Mattei):

It should be first stated that, although most economists now agree that full employment may be achieved by government spending, this was by no means the case even in the recent past. Among the opposers of this doctrine there were (and still are) prominent so-called ‘economic experts’ closely connected with banking and industry. This suggests that there is a political background in the opposition to the full employment doctrine, even though the arguments advanced are economic. That is not to say that people who advance them do not believe in their economics, poor though this is. But obstinate ignorance is usually a manifestation of underlying political motives …

This gives the capitalists a powerful indirect control over government policy: everything which may shake the state of confidence must be carefully avoided because it would cause an economic crisis. But once the government learns the trick of increasing employment by its own purchases, this powerful controlling device loses its effectiveness. Hence budget deficits necessary to carry out government intervention must be regarded as perilous. The social function of the doctrine of ‘sound finance’ is to make the level of employment dependent on the state of confidence.

Michal Kalecki Political aspects of full employment

Clara Mattei’s Kapitalets ordning is a book that I in many ways appreciate as a critical study of the historical background of austerity policies in Britain and Italy. When it comes to the picture of Keynes she wants to convey, however, I find it rather misleading on some quite decisive points. That said, the book is nonetheless an exceptionally well-documented and readable critique of ideas and ideologies that unfortunately still haunt our world today.

Methodology — the main problem with mainstream economics

20 Mar, 2024 at 18:36 | Posted in Economics | Comments Off on Methodology — the main problem with mainstream economics

The basic problems mostly originate at the level of methodology, and in particular with the current emphasis on methods of mathematical modelling. The latter emphasis is an error given the lack of match of the methods in question to the conditions in which they are applied. So long as the critical focus remains only, or even mainly or centrally, at the level of substantive economic theory and/or policy matters, then no amount of alternative text books, popular monographs, introductory pocketbooks, journal and magazine articles, newspaper columns, blogs, even student protests, petitions, and ‘reclaiming economics’ campaigns and events, new institutes and/or centres, alternative programmes, conferences, workshops, plenary speeches, videos, comic strips, or whatever, are going to get at the nub of the problems and so have the wherewithal to help make economics a sufficiently relevant discipline. It is the methods and the manner of their usage that are the basic problem.

The point is simply that all methods are appropriate under some conditions but not others. Hammers and pens have their uses. But if the task at hand is, say, to mow the lawn neither a hammer nor a pen is likely to be up to the job. Similarly the sorts of mathematical methods economists insist upon have their uses. But social analysis is not one of them. This is because the methods in question, to be successful, require closed systems, i.e. those in which correlations occur, where the guaranteeing of the latter closures requires worlds of isolated atoms …

It is easy enough to demonstrate that social reality is not like this. Rather social reality is generally open, with everything (from social structures to embodied personalities) in process, being transformed through human practice (thus undermining atomism), with all aspects constituted in relation to (and not merely linked to and certainly not organised independently of) each other (thus undermining any requirement of isolationism).

Tony Lawson

Many economists have over time tried to diagnose the problem behind the ‘intellectual poverty’ that characterizes modern mainstream economics. The questionable reduction of uncertainty into probabilistic risk, rationality postulates, rational expectations, market fundamentalism, general equilibrium, atomism, and over-mathematisation are some of the things that have been pointed to. But although these assumptions/axioms/practices are deeply problematic, they are mainly reflections of a deeper and more fundamental problem.

The main problem with mainstream economics is its methodology.

The fixation on constructing models showing the certainty of logical entailment has been detrimental to the development of relevant and realist economics. Insisting on formalistic (mathematical) modelling forces the economist to give up on realism and substitute axiomatics for real-world relevance. The price for rigour and precision is far too high for anyone who is ultimately interested in using economics to pose and answer real-world questions and problems.

This deductivist orientation is, as argued by Lawson, the main reason behind the difficulty that mainstream economics has in terms of understanding, explaining and predicting what takes place in our societies. But it has also given mainstream economics much of its discursive power — at least as long as no one starts asking tough questions on the veracity of — and justification for — the assumptions on which the deductivist foundation is erected. Asking these questions is an important ingredient in a sustained critical effort to show how nonsensical the embellishing of a smorgasbord of models founded on wanting (often hidden) methodological foundations is.

The mathematical-deductivist straitjacket used in mainstream economics presupposes atomistic closed systems — i.e., something that we find very little of in the real world, a world significantly at odds with an (implicitly) assumed logic world where deductive entailment rules the roost. Ultimately then, the failings of modern mainstream economics have their roots in a deficient ontology. The kind of formal-analytical and axiomatic-deductive mathematical modelling that makes up the core of mainstream economics is hard to make compatible with a real-world ontology. It is also the reason why so many critics find mainstream economic analysis patently and utterly unrealistic and irrelevant.

Although there has been a clearly discernible increase and focus on “empirical” economics in recent decades, the results in these research fields have not fundamentally challenged the main deductivist direction of mainstream economics. They are still mainly framed and interpreted within the core ‘axiomatic’ assumptions of individualism, instrumentalism and equilibrium that make up even the ‘new’ mainstream economics. Although, perhaps, a sign of an increasing — but highly path-dependent — theoretical pluralism, mainstream economics is still, from a methodological point of view, mainly a deductive project erected on a foundation of empty formalism.

If we want theories and models to confront reality there are obvious limits to what can be said ‘rigorously’ in economics. Although it is generally a good aspiration to search for scientific claims that are both rigorous and precise, we have to accept that the chosen level of precision and rigour must be relative to the subject matter studied. An economics that is relevant to the world in which we live can never achieve the same degree of rigour and precision as in logic, mathematics or the natural sciences. Collapsing the gap between model and reality in that way will never give anything other than empty formalist economics.

In mainstream economics, with its addiction to the deductivist approach of formal-mathematical modelling, model consistency trumps coherence with the real world. That is surely getting the priorities wrong. Creating models for their own sake is not an acceptable scientific aspiration — impressive-looking formal-deductive models should never be mistaken for truth. It is still a fact that within mainstream economics internal validity is everything and external validity nothing. Why anyone should be interested in those kinds of theories and models is beyond imagination. As long as mainstream economists do not come up with any export licenses for their theories and models to the real world in which we live, they really should not be surprised if people say that this is not science.

When applying deductivist thinking to economics, economists usually set up “as if” models based on a set of tight axiomatic assumptions from which consistent and precise inferences are made. The beauty of this procedure is of course that if the axiomatic premises are true, the conclusions necessarily follow. The snag is that if the models are to be relevant, we also have to argue that their precision and rigour still hold when they are applied to real-world situations. They often don’t. When addressing real economies, the idealizations necessary for the deductivist machinery to work, simply don’t hold.

So how should we evaluate the search for ever greater precision and the concomitant arsenal of mathematical and formalist models? To a large extent, the answer hinges on what we want our models to perform and how we basically understand the world.

To search for precision and rigour in such an open and uncertain world is self-defeating, at least if precision and rigour are supposed to assure external validity. The only way to defend such an endeavour is to turn a blind eye to ontology and restrict oneself to proving things in closed model worlds. Why we should care about these closed worlds and not ask questions of relevance is hard to see. We have to at least justify our disregard for the gap between the nature of the real world and our theories and models of it.

If economics is going to be useful, it has to change its methodology. Economists have to get out of their deductivist theoretical ivory towers and start asking questions about the real world. A relevant economic science presupposes adopting methods suitable to the object it is supposed to predict, explain or understand.

Top 25 Heterodox Economics Books

19 Mar, 2024 at 22:21 | Posted in Economics | Comments Off on Top 25 Heterodox Economics Books
  • Karl Marx, Das Kapital (1867)
  • Thorstein Veblen, The Theory of the Leisure Class (1899)
  • Joseph Schumpeter, The Theory of Economic Development (1911)
  • Nikolai Kondratiev, The Major Economic Cycles (1925)
  • Gunnar Myrdal, The Political Element in the Development of Economic Theory (1930)
  • John Maynard Keynes, The General Theory (1936)
  • Karl Polanyi, The Great Transformation (1944)
  • Paul Sweezy, Theory of Capitalist Development (1956)
  • Joan Robinson, Accumulation of Capital (1956)
  • John Kenneth Galbraith, The Affluent Society (1958)
  • Piero Sraffa, Production of Commodities by Means of Commodities (1960)
  • Johan Åkerman, Theory of Industrialism (1961)
  • Nicholas Georgescu-Roegen, The Entropy Law and the Economic Process (1971)
  • Michal Kalecki, Selected Essays on the Dynamics of the Capitalist Economy (1971)
  • Paul Davidson, Money and the Real World (1972)
  • Hyman Minsky, John Maynard Keynes (1975)
  • Charles P Kindleberger, Manias, Panics and Crashes (1976)
  • Geoff Hodgson, Economics and Institutions (1988)
  • Philip Mirowski, More Heat than Light (1989)
  • Tony Lawson, Economics and Reality (1997)
  • Steve Keen, Debunking Economics (2001)
  • L Randall Wray, Modern Money Theory  (2012)
  • Thomas Piketty, Capital in the Twenty-First Century (2014)
  • Anwar Shaikh, Capitalism: competition, conflict, crises (2016)
  • Stephanie Kelton, The Deficit Myth (2020)

Keynes and Ramsey on probability

19 Mar, 2024 at 14:10 | Posted in Economics | Comments Off on Keynes and Ramsey on probability


Although Blackburn on the whole gives a succinct and correct picture of Keynes’s view on probability, I think it’s necessary to somewhat qualify in what way and to what extent Keynes “lost” the debate with Frank Ramsey.

In economics, it’s an indubitable fact that few mainstream neoclassical economists work within the Keynesian paradigm. All more or less subscribe to some variant of Bayesianism. And some even say that Keynes acknowledged he was wrong when presented with Ramsey’s theory. This is a view that has unfortunately also been promulgated by Robert Skidelsky in his otherwise masterly biography of Keynes. But I think it’s fundamentally wrong. Let me elaborate on this point (the argumentation is more fully presented in my book John Maynard Keynes (SNS, 2007)).

It’s a debated issue in newer research on Keynes whether he, as some researchers maintain, fundamentally changed his view on probability after the critique levelled against his A Treatise on Probability by Frank Ramsey. It has been exceedingly difficult to present evidence for this being the case.

Ramsey’s critique was mainly that the kind of probability relations that Keynes was speaking of in the Treatise didn’t actually exist and that Ramsey’s own procedure (betting) made it much easier to find out the ‘degrees of belief’ people actually held. I question this both from a descriptive and a normative point of view.

In his response to Ramsey, Keynes says only that Ramsey ‘is right’ in that people’s ‘degrees of belief’ basically emanate from human nature rather than from formal logic.

Patrick Maher, former professor of philosophy at the University of Illinois, even suggests that Ramsey’s critique of Keynes’s probability theory in some regards is invalid:

Keynes’s book was sharply criticized by Ramsey. In a passage that continues to be quoted approvingly, Ramsey wrote:

“But let us now return to a more fundamental criticism of Mr. Keynes’ views, which is the obvious one that there really do not seem to be any such things as the probability relations he describes. He supposes that, at any rate in certain cases, they can be perceived; but speaking for myself I feel confident that this is not true. I do not perceive them, and if I am to be persuaded that they exist it must be by argument; moreover, I shrewdly suspect that others do not perceive them either, because they are able to come to so very little agreement as to which of them relates any two given propositions.” (Ramsey 1926, 161)

I agree with Keynes that inductive probabilities exist and we sometimes know their values. The passage I have just quoted from Ramsey suggests the following argument against the existence of inductive probabilities. (Here P is a premise and C is the conclusion.)

P: People are able to come to very little agreement about inductive probabilities.
C: Inductive probabilities do not exist.

P is vague (what counts as “very little agreement”?) but its truth is still questionable. Ramsey himself acknowledged that “about some particular cases there is agreement” (28) … In any case, whether complicated or not, there is more agreement about inductive probabilities than P suggests …

I have been evaluating Ramsey’s apparent argument from P to C. So far I have been arguing that P is false and responding to Ramsey’s objections to unmeasurable probabilities. Now I want to note that the argument is also invalid. Even if P were true, it could be that inductive probabilities exist in the (few) cases that people generally agree about. It could also be that the disagreement is due to some people misapplying the concept of inductive probability in cases where inductive probabilities do exist. Hence it is possible for P to be true and C false …

I conclude that Ramsey gave no good reason to doubt that inductive probabilities exist.

Ramsey’s critique made Keynes put stronger emphasis on individuals’ own views as the basis for probability calculations, and less stress on their beliefs being rational. But Keynes’s theory doesn’t stand or fall with his view of the basis for our ‘degrees of belief’ as logical. The core of his theory — when and how we can measure and compare different probabilities — he doesn’t change. Unlike Ramsey, he wasn’t at all sure that probabilities were always one-dimensional, measurable, quantifiable or even comparable entities.

Austrian economics — a methodological critique

17 Mar, 2024 at 14:47 | Posted in Economics | Comments Off on Austrian economics — a methodological critique


This is a fair presentation and critique of Austrian methodology.

But beware!

In theoretical and methodological questions it is not always either-or. We have to be open-minded and pluralistic enough not to throw out the baby with the bath water — and thereby fail to secure insights like this:

What is the problem we wish to solve when we try to construct a rational economic order? … If we possess all the relevant information, if we can start out from a given system of preferences, and if we command complete knowledge of available means, the problem which remains is purely one of logic …

This, however, is emphatically not the economic problem which society faces … The peculiar character of the problem of a rational economic order is determined precisely by the fact that the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess. The economic problem of society is … a problem of the utilization of knowledge which is not given to anyone in its totality.

This character of the fundamental problem has, I am afraid, been obscured rather than illuminated by many of the recent refinements of economic theory … Many of the current disputes with regard to both economic theory and economic policy have their common origin in a misconception about the nature of the economic problem of society. This misconception in turn is due to an erroneous transfer to social phenomena of the habits of thought we have developed in dealing with the phenomena of nature …

To assume all the knowledge to be given to a single mind in the same manner in which we assume it to be given to us as the explaining economists is to assume the problem away and to disregard everything that is important and significant in the real world.

Compare this relevant and realist wisdom with the rational expectations hypothesis (REH) used by almost all mainstream macroeconomists today. REH presupposes — basically for reasons of consistency — that agents have complete knowledge of all of the relevant probability distribution functions. When one tries to incorporate learning in these models — trying to take the heat off some of the criticism levelled against the hypothesis to date — it is always a very restricted kind of learning that is considered: a learning in which truly unanticipated, surprising, new things never take place, but only rather mechanical updatings of existing probability functions, increasing the precision of already existing information sets.

Nothing really new happens in these ergodic models, where the statistical representation of learning and information is nothing more than a caricature of what takes place in the real-world target system. This follows from taking for granted that people’s decisions can be portrayed as based on an existing probability distribution, which by definition implies knowledge of every possible event that can be thought of as taking place (otherwise it is, in a strict mathematical-statistical sense, not really a probability distribution).
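To make the point concrete, here is a minimal sketch in Python (an illustrative toy of my own, with made-up numbers, not anything taken from the rational expectations literature) of what this kind of ‘mechanical updating’ amounts to. The agent’s beliefs are a Beta distribution over an already fixed event space, and ‘learning’ can only sharpen those beliefs; an event outside the pre-specified support has probability zero by construction and stays that way.

```python
import numpy as np

# Hypothetical illustration: the agent's beliefs about the probability of a
# 'boom' are a Beta(a, b) distribution over a FIXED event space {boom, bust}.
# 'Learning' here is nothing but Bayesian updating of (a, b).
a, b = 2.0, 2.0                            # diffuse prior with mean 0.5

rng = np.random.default_rng(seed=1)
data = rng.binomial(1, 0.7, size=200)      # observed booms (1) and busts (0)

for x in data:
    a += x                                 # each observation merely re-weights
    b += 1 - x                             # the same two pre-listed events

mean = a / (a + b)
sd = np.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
print(f"posterior mean {mean:.3f}, posterior sd {sd:.3f}")
# The posterior becomes ever more precise, but nothing genuinely new can
# enter: a 'surprise' outside the original support cannot be learned at all.
```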

The rational expectations hypothesis presumes consistent behaviour, where expectations do not display any persistent errors. In the world of rational expectations, we are always, on average, hitting the bull’s eye. In the more realistic, open systems view, there is always the possibility (danger) of making mistakes that may turn out to be systematic. It is because of this, presumably, that we put so much emphasis on learning in our modern knowledge societies.

As Hayek wrote:

When it comes to the point where [equilibrium analysis] misleads some of our leading thinkers into believing that the situation which it describes has direct relevance to the solution of practical problems, it is high time that we remember that it does not deal with the social process at all and that it is no more than a useful preliminary to the study of the main problem.

Angus Deaton rethinking economics

14 Mar, 2024 at 14:49 | Posted in Economics | 1 Comment

Like many others, I have recently found myself changing my mind, a discomfiting process for someone who has been a practicing economist for more than half a century. I will come to some of the substantive topics, but I start with some general failings. I do not include the corruption allegations that have become common in some debates. Even so, economists, who have prospered mightily over the past half century, might fairly be accused of having a vested interest in capitalism as it currently operates. I should also say that I am writing about a (perhaps nebulous) mainstream, and that there are many nonmainstream economists.

  • Power: Our emphasis on the virtues of free, competitive markets and exogenous technical change can distract us from the importance of power in setting prices and wages, in choosing the direction of technical change, and in influencing politics to change the rules of the game. Without an analysis of power, it is hard to understand inequality or much else in modern capitalism.
  • Philosophy and ethics: In contrast to economists from Adam Smith and Karl Marx through John Maynard Keynes, Friedrich Hayek, and even Milton Friedman, we have largely stopped thinking about ethics and about what constitutes human well-being. We are technocrats who focus on efficiency. We get little training about the ends of economics, on the meaning of well-being—welfare economics has long since vanished from the curriculum—or on what philosophers say about equality. When pressed, we usually fall back on an income-based utilitarianism. We often equate well-being with money or consumption, missing much of what matters to people. In current economic thinking, individuals matter much more than relationships between people in families or in communities.
  • Efficiency is important, but we valorize it over other ends. Many subscribe to Lionel Robbins’ definition of economics as the allocation of scarce resources among competing ends or to the stronger version that says that economists should focus on efficiency and leave equity to others, to politicians or administrators. But the others regularly fail to materialize, so that when efficiency comes with upward redistribution—frequently though not inevitably—our recommendations become little more than a license for plunder. Keynes wrote that the problem of economics is to reconcile economic efficiency, social justice, and individual liberty. We are good at the first, and the libertarian streak in economics constantly pushes the last, but social justice can be an afterthought. After economists on the left bought into the Chicago School’s deference to markets—“we are all Friedmanites now”—social justice became subservient to markets, and a concern with distribution was overruled by attention to the average, often nonsensically described as the “national interest.”
  • Empirical methods: The credibility revolution in econometrics was an understandable reaction to the identification of causal mechanisms by assertion, often controversial and sometimes incredible. But the currently approved methods, randomized controlled trials, differences in differences, or regression discontinuity designs, have the effect of focusing attention on local effects, and away from potentially important but slow-acting mechanisms that operate with long and variable lags. Historians, who understand about contingency and about multiple and multidirectional causality, often do a better job than economists of identifying important mechanisms that are plausible, interesting, and worth thinking about, even if they do not meet the inferential standards of contemporary applied economics.
  • Humility: We are often too sure that we are right. Economics has powerful tools that can provide clear-cut answers, but that require assumptions that are not valid under all circumstances. It would be good to recognize that there are almost always competing accounts and learn how to choose between them …

Economists could benefit by greater engagement with the ideas of philosophers, historians, and sociologists, just as Adam Smith once did. The philosophers, historians, and sociologists would likely benefit too.

Angus Deaton

A great article by a great economist!

Yours truly basically agrees with Deaton’s criticism of the general shortcomings of mainstream economics, but let me still comment on the specific criticism of ’empirical methods’.

In mainstream economics, there has been a growing interest in experiments and — not least — how to design them to possibly provide answers to questions about causality and policy effects. Economic research on discrimination nowadays often emphasizes the importance of a randomization design, for example when trying to determine to what extent discrimination can be causally attributed to differences in preferences or information, using so-called correspondence tests and field experiments.

A common starting point is the ‘counterfactual approach’ (developed mainly by Jerzy Neyman and Donald Rubin) which is often presented and discussed based on examples of randomized control studies, natural experiments, difference-in-differences, matching, regression discontinuity, etc.

Mainstream economists generally view this development of the economics toolbox positively. Yours truly — like Angus Deaton — is not entirely positive about the randomization approach.

A notable limitation of counterfactual randomization designs is that they only give us answers on how ‘treatment groups’ differ on average from ‘control groups.’ Let me give an example to illustrate how limiting this fact can be:

Among school debaters and politicians in Sweden, it is claimed that so-called ‘independent schools’ (charter schools) are better than municipal schools. They are said to lead to better results. To find out if this is really the case, a number of students are randomly selected to take a test. The result could be: Test result = 20 + 5T, where T=1 if the student attends an independent school and T=0 if the student attends a municipal school. This would confirm the assumption that independent school students have on average 5 points higher results than students in municipal schools. Now, politicians (hopefully) are aware that this statistical result cannot be interpreted in causal terms, because independent school students typically do not have the same background (socio-economic, educational, cultural, etc.) as those who attend municipal schools (the relationship between school type and result is confounded by selection bias).

To obtain a better measure of the causal effects of school type, politicians suggest that admission to an independent school be decided through a lottery in which 1000 students take part — a classic example of a randomization design in natural experiments. The chance of winning is 10%, so 100 students are given this opportunity. Of these, 20 accept the offer to attend an independent school. Of the 900 lottery participants who do not ‘win,’ 100 choose to attend an independent school anyway.

The lottery is often perceived by school researchers as an ‘instrumental variable,’ and when the analysis is carried out, the result is: Test result = 20 + 2T. This is standardly interpreted as a causal measure of how much better students would, on average, perform on the test if they chose to attend independent schools instead of municipal schools. But is it true? No! If not all school students have exactly the same test results (which is a rather far-fetched ‘homogeneity assumption’), the estimated average causal effect only applies to the students who choose to attend an independent school if they ‘win’ the lottery, but who would not otherwise choose to attend one (in statistical jargon, we call these ‘compliers’). It is difficult to see why this group of students would be particularly interesting in this example, given that the average causal effect estimated using the instrumental variable says nothing at all about the effect on the majority (the 100 out of 120 who choose to attend an independent school without ‘winning’ the lottery) of those who choose to attend an independent school.
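To see the point in numbers, here is a small simulation sketch in Python (the population shares and effect sizes are my own hypothetical choices, loosely echoing the example above, not data from any actual study). It shows that the instrumental-variable (Wald) estimate recovers the average effect among compliers only, which can differ sharply from the average effect among the students who actually attend an independent school, most of whom are always-takers.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical population: 'always-takers' attend an independent school
# regardless of the lottery, 'compliers' attend only if they win, and
# 'never-takers' never attend. Effects are deliberately heterogeneous.
group = rng.choice(["always", "complier", "never"], size=n, p=[0.11, 0.09, 0.80])
effect = np.where(group == "always", 6.0,
         np.where(group == "complier", 2.0, 0.0))

z = rng.binomial(1, 0.10, size=n)                              # lottery win
d = (group == "always") | ((group == "complier") & (z == 1))   # attends
y = 20 + effect * d + rng.normal(0, 1, size=n)                 # test score

# Wald / IV estimate: the outcome difference by lottery status divided by
# the attendance difference by lottery status.
late = (y[z == 1].mean() - y[z == 0].mean()) / (d[z == 1].mean() - d[z == 0].mean())

print(f"IV (LATE) estimate:             {late:5.2f}")              # roughly 2
print(f"average effect among attendees: {effect[d].mean():5.2f}")  # roughly 5.7
```

In these made-up numbers the lottery-based estimate is entirely driven by the small group of compliers, while the effect for the bulk of independent-school students could be something quite different, which is exactly why ‘average estimates’ of this kind have to be interpreted with great care.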

Conclusion: Researchers must be much more careful in interpreting ‘average estimates’ as causal. Reality exhibits a high degree of heterogeneity, and ‘average parameters’ often tell us very little!

To randomize ideally means that we achieve orthogonality (independence) in our models. But it does not mean that in real experiments when we randomize, we achieve this ideal. The ‘balance’ that randomization should ideally result in cannot be taken for granted when the ideal is translated into reality. Here, one must argue and verify that the ‘assignment mechanism’ is truly stochastic and that ‘balance’ has indeed been achieved!
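As a minimal illustration of why balance has to be checked rather than assumed (a toy with a single made-up background covariate, not any particular experiment), the following sketch randomly assigns a modest sample once and compares the covariate across groups; in any single realized assignment the difference can be substantial even though it is zero in expectation.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 60                                      # one modest-sized experiment

# Hypothetical background covariate (e.g. family income), skewed as in reality.
income = rng.lognormal(mean=10.0, sigma=0.8, size=n)

# A single random assignment of half the sample to treatment.
treat = rng.permutation(np.r_[np.ones(n // 2), np.zeros(n // 2)]).astype(bool)

def std_mean_diff(x, t):
    """Standardized mean difference between treatment and control groups."""
    pooled_sd = np.sqrt((x[t].var(ddof=1) + x[~t].var(ddof=1)) / 2)
    return (x[t].mean() - x[~t].mean()) / pooled_sd

print(f"standardized mean difference in income: {std_mean_diff(income, treat):+.2f}")
# Zero on average over many hypothetical re-randomizations, but in this one
# realized assignment the groups can easily differ by a fifth of a standard
# deviation or more, which is why balance must be demonstrated, not presumed.
```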

Even if we accept the limitation of only being able to say something about average treatment effects, there is another theoretical problem. An ideal randomized experiment assumes that a number of individuals are first chosen from a randomly selected population and then randomly assigned to a treatment group or a control group. Given that both selection and assignment are successfully carried out randomly, it can be shown that the expected outcome difference between the two groups is the average causal effect in the population. The snag is that the experiments conducted rarely involve participants selected from a random population! In most cases, experiments are started because there is a problem of some kind in a given population (e.g., schoolchildren or job seekers in country X) that one wants to address. An ideal randomized experiment assumes that both selection and assignment are randomized — this means that virtually none of the empirical results that randomization advocates so eagerly tout hold up in a strict mathematical-statistical sense. The fact that only assignment is talked about when it comes to ‘as if’ randomization in natural experiments is hardly a coincidence. Moreover, with ‘as if’ randomization in natural experiments, the sad but inevitable fact is that there can always be a dependency between the variables being studied and unobservable factors in the error term, which can never be tested!

Another major problem is that researchers who use these randomization-based research strategies often, in order to achieve ‘exact’ and ‘precise’ results, set up problem formulations that are not at all the ones we really want answers to. Design becomes the main thing, and as long as one can get more or less clever experiments in place, researchers believe they can draw far-reaching conclusions about both causality and the ability to generalize experimental outcomes to larger populations. Unfortunately, this often means that this type of research has a negative bias away from interesting and important problems towards prioritizing method selection. Design and research planning are important, but the credibility of research ultimately lies in being able to provide answers to relevant questions that both citizens and researchers want answers to.

Believing there is only one really good evidence-based method on the market — and that randomization is the only way to achieve scientific validity — blinds people to searching for and using other methods that in many contexts are better. Insisting on using only one tool often means using the wrong tool.

Ideology and the politics of economic method

14 Mar, 2024 at 08:59 | Posted in Economics | 1 Comment

Political economy has long taken a keen interest in the politics of economic ideas, but considerably less attention has been paid to the politics of economic method. Method gets neglected as the technical realm within which, it is assumed, economic ideas, once established, are implemented in straightforward fashion. In fact, economic method and technique are key sites in the battle of economic ideas …

Independent fiscal councils and Central Banks have been introduced in many countries in recent decades. Technocratic economic governance, involving expert oversight of and input into often rules-based economic policy, has become pervasive in advanced democracies. Governments introduced economic policy rules, and independent oversight bodies, in an effort to reassure electorates and financial markets that they were sound custodians of the economy and the public finances.

This was, according to Public Choice theory at least, designed to take some of the politics out of economic policy-making, turning it into a mechanistic administrative process. However, this technocratic vision comes up against a central reality of political economy: economic knowledge and narratives are political and social constructs.

The operations of the Office for Budget Responsibility (OBR), the UK’s fiscal watchdog, draw our attention to the often under-appreciated politics of technocratic economic governance. Although the OBR sees itself as an apolitical institution engaged in technical work, on closer inspection the OBR is nevertheless inextricably involved in the politics of economic policy-making.

Indeed, there is always a politics of technocratic economic governance because economic analysis and policy evaluation rest on political economic assumptions that are always contestable. Bodies like the OBR and the IMF, in their operational work, deal in contrasting normatively informed accounts of how the economy and policy work – built in via the assumptive foundations of the various models they operate with …

We can take research on the politics of economic ideas forward by highlighting the importance of economic method, and modelling assumptions, as sites of contestation within economic governance and economic policy-making. Economic concepts used to gauge growth trajectories and frame and pilot economic policy, even when operationalised and deployed by technocratic bodies like the OBR or IMF, are always founded upon contestable normative assumptions.

Since the global financial crisis, we increasingly lack a single agreed fiscal policy script or settled expert view. Rather, understandings on fiscal policy efficacy, the properties of markets, and the impacts of macroeconomic policy on long-term growth differ. There is a spectrum of respectable opinion – drawn from different economic theoretical homes. This makes deliberation and contestation over economic methods and modelling assumptions even more significant, and consequential for policy.

A politics of economic method lens helps appreciate how apparently technocratic economic governance, carried out by bodies like the OBR and the IMF, is saturated with the politics of economic ideas.

Ben Clift

You never hear anyone at our seminars telling the lecturer that the assumptions on which his models are built are made only for ideological reasons. But that does not necessarily mean that academic analysis, whatever it may look like on the surface, is judged on its merits. What it means is that we have a catechism that no one dares to question. And that catechism has become hegemonic for particular reasons, one of which may very well be of an ideological nature. When neoclassical theory was developed in the late 19th century, one of the reasons was that some economists — e.g. Böhm-Bawerk — thought that the Ricardian (labour value) tradition had become too radical and could be used as a dangerous weapon in the class struggle. Marginalism was explicitly seen as a way to counter that.

Even though some economists seem to think that facts are bound to win in the end, yours truly begs to differ.

Take the rational expectations assumption. Rational expectations in the mainstream economists’ world imply that relevant distributions have to be time-independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality, it is straining one’s beliefs to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons. In the world of the rational expectations hypothesis, we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that is also the reason why allowing for cases where agents make ‘predictable errors’ in DSGE models doesn’t take us any closer to a relevant and realist depiction of actual economic decisions and behaviours.
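One way to see the single-realization point is to simulate a non-ergodic multiplicative process (a toy sketch of my own, not taken from any DSGE model): the ensemble expectation across imaginary ‘economy-worlds’ grows steadily, while the typical single realized path does something entirely different.

```python
import numpy as np

# Hypothetical multiplicative process: each period 'wealth' is multiplied by
# 1.5 or 0.6 with equal probability, so the expected factor is 1.05 per period.
rng = np.random.default_rng(3)
worlds, periods = 20_000, 200
factors = rng.choice([1.5, 0.6], size=(worlds, periods))
final = factors.prod(axis=1)

print(f"Ensemble expectation after {periods} periods: {1.05 ** periods:.2e}")   # grows without bound
print(f"Median single realized path:                  {np.median(final):.2e}")  # collapses towards zero
print(f"Share of 'worlds' ending below start value:   {(final < 1).mean():.0%}")
```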

‘Rigorous’ and ‘precise’ DSGE models cannot be considered anything other than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence has ever been presented.

So, given this lack of empirical evidence, why do mainstream economists still stick to using these kinds of theories and models building on blatantly ridiculous assumptions? Well, one reason, as argued by Ben Clift, is of an ideological nature. Those models and the assumptions they build on standardly have a neoliberal or market-friendly bias. I guess that is also one of the — ideological — reasons those models and theories are so dear to many Chicago and ‘New Keynesian’ economists …

‘New Keynesian’ unemployment — a paid vacation essentially!

11 Mar, 2024 at 15:42 | Posted in Economics | 15 Comments

Franco Modigliani famously quipped that he did not think that unemployment during the Great Depression should be described, in an economic model, as a “sudden bout of contagious laziness”. Quite. For the past thirty years we have been debating whether to use classical real business cycle models (RBC), or their close cousins, modern New Keynesian (NK) models, to describe recessions. In both of these models, the social cost of persistent unemployment is less than a half a percentage point of steady state consumption.

What does that mean? Median US consumption is roughly $30,000 a year. One half of one percent of this is roughly 50 cents a day. A person inhabiting one of our artificial RBC or NK model worlds would not be willing to pay more than 50 cents a day to avoid another Great Depression. That is true of real business cycle models. It is also true of New Keynesian models …

That’s why I eschew NK and RBC models. They are both wrong. The high unemployment that follows a financial crisis is not the socially efficient response to technology shocks. And the slow recovery from a financial melt-down has nothing to do with the costs of reprinting menus that underpin the models of NK economists. It is a potentially permanent failure of private agents to coordinate on an outcome that is socially desirable.

Roger Farmer

In the basic DSGE models used by both New Classical and ‘New Keynesian’ macroeconomists, the labour market is always cleared — responding to a changing interest rate, expected lifetime incomes, or real wages, the representative agent maximizes the utility function by varying her labour supply, money holding and consumption over time. Most importantly — if the real wage somehow deviates from its ‘equilibrium value,’ the representative agent adjusts her labour supply, so that when the real wage is higher than its ‘equilibrium value,’ the labour supply is increased, and when the real wage is below its ‘equilibrium value,’ labour supply is decreased.

In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence, unemployment is totally voluntary. To be unemployed is something one optimally chooses to be — a kind of prolonged vacation.
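A stylized sketch (my own toy example, not any particular DSGE model) of how this ‘voluntariness’ arises: with a standard quasi-linear utility function over consumption and hours and a simple budget constraint, a lower real wage makes the representative agent ‘choose’ fewer hours, so lower employment shows up in the model as freely chosen leisure.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical quasi-linear utility u(c, h) = c - h**(1 + 1/phi) / (1 + 1/phi)
# with budget c = w*h; the first-order condition gives chosen hours h = w**phi.
phi = 2.0   # assumed labour-supply elasticity

def chosen_hours(w):
    neg_utility = lambda h: -(w * h - h ** (1 + 1 / phi) / (1 + 1 / phi))
    return minimize_scalar(neg_utility, bounds=(0.0, 10.0), method="bounded").x

for w in (1.0, 0.8, 0.5):
    print(f"real wage {w:.1f} -> hours 'chosen' {chosen_hours(w):.2f} (closed form w**phi = {w**phi:.2f})")
```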

Although this picture of unemployment as a kind of self-chosen optimality strikes most people as utterly ridiculous, there are also, unfortunately, a lot of mainstream economists out there who still think that price and wage rigidities are the prime movers behind unemployment. DSGE models basically explain variations in employment (and a fortiori output) by assuming that nominal wages are more flexible than prices – disregarding the lack of empirical evidence for this rather counterintuitive assumption.

Lowering nominal wages would not clear the labour market. Lowering wages – and possibly prices – could, perhaps, lower interest rates and increase investment. It would, however, be much easier to achieve that effect by increasing the money supply. In any case, wage reductions were not seen as a general substitute for an expansionary monetary or fiscal policy. And even if lowering wages could have some positive impacts, there are also negative impacts that weigh more heavily – deteriorating management-union relations, expectations of further wage cuts causing investment to be delayed, debt deflation, et cetera.

The classical proposition that lowering wages would lower unemployment and ultimately take economies out of depression was ill-founded and basically wrong. Flexible wages would probably only make things worse by leading to erratic price fluctuations. The basic explanation for unemployment is insufficient aggregate demand, and that is mostly determined outside the labour market.

Obviously, it’s rather embarrassing that the kind of DSGE models ‘modern’ macroeconomists use cannot incorporate such a basic fact of reality as involuntary unemployment. Of course, working with representative agent models, this should come as no surprise. The kind of unemployment that occurs is voluntary since it is only adjustments of the hours of work that these optimizing agents make to maximize their utility.

And as if this nonsense economics were not enough, in the DSGE models of New Classical and ‘New Keynesian’ macroeconomists, increases in government spending lead to a drop in private consumption!

How on earth does one arrive at such a bizarre view?

In the most basic mainstream proto-DSGE models, one often assumes that governments finance current expenditures with current tax revenues. This has a negative income effect on households, leading — rather counterintuitively — to a drop in private consumption even though both employment and production expand. This mechanism also holds when the (in)famous Ricardian equivalence is added to the models.

Ricardian equivalence basically means that financing government expenditures through taxes or debts is equivalent, since debt financing must be repaid with interest, and agents — equipped with rational expectations — would only increase savings to be able to pay the higher taxes in the future, thus leaving total expenditures unchanged.

Why?

In the standard neoclassical consumption model — used in DSGE macroeconomic modelling — people are basically portrayed as treating time as a dichotomous phenomenon, today versus the future, when contemplating decisions and acting. How much should one consume today and how much in the future? Facing an intertemporal budget constraint of the form

ct + cf/(1+r) = ft + yt + yf/(1+r),

where ct is consumption today, cf is consumption in the future, ft is holdings of financial assets today, yt is labour incomes today, yf is labour incomes in the future, and r is the real interest rate, and having a lifetime utility function of the form

U = u(ct) + au(cf),

where a is the time discounting parameter, the representative agent (consumer) maximizes his utility when

u'(ct) = a(1+r)u'(cf).

This expression – the Euler equation – implies that, at the optimum, the representative agent (consumer) is indifferent between consuming one more unit today and consuming it tomorrow instead. Typically, using a logarithmic functional form – u(c) = log c – which gives u'(c) = 1/c, the Euler equation can be rewritten as

1/ct = a(1+r)(1/cf),

or

cf/ct = a(1+r).

This importantly implies that, according to the neoclassical consumption model, changes in the (real) interest rate and consumption growth move in the same direction. It also follows that consumption is invariant to the timing of taxes, since wealth — ft + yt + yf/(1+r) — has to be interpreted as a present discounted value net of taxes. And so, according to the assumption of Ricardian equivalence, the timing of taxes does not affect consumption, simply because the maximization problem as specified in the model is unchanged. As a result, households cut down on their consumption when governments increase their spending. Mirabile dictu!
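A minimal numerical sketch (my own numbers, plugged into the two-period setup above) of the Ricardian-equivalence logic: shifting a given present-value tax burden from today to the future leaves the consumer’s optimization problem, and hence optimal consumption, unchanged.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical numbers: a = 0.96, r = 0.03, assets f_t = 10, incomes y_t = y_f = 100,
# utility U = log(c_t) + a*log(c_f), wealth measured net of taxes as in the text.
a, r = 0.96, 0.03
f_t, y_t, y_f = 10.0, 100.0, 100.0

def optimal_c_today(tax_t, tax_f):
    wealth = f_t + (y_t - tax_t) + (y_f - tax_f) / (1 + r)   # present value net of taxes
    def neg_utility(x):
        c_t = x[0]
        c_f = (wealth - c_t) * (1 + r)                       # what is not consumed today earns r
        return -(np.log(c_t) + a * np.log(c_f))
    res = minimize(neg_utility, x0=[wealth / 2], bounds=[(1e-6, wealth - 1e-6)])
    return res.x[0]

# The same present-value tax burden, levied today vs. debt-financed and levied tomorrow:
print(f"Taxes today:    c_t* = {optimal_c_today(20.0, 0.0):.2f}")
print(f"Taxes deferred: c_t* = {optimal_c_today(0.0, 20.0 * (1 + r)):.2f}")   # identical consumption
```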

Macroeconomic models have to abandon the Ricardian equivalence nonsense. But replacing it with “overlapping generations” and “infinite-horizon” models is, in terms of realism and relevance, just getting out of the frying pan into the fire. All unemployment is still voluntary. Intertemporal substitution between labour and leisure is still ubiquitous. And the specification of the utility function is still hopelessly off the mark from an empirical point of view.

As one Nobel laureate had it:

Ricardian equivalence is taught in every graduate school in the country. It is also sheer nonsense.

Joseph E. Stiglitz, twitter 

And as one economics blogger has it:

New Classical and ‘New Keynesian’ DSGE modeling is taught in every graduate school in the country. It is also sheer nonsense.

Lars P Syll, twitter 

Gatekeepers of the economics profession

9 Mar, 2024 at 10:35 | Posted in Economics | Comments Off on Gatekeepers of the economics profession

The power structures within the profession reinforce the mainstream in different ways, including through the tyranny of so-called top journals and academic and professional employment. Such pressures and incentives divert many of the brightest minds from a genuine study of the economy (to try to understand its workings and the implications for people) to what can only be called “trivial pursuits.” Too many top academic journals publish esoteric contributions that add value only by relaxing one small assumption in a model or using a slightly different econometric test. Elements that are harder to model or generate inconvenient truths are simply excluded, even if they would contribute to a better understanding of economic reality. Fundamental constraints or outcomes are presented as “externalities” rather than as conditions to be addressed. Economists who talk mainly to each other, then simply proselytize their findings to policymakers, are rarely forced to question this approach.

As a result, economic forces that are necessarily complex—muddied with the impact of many different variables—and reflect the effects of history, society, and politics are not studied in light of this complexity. Instead, they are squeezed into mathematically tractable models, even if this removes any resemblance to economic reality. To be fair, some very successful mainstream economists have railed against this tendency—but with little effect thus far on the gatekeepers of the profession.

Jayati Ghosh

Economics sure is a disgrace — the lack of diversity and pluralism discourages and often drives away talented young students.

In the 21st century, the blogosphere has without any doubt become one of the greatest channels for dispersing new knowledge and information. So when it comes to the mainstream’s near monopoly on academic journals, we can at least see some light at the end of the tunnel — bloggers have shown over the last two decades that it is possible to push past the old gatekeepers …

Incompetent economists

5 Mar, 2024 at 18:32 | Posted in Economics | Comments Off on Incompetent economists

Teaching economics students the fundamentals of the utility theory used in mainstream economics is a bit of a challenge. But I guess we all expect the professors to know what they are teaching …

At a meeting of the American Economic Association (AEA), 200 of the professional economists present were asked to answer the following question:

“You won a free ticket to see an Eric Clapton concert (which has no resale value). Bob Dylan is performing on the same night and is your next-best alternative activity. Tickets to see Dylan cost $40. On any given day, you would be willing to pay up to $50 to see Dylan. Assume there are no other costs of seeing either performer. Based on this information, what is the opportunity cost of seeing Eric Clapton? (a) $0, (b) $10, (c) $40, or (d) $50.”

Only one in every five answered correctly (alternative b)! Seeing Clapton means forgoing the $50 of enjoyment you place on the Dylan concert, but it also means not paying the $40 ticket price, so the net benefit given up, i.e. the opportunity cost, is $10.

So much for the competence of practitioners of the “queen of the social sciences” …

Fiscal policy and the growing need for investment

5 Mar, 2024 at 15:25 | Posted in Economics | 4 Comments


One of the fundamental fallacies in today’s discussion of government debt and budget deficits is the failure to distinguish between one kind of debt and another. Even though, at the macro level, debts and assets necessarily balance each other, it is far from irrelevant who holds the assets and who holds the debts.

For a long time there has been a reluctance to increase public debt, since economic crises are still largely perceived as being caused by too much debt. But this is where the distribution of debt comes in. If the state ‘borrows’ money in a recession to expand railways, schools and health care, the social costs are minimal, since the resources would otherwise have lain idle. Once the wheels start turning again, both public and private debts can be paid off.

Instead of ‘safeguarding public finances’ with the fiscal framework’s fundamentally misconceived surplus target, debt anchor and expenditure ceiling, we should safeguard the future of society. The problem with government debt in a situation of historically low interest rates is not that it is too large, but that it is too small.

What many politicians and media ‘experts’ do not seem to (want to) understand is that there is a crucial difference between private and public debt. If an individual tries to save and pay down her debts, that may well be rational. But if everyone tries to do so, the result is that aggregate demand falls and unemployment risks rising.

An individual must always pay her debts. But a state can always repay its old debts with new debt. The state is not an individual. Government debt is not like private debt. A state’s debt is essentially a debt to itself, to its citizens (the public sector’s net financial position is positive).

Government debt, which in Sweden today stands at just over 20% of GDP, is neither good nor bad in itself. It should be a means of achieving two overarching macroeconomic goals: full employment and price stability. What is ‘sacred’ is not a balanced budget or keeping the consolidated gross debt (the ‘Maastricht debt’) down to 35% of GDP over the medium term. If the idea of ‘sound’ public finances leads to higher unemployment and unstable prices, it should be obvious that it must be abandoned.

Sweden’s foreign debt and consolidated government debt are historically low. As the ‘Maastricht debt’ shows, Sweden is among the EU countries with the very lowest public indebtedness. Given the great challenges Sweden faces today, continued talk of ‘responsibility’ for the state budget is, to say the least, irresponsible. Instead of ‘safeguarding’ public finances, a responsible government should safeguard the future of society.

Budget deficits and government debt are not Sweden’s problem today. And continuing to talk about ‘saving in the barns’ is simply foolish.

Today Sweden needs a new fiscal framework. We need to lower the target for government net lending in order to create more room for countercyclical fiscal initiatives, and thereby, for example, be able to make necessary investments in infrastructure and reduce unemployment. Given the climate-policy challenges we face, it is necessary to seriously break with the framework that today needlessly obstructs more ambitious environmental investment.

Today’s inflation-fixated economic-policy debate shows with all clarity that economists tend to become absorbed in their own theories and models, forgetting that behind the figures and abstractions there is a real world with real people. Real people who pay dearly for fundamentally flawed doctrines and recommendations. As always in the market, it is those with the fewest resources who ultimately foot the bill …

The debate over the fiscal framework that has emerged in recent weeks is welcome. It also shows that it is high time to break with the mainstream economists’ consensus on the primacy of monetary policy (and the notion that active fiscal policy should only be pursued in situations of extremely low interest rates) when it comes to solving economic-policy problems. Economic policy has limped along on one leg for long enough. We are equipped with two legs. To move forward steadily, it is best to use both.

We must seriously start daring to use fiscal policy. To get the economy going we need both a monetary and a fiscal dynamo. In other words, it is high time to scrap outdated and counterproductive surplus targets and expenditure ceilings!

The ineffectiveness of interest-rate policy today is, more than anything else, proof that neoliberal austerity policy has reached the end of the road. Unless it is replaced, and soon, with a more expansionary and job-oriented economic policy, the future of the Swedish economy looks anything but bright.
