Jürgen Habermas and Hans Albert on the weaknesses of mainstream economics

11 Aug, 2022 at 13:41 | Posted in Economics | 5 Comments

From Jürgen Habermas, On the Logic of the Social Sciences (trans. Shierry Weber Nicholsen and Jerry A. Stark):

The weaknesses of social-scientific normativism are obvious. The basic assumptions refer to idealized action under pure maxims; no empirically substantive lawlike hypotheses can be derived from them. Either it is a question of analytic statements recast in deductive form or the conditions under which the hypotheses derived could be definitively falsified are excluded under ceteris paribus stipulations. Despite their reference to reality, the laws stated by pure economics have little, if any, information content. To the extent that theories of rational choice lay claim to empirical-analytic knowledge, they are open to the charge of Platonism (Modellplatonismus). Hans Albert has summarized these arguments: The central point is the confusion of logical presuppositions with empirical conditions. The maxims of action introduced are treated not as verifiable hypotheses but as assumptions about actions by economic subjects that are in principle possible. The theorist limits himself to formal deductions of implications in the unfounded expectation that he will nevertheless arrive at propositions with empirical content. Albert’s critique is directed primarily against tautological procedures and the immunizing role of qualifying or “alibi” formulas.

This critique of normative-analytic methods argues that general theories of rational action are achieved at too great a cost when they sacrifice empirically verifiable and descriptively meaningful information … Against the recent normativism of pure economics, Albert adduces the old viewpoint that an economic theory must make assumptions about the actions of subjects in their social roles … The system of exchange relationships is so little isolated from society as a whole that the social behavior of economic subjects cannot be comprehended independently of the institutional context, that is, the extra-economic motivational patterns: “Immunization against the influence of so-called extra-economic factors leads to immunization against experience as such.”

Seen from a deductive-nomological perspective, typical economic models (M) usually consist of a theory (T) — a set of more or less general (typically universal) law-like hypotheses (H) — and a set of (typically spatio-temporal) auxiliary assumptions (A). The auxiliary assumptions give ‘boundary’ descriptions such that it is possible to deduce logically (meeting the standard of validity) a conclusion (explanandum) from the premises T & A. Using this kind of model, game theorists are (portrayed as) trying to explain (predict) facts by subsuming them under T, given A. An obvious problem with the formal-logical requirements of what counts as H is the often severely restricted reach of the ‘law.’ In the worst case, it may not be applicable to any real, empirical, relevant situation at all. And if A is not true, then M does not really explain (although it may predict) at all. Deductive arguments should be sound — valid and with true premises — so that we are assured of having true conclusions. Constructing game theoretical models assuming ‘common knowledge’ and ‘rational expectations’ says nothing about situations where knowledge is ‘non-common’ and expectations are ‘non-rational.’
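The deductive-nomological structure described above can be put in a toy sketch (all names here are hypothetical illustrations, not anything from the post): a prediction follows from the premises T & A, but if the auxiliary assumptions A are false, the argument is valid yet unsound, and nothing is explained.

```python
from typing import Optional

def law_H(demand_shock: float) -> float:
    """A stand-in 'law-like hypothesis' H: price change tracks a demand shock."""
    return 1.0 * demand_shock

def model_M(demand_shock: float, A_holds: bool) -> Optional[float]:
    """Deduce a conclusion from T & A.

    The deduction is valid either way, but it only yields an
    explanation (a sound argument) when the auxiliary assumptions
    A are actually true of the situation.
    """
    if not A_holds:
        return None  # premises false: valid deduction, but no explanation
    return law_H(demand_shock)

# Within the idealized domain the deduction goes through ...
print(model_M(2.0, A_holds=True))   # -> 2.0
# ... but with false auxiliary assumptions nothing is derived.
print(model_M(2.0, A_holds=False))  # -> None
```

The point the sketch makes is purely logical: tweaking `A_holds` (Albert's 'alibi' move) never touches `law_H` itself, which is exactly the immunization strategy criticized in the text.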

Building theories and models that are ‘true’ in their own very limited ‘idealized’ domain is of limited value if we cannot supply bridges to the real world. ‘Laws’ that only apply in specific ‘idealized’ circumstances — in ‘nomological machines’ — are not the stuff that real science is built of.

When confronted with the massive empirical refutations of almost all the models they have set up, many game theorists react by saying that these refutations only hit A (the Lakatosian ‘protective belt’), and that by ‘successive approximations’ it is possible to make the models more readily testable and predictively accurate. Even if T & A1 does not have much empirical content, we are supposed to believe that by successive approximation we can reach, say, T & A25, and finally arrive at robust and true predictions and explanations.

Hans Albert’s ‘Model Platonism’ critique shows that there is a strong tendency for modelers to use the method of successive approximations as a kind of ‘immunization,’ taking for granted that there can never be any faults with the theory itself: explanatory and predictive failures hinge solely on the auxiliary assumptions. That the kind of theories and models used by game theorists should all be held non-defeasibly corroborated seems, however — to say the least — rather unwarranted.

Retreating into treating models and theories as some kind of ‘conceptual exploration,’ and giving up any hope whatsoever of relating them to the real world, is pure defeatism. Instead of trying to bridge the gap between models and the world, modelers simply decide to look the other way.

To yours truly, this kind of scientific defeatism is equivalent to surrendering our search for understanding and explaining the world we live in. It cannot be enough to prove or deduce things in a model world. If theories and models do not directly or indirectly tell us anything about the world we live in — then why should we waste any of our precious time on them?


  1. Lars,
    It is virtually impossible to argue against your line that formal neoclassical theory is devoid of any reality. But you don’t offer a concrete alternative.
    You have suggested at other times that economists should study institutional arrangements etc. So the economist is studying and using analogues drawn from history. But the problem is that the economist cannot be sure that what applied in one time period in the past will also apply in the future. So, in effect, this approach is not viable either.
    As far as I see it, you have boxed yourself into a corner.
    Perhaps the answer is sortition.
    Luke Rhinehart’s (pen name for George Cockcroft) novel “The Dice Man” could become the economists’ new bible. (Some would say that the free financialized economies are run like casinos anyway, so it might not be a radical move at all.)
    Policy options could be assigned to each dice number. And to include as many options as might be considered, more dice could be used (although some options, in that case, would have a higher theoretical probability of being selected. This could be used to desired effect: those options considered more acceptable than others would be assigned to the more probable numbers.)
    The problem is that the protagonists in Rhinehart’s novel descended into a kind of madness from which it was almost impossible to return.
    Perhaps we are there already anyway.

  2. The economics profession not only fails to adequately justify its use of rational choice (part of an underlying philosophical foundation rooted in Utilitarianism), it does not even attempt to justify it. It does not even attempt to justify its non-attempt to justify it.

    It is this lack of critical awareness that lies at the heart of its problems. Historians will look back at how the liberal order established in 1945 became undone, and the mentality and practice of the economics profession will certainly be part of the story. They will note its failure to provide a decent policy brief to politicians, including the good ones, that would have helped them recognise and deal with the emerging weaknesses in capitalism that are undermining democracy. Had economists given effective politicians like Blair, Clinton or Obama the right advice, things might have been different. Neo-liberal policies, including uber-pro-globalisation and deregulation policies, are for sure fundamentally linked to Samuelson and neo-classical economics.

    Habermas is widely recognised, within philosophy and more widely, as one of the leading thinkers of our time. His opinions of the economics profession matter, and you are doing an invaluable public service by helping to disseminate them.

    • Not directly related to this thread, but a wide-ranging discussion of the causes behind the unravelling of the Minsk Accords.

      • One of the many interesting passages:

        “Before the war, Zelensky failed in everything. Poroshenko was actually more capable of resisting some of the international institutions’ demands—specifically the IMF’s pressure for market prices on gas, which Ukrainian governments always tried to block because it was hugely unpopular—especially with older people, for whom the price increase would be a heavy blow, and who vote in large numbers. Zelensky also pushed through a land market reform, which has been a big question since Ukrainian independence and very unpopular; over 70 per cent of Ukrainians were against some of the clauses.”

        A very predictable policy by the IMF (which is very closely linked to the MIT Economics Department). It is a result of their indoctrination in micro-economic theory. Any type of collective bargaining system, for example, is anathema to a neo-classical economist for the same reason: they believe that prices must in all cases be determined by the market mechanism. Wren-Lewis, for example, writes on his blog:

        “The 1970s in the UK in particular represented a prolonged experiment in attempting to control inflation without imposing the costs of higher unemployment, and instead using a mixture of wage and price controls and deals between governments and trade unions. The result of this experiment was clear – it failed.”

        It’s true that in many Anglo-Saxon countries they have not worked well, but in Europe and Japan they worked well at containing inflation for a very long time, even during the stagflation era. The interesting question, surely, is why they work in some places and not in others. This is getting close to the point Adorno and Habermas are making. The really interesting questions are ‘fenced off’.
