The pretense-of-knowledge syndrome in economics

15 March, 2014 at 16:15 | Posted in Economics | 6 Comments

What does concern me about my discipline … is that its current core — by which I mainly mean the so-called dynamic stochastic general equilibrium approach — has become so mesmerized with its own internal logic that it has begun to confuse the precision it has achieved about its own world with the precision that it has about the real one …

While it often makes sense to assume rational expectations for a limited application to isolate a particular mechanism that is distinct from the role of expectations formation, this assumption no longer makes sense once we assemble the whole model. Agents could be fully rational with respect to their local environments and everyday activities, but they are most probably nearly clueless with respect to the statistics about which current macroeconomic models expect them to have full information and rational information.

This issue is not one that can be addressed by adding a parameter capturing a little bit more risk aversion about macroeconomic, rather than local, phenomena. The reaction of human beings to the truly unknown is fundamentally different from the way they deal with the risks associated with a known situation and environment … In realistic, real-time settings, both economic agents and researchers have a very limited understanding of the mechanisms at work. This is an order-of-magnitude less knowledge than our core macroeconomic models currently assume, and hence it is highly likely that the optimal approximation paradigm is quite different from current workhorses, both for academic and policy work. In trying to add a degree of complexity to the current core models, by bringing in aspects of the periphery, we are simultaneously making the rationality assumptions behind that core approach less plausible …

The challenges are big, but macroeconomists can no longer continue playing internal games. The alternative of leaving all the important stuff to the “policy”-types and informal commentators cannot be the right approach. I do not have the answer. But I suspect that whatever the solution ultimately is, we will accelerate our convergence to it, and reduce the damage we do along the transition, if we focus on reducing the extent of our pretense-of-knowledge syndrome.

Ricardo J. Caballero

A great article that also underlines — especially when it comes to forecasting and implementing economic policies — that the future is inherently unknowable, and that using statistics, econometrics, decision theory or game theory does not in the least overcome this ontological fact.

It also further underlines how important it is in social sciences — and economics in particular — to incorporate Keynes’s far-reaching and incisive analysis of induction and evidential weight in his seminal A Treatise on Probability (1921).

According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but “rational expectations.” Keynes rather thinks that we base our expectations on the confidence or “weight” we put on different events and alternatives. To Keynes expectations are a question of weighing probabilities by “degrees of belief,” beliefs that often have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents as modeled by “modern” social sciences. And often we “simply do not know.”

How strange that social scientists and mainstream economists as a rule do not even touch upon these aspects of scientific methodology, which seem so fundamental and important for anyone trying to understand how we learn and orient ourselves in an uncertain world. An educated guess as to why this is so would be that Keynes’s concepts cannot be squeezed into a single calculable numerical “probability.” In the quest for measurable quantities one turns a blind eye to qualities and looks the other way.

So why do economists, companies and governments continue with the expensive, but obviously worthless, activity of trying to forecast/predict the future?

A couple of months ago yours truly was interviewed by a public radio journalist working on a series on Great Economic Thinkers. We were discussing the monumental failures of the predictions-and-forecasts business. But — the journalist asked — if these cocksure economists with their “rigorous” and “precise” mathematical-statistical-econometric models are so wrong again and again — why do they persist in wasting time on it?

In a discussion on uncertainty and the hopelessness of accurately modeling what will happen in the real world — in M. Szenberg’s Eminent Economists: Their Life Philosophies — Nobel laureate Kenneth Arrow comes up with what is probably the most plausible reason:

It is my view that most individuals underestimate the uncertainty of the world. This is almost as true of economists and other specialists as it is of the lay public. To me our knowledge of the way things work, in society or in nature, comes trailing clouds of vagueness … Experience during World War II as a weather forecaster added the news that the natural world was also unpredictable. An incident illustrates both uncertainty and the unwillingness to entertain it. Some of my colleagues had the responsibility of preparing long-range weather forecasts, i.e., for the following month. The statisticians among us subjected these forecasts to verification and found they differed in no way from chance. The forecasters themselves were convinced and requested that the forecasts be discontinued. The reply read approximately like this: ‘The Commanding General is well aware that the forecasts are no good. However, he needs them for planning purposes.’
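Arrow’s statisticians were, in effect, testing forecast “skill” against a chance baseline. The sketch below illustrates that kind of verification with a one-sided binomial test; the sample size, the 50% base rate, and the purely guessing forecaster are illustrative assumptions of mine, not figures from the anecdote.

```python
import random
from math import comb

def skill_vs_chance(hits: int, n: int, p_chance: float = 0.5) -> float:
    """One-sided binomial p-value: the probability of getting at least
    `hits` correct forecasts out of `n` if the forecaster were merely
    guessing with success rate `p_chance`."""
    return sum(comb(n, k) * p_chance**k * (1 - p_chance)**(n - k)
               for k in range(hits, n + 1))

random.seed(42)
n = 120  # e.g. 120 binary rain/no-rain monthly forecasts
outcomes  = [random.random() < 0.5 for _ in range(n)]
forecasts = [random.random() < 0.5 for _ in range(n)]  # pure guessing
hits = sum(f == o for f, o in zip(forecasts, outcomes))

# A large p-value means the record "differs in no way from chance";
# only a small p-value would be evidence of genuine forecasting skill.
print(hits, "hits out of", n, "p-value:", round(skill_vs_chance(hits, n), 3))
```

A guessing forecaster will typically produce a large p-value here, which is exactly the verdict Arrow’s colleagues reached about the long-range forecasts.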



  1. I sense there is an air of defeatism or fatalism in these attempts to understand prediction; decisions do have to be made. We should ask whether our assumption that the top leadership of public and private institutions is our smartest and best informed is in fact the case. Philip Tetlock examines prediction in his 2005 ‘must-read’ Expert Political Judgment, taking up your skeptical position and giving it an empirical analysis. His findings distinguish better and worse expert predictors, and he gives a detailed scaling of their abilities and underlying presumptions. If you follow Critical Review, edited by Jeffrey Friedman, you can find quarterly articles on these same problems, many of which revolve around Hayek’s ideas. In Bruce Bueno de Mesquita’s The Predictioneer’s Game (2009), one of the issues is the belief basis for decisions, which differentiates the skeptic from the critical analyst. It is important to distinguish non-cognitive bases for decision-making (following the status quo, beliefs, political affiliations, religion, attitudes, etc.) from cognitive bases: evidence, reasons, causes, and abiding by the principled obligations related to responsibility. The non-cognitive or affective bases of decision-making must be integrated into policy implementation, but policy evaluation should be strictly cognitive.

  2. “So why do economists, companies and governments continue with the expensive, but obviously worthless, activity of trying to forecast/predict the future?”

    They are trying to enforce reflexivity but they have not been very successful. It is similar to verbal intervention in the markets. Some agents believe the government and act accordingly. Please recall that all government forecasts are always optimistic. This is a game that has nothing to do with economics as every model has many parameters that can be fudged.

    • I was trying to give a sense that prediction, as perceived by the lay person, is much more complicated from the view of the analyst and from the view of the decision-maker. Generalizations do not help much; empirical analyses shed light. Please see the references in my first post.

  3. This may clarify the problem. I found this in Critical Sociology (1979), by Camfield, “The American Upper Class and the Problem of Legitimacy”:

    “The thesis of this paper is that while the Joint Council claims that “quality” economic education is its primary goal, in reality economic education is a method by which the American upper class attempts to legitimate the American system of monopoly capitalism. The perspective of economic education claims itself to be a “neutral” form of knowledge – similar to physics – designed to serve a general public interest in furthering “education.” But, as we will see in the following pages, the content of its program is oriented towards providing a theoretical justification for the character of the post-World War II American capitalist state and at the same time confronting emerging criticism and discontent about the very shape of polity and economy. The surges in the economic education movement have occurred at points in time when there has been a threat to the legitimacy of the monopoly capitalist state in the U.S.”

    A simpler way of stating this is that the forms typical economic education takes are irrelevant to the student’s activity. A key concept is ‘equilibrium’, which can only make sense from a particular contextualized perspective but is generally nonexistent as a phenomenon. Forbid that anyone starts asking the explanatory question: why?

  4. I cannot say I much like the nihilism of “Nobody knows anything.” (William Goldman’s famous summary of the movie business.) As I said in a previous comment, I find the unwillingness of so many economists to predict and forecast, or to be held accountable for doing so, almost more disturbing than the failures to forecast correctly. If economists forecast, and are interested in the reasons for the failures of those forecasts, I’d think there was some potential for learning.

    Much of the difficulty neoclassical economics has with uncertainty, it seems to me, is attached to the privileging of analysis over synthesis, and the insistence on modeling only within the confines of a deductive logic, and then insisting, as well, that the deductive model is descriptive — something inherently a priori analysis can never be. Even in physics, scientists know that their theoretical models are not descriptive, and invest enormous effort in operationalizing models to enable careful measurement and observation. In economics, too many economists — including the estimable Keynes — rely on their intuition to choose the “best” analytic model, producing a favored insight, without worrying sufficiently about getting down to cases and particulars.

    A theoretical physicist might calculate the theoretical limit of the heat efficiency of an engine design, and an engineer might use that analysis to devise an operational model, to use to measure the actual efficiency of a particular engine, and relate those measurements to particular features of the design or methods of operation. By contrasting analogy, the absurdity of advancing the Efficient Markets Hypothesis as an unqualified proposition about actual financial markets, even as the defects of those markets are revealed, simply cannot be exaggerated.

    We cannot expect to do without institutions, which allow us to make use of the knowledge we do have, as well as to cope with the inevitable consequences of the knowledge we do not have, . . . or painfully acquire in the school of experience. (All learning, in an important sense, is learning from error, which is to say, mistakes, and mistakes are costly. This cannot be entirely avoided.) One answer to Ricardo J. Caballero, is to go outside and look at actual institutions in operation and describe them as systems coping with the variability of the world and controlling processes imperfectly. To take the example of financial markets, the question should not be whether a financial market is efficient in some qualitative sense, but exactly how efficient it is, and what accounts for the degree of efficiency; to treat those institutions as an engineer might treat a heat engine. This will entail “predicting” its operation, in some important sense, though not necessarily like a seer. (Maybe, more like an Old Testament Prophet?)

  5. Dissertation topic: “Local Knowledge Syndrome and Donors-Promoted Public Sector Reforms in Developing Countries: Presenting The Dutch-NPT Program to Promote the Higher Education Sector Reform in Yemen”.

    Leiden University

    Leiden University

    Doctor of Philosophy (PhD), International Development Administration,

    2009 – 2014

    My study examines causes of the ineffectiveness of donor-promoted public sector reforms in developing countries. It focuses on the problems of incorporating local knowledge in the design and implementation of aid programs, which I term the “Local Knowledge Syndrome” (LKS). The LKS exists when aid policies and programs are not tailored to recipients’ unique context and needs, and rely instead on reform assumptions and formal models imported from selected developed countries. It can also exist when aid policies focus exclusively on reforming formal organizations and institutions and neglect informal practices.
