The tools economists use

25 Apr, 2021 at 09:55 | Posted in Economics | 4 Comments

In their quest for statistical “identification” of a causal effect, economists often have to resort to techniques that answer either a narrower or a somewhat different version of the question that motivated the research.

Results from randomized social experiments carried out in particular regions of, say, India or Kenya may not apply to other regions or countries. A research design exploiting variation across space may not yield the correct answer to a question that is essentially about changes over time …

Economists’ research can rarely substitute for more complete works of synthesis, which consider a multitude of causes, weigh likely effects, and address spatial and temporal variation of causal mechanisms. Work of this kind is more likely to be undertaken by historians and non-quantitatively oriented social scientists …

Economists would not even know where to start without the work of historians, ethnographers, and other social scientists who provide rich narratives of phenomena and hypothesize about possible causes, but do not claim causal certainty.

Economists can be justifiably proud of the power of their statistical and analytical methods. But they need to be more self-conscious about these tools’ limitations. Ultimately, our understanding of the social world is enriched by both styles of research. Economists and other scholars should embrace the diversity of their approaches instead of dismissing or taking umbrage at work done in adjacent disciplines.

Dani Rodrik

As Rodrik notes, ‘ideally controlled experiments’ tell us with certainty what causes what effects — but only given the right ‘closures.’ Making appropriate extrapolations from (ideal, accidental, natural or quasi) experiments to different settings, populations or target systems is not easy. Causes deduced in an experimental setting still have to show that they come with an export-warrant to their target populations.

The almost religious belief with which its propagators — like the 2019 ‘Nobel prize’ winners Duflo, Banerjee and Kremer — portray it cannot hide the fact that randomized controlled trials (RCTs) cannot be taken for granted to give generalisable results. That something works somewhere is no warrant for believing it will work for us here, or even that it works generally.

Believing there is only one really good evidence-based method on the market — and that randomisation is the only way to achieve scientific validity — blinds people to searching for and using other methods that in many contexts are better. Insisting on using only one tool often means using the wrong tool.

‘Randomistas’ like Duflo et consortes think that economics should be based on evidence from randomised experiments and field studies. They want to give up on ‘big ideas’ like political economy and institutional reform and instead solve more manageable problems the way plumbers do. But that modern-day ‘marginalist’ approach surely can’t be the right way to move economics forward and make it a relevant and realist science. A plumber can fix minor leaks in your system, but if the whole system is rotten, something more than good old-fashioned plumbing is needed. The big social and economic problems we face today are not going to be solved by plumbers performing RCTs.


  1. Duflo seems to be claiming that an export licence is warranted by “an understanding what might be the entire shape or contour” of a problem. Surely this warrants an export licence? If not, please give an example of a warranted export licence.
    Remember that we are here concerned with INDUCTION, which is defined as
    “the inference of a general law from particular instances”. Induction can never achieve certain conclusions, only probable conclusions. How many sigma confidence is required for an export licence?
    Induction is quite different to DEDUCTION, which is defined as “the inference of particular instances by reference to a general law or principle”. Are you claiming that export licences can be deduced with certainty by a rigorous chain of logic? If so, please give an example.
    Regarding solutions to poverty, are you proposing the continuation of progressive democratic changes such as those achieved in much of post-WW2 Europe, or something far more radical?

    • “an understanding what might be the entire shape or contour” of a problem. Surely this warrants an export licence?

      So what — “an understanding of what might be” true warrants an export licence? In the end, if the assumptions aren’t met, the model is pretty useless.

  2. Perhaps Esther Duflo’s advocacy of RCTs is not “almost religious belief” as alleged by Prof. Syll.
    In a recent interview she explains:
    “I’ve found [a way] to respond to a frequent critique of RCTs, which is, “It’s nice that you get one result somewhere, but how do you know it can be generalized to other places?” The truth is that without a conceptual frame, I do not.
    Likewise, even though for your entire life, you’ve seen the sun rise on the same side of your home, without a framework, you have no idea whether it’s going to happen again tomorrow. There’s nothing new there in that philosophy, in neither the question nor the answer. Any advance is a combination of empirical findings and a frame to interpret.
    The idea of the pointillist painting is, imagine a painting by Seurat. It’s literally made of dots, and each of these dots on its own is perfectly nice, but it doesn’t generalize to anything. But if you step back and accumulate all these dots, you see the entire painting of, say, a family on the bank of the Seine having a picnic.
    Suppose you’re trying to assemble a jigsaw puzzle of that Seurat painting. Just by looking at the rest of the painting, you sort of know what goes next. You have a prediction about where a given piece fits. You might find that your piece doesn’t fit. It might be wrong. It’s not what you expected. But the frame, the painting, gives you good guidance for what you might expect.
    That’s how progress happens. The caricature is that you try one small experiment in one place, and then you can take the result to the entire world. That’s not it. The way it actually works is: Do your small experiment; get some findings that are interesting. They might contradict or confirm the theory that you started from, but they give you fodder for the next experiment, and so on and so forth, until you have an understanding of what might be the entire shape or contour of that problem.”

    • I do appreciate that Duflo here is more modest than randomistas usually are when they campaign for the use of randomisation experiments. That’s good — but causes deduced in an experimental setting still have to show that they come with an export-warrant to the target population/system. The causal background assumptions made have to be justified, and without licenses to export, the value of ‘rigorous’ and ‘precise’ methods — and ‘on-average-knowledge’ — is despairingly small.

      Apart from the methodological problems, I do think there is also a rather disturbing kind of scientific naïveté in the Duflo approach to combatting poverty. The way she presents the whole endeavour smacks of not so little ‘scientism’ where fighting poverty becomes a question of applying ‘objective’ quantitative ‘techniques.’ But that can’t be the right way to fight poverty! Fighting poverty and inequality is basically a question of changing the structure and institutions of our economies and societies.
