Economics journals — publishing lazy non-scientific work

27 October, 2015 at 11:00 | Posted in Economics | 1 Comment

In a new paper, Andrew Chang, an economist at the Federal Reserve, and Phillip Li, an economist with the Office of the Comptroller of the Currency, describe their attempt to replicate 67 papers from 13 well-regarded economics journals …

Their results? Just under half of the papers (29 of the remaining 59) could be qualitatively replicated (that is to say, their general findings held up, even if the authors did not arrive at the exact same quantitative result). For the other half, whose results could not be replicated, the most common reason was “missing public data or code” …

H.D. Vinod, an economics professor at Fordham University … noted that … caution could be outweighed by the sheer amount of work it takes to clean up data files in order to make them reproducible.

“It’s human laziness,” he said. “There’s all this work involved in getting the data together” …

Bruce McCullough said he thought the authors’ definition of what counted as replication – achieving the same qualitative, as opposed to quantitative, results – was far too generous. If a paper’s conclusions are correct, he argues, one should be able to arrive at the same numbers using the same data.

“What these journals produce is not science,” he said. “People should treat the numerical results as if they were produced by a random number generator.”

Anna Louie Sussman


1 Comment

  1. McCullough (plus Richard Anderson) is da man on the subject of “internal replication”. The only reason researchers are shifting the goalposts of what counts as replication is that the studies he’s participated in have consistently demonstrated shockingly atrocious replication rates in econometrics.

