So, you think statistics is boring? Well, you’re wrong!

23 June, 2015 at 09:17 | Posted in Statistics & Econometrics | 1 Comment

 

The way forward — discard 90% of the data!

21 June, 2015 at 20:02 | Posted in Statistics & Econometrics | Leave a comment

Could it be better to discard 90% of the reported research? Surprisingly, the answer to this statistical paradox is yes. This paper has shown how publication selection can greatly distort the research record and its conventional summary statistics. Using both Monte Carlo simulations and actual research examples, we show how a simple estimator, which uses only 10 percent of the reported research, reduces publication bias and improves efficiency over conventional summary statistics that use all the reported research.

The average of the most precise 10 percent, ‘Top10,’ of the reported estimates of a given empirical phenomenon is often better than conventional summary estimators because of its heavy reliance on the reported estimate’s precision (i.e., the inverse of the estimate’s standard error). When estimates are chosen, in part, for their statistical significance, studies cursed with imprecise estimates have to engage in more intense selection from among alternative statistical techniques, models, data sets, and measures to produce the larger estimate that statistical significance demands. Thus, imprecise estimates will contain larger biases.

Studies that have access to more data will tend to be more precise, and hence less biased. At the level of the original empirical research, the statistician’s motto, “the more data the better,” holds because more data typically produce more precise estimates. It is only at the meta-level of integrating, summarizing, and interpreting an entire area of empirical research (meta-analysis) that the removal of 90% of the data might actually improve our empirical knowledge. Even when the authors of these larger and more precise studies actively select for statistical significance in the desired direction, smaller significant estimates will tend to be reported. Thus, precise studies will, on average, be less biased and thereby possess greater scientific quality, ceteris paribus.

We hope that the statistical paradox identified in this paper refocuses the empirical sciences upon precision. Precision should be universally adopted as one criterion of research quality, regardless of other statistical outcomes.

T.D. Stanley, Stephen B. Jarrell, and Hristos Doucouliagos
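
A minimal sketch of the idea in Python, assuming a toy publication-selection mechanism: each study gets several tries and reports a statistically significant estimate if it can find one. The sample sizes, the number of tries per study, and the selection rule are illustrative assumptions, not the authors’ actual simulation design, but they reproduce the paradox: the average of all reported estimates is inflated, while the average of the most precise 10 percent sits much closer to the true effect.

```python
# Toy illustration of the 'Top10' idea: under publication selection,
# imprecise studies carry larger biases, so averaging only the most
# precise 10 percent of estimates can beat averaging everything.
import numpy as np

rng = np.random.default_rng(0)

true_effect = 0.1      # underlying effect every study tries to estimate
n_studies = 1000

estimates, std_errors = [], []
for _ in range(n_studies):
    n = rng.integers(30, 3000)          # study sample sizes vary widely
    se = 1.0 / np.sqrt(n)               # precision grows with sample size
    draws = rng.normal(true_effect, se, size=20)   # 20 alternative specifications
    significant = draws[draws / se > 1.96]
    # publication selection: report a significant estimate if one exists,
    # otherwise report the first draw
    est = significant[0] if significant.size else draws[0]
    estimates.append(est)
    std_errors.append(se)

estimates = np.array(estimates)
std_errors = np.array(std_errors)

# Conventional summary: simple average of all reported estimates
naive_mean = estimates.mean()

# 'Top10': keep only the most precise 10 percent (smallest standard errors)
cutoff = np.quantile(std_errors, 0.10)
top10_mean = estimates[std_errors <= cutoff].mean()

print(f"true effect:      {true_effect:.3f}")
print(f"average of all:   {naive_mean:.3f}")   # inflated by selection
print(f"average of Top10: {top10_mean:.3f}")   # close to the truth
```

Running the sketch shows why: the large, precise studies clear the significance hurdle without inflation, so restricting the summary to them discards most of the selection bias along with 90% of the data.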

Econometric alchemy

15 June, 2015 at 17:31 | Posted in Statistics & Econometrics | Leave a comment

Thus we have “econometric modelling”, that activity of matching an incorrect version of [the parameter matrix] to an inadequate representation of [the data generating process], using insufficient and inaccurate data. The resulting compromise can be awkward, or it can be a useful approximation which encompasses previous results, throws light on economic theory and is sufficiently constant for prediction, forecasting and perhaps even policy. Simply writing down an “economic theory”, manipulating it to a “condensed form” and “calibrating” the resulting parameters using a pseudo-sophisticated estimator based on poor data which the model does not adequately describe constitutes a recipe for disaster, not for simulating gold! Its only link with alchemy is self-deception.

David Hendry

Berkson’s fallacy (wonkish)

8 June, 2015 at 17:09 | Posted in Statistics & Econometrics | Leave a comment

 

‘Lump to live’ — on the use of categorical models

30 May, 2015 at 10:34 | Posted in Statistics & Econometrics | 2 Comments


Great lecture, Scott!

Causal inference in economics — an elementary introduction

27 May, 2015 at 20:20 | Posted in Statistics & Econometrics | Leave a comment

 

Lyapunov functions and systems attaining equilibria

11 May, 2015 at 20:31 | Posted in Statistics & Econometrics | 1 Comment

 

Hypothesis testing and the importance of checking distribution assumptions

10 May, 2015 at 16:59 | Posted in Statistics & Econometrics | Leave a comment

 

Random walks model thinking

9 May, 2015 at 09:48 | Posted in Statistics & Econometrics | 1 Comment

 

The limits of statistical inference

5 May, 2015 at 15:25 | Posted in Statistics & Econometrics | Leave a comment

Causality in social sciences — and economics — can never solely be a question of statistical inference. Causality entails more than predictability, and really explaining social phenomena in depth requires theory. Analysis of variation — the foundation of all econometrics — can never in itself reveal how these variations are brought about. Only when we are able to tie actions, processes or structures to the statistical relations detected can we say that we are getting at relevant explanations of causation.

For more on these issues — see the chapter “Capturing causality in economics and the limits of statistical inference” in my On the use and misuse of theories and models in economics.
