Regression to the mean

26 July, 2015 at 14:21 | Posted in Statistics & Econometrics | Leave a comment

I had one of the most satisfying eureka experiences of my career while teaching flight instructors … about the psychology of effective training. I was telling them about an important principle of skill training: rewards for improved performance work better than punishment of mistakes…

When I finished my enthusiastic speech, one of the most seasoned instructors in the group raised his hand and made a short speech of his own. He began by conceding that rewarding improved performance might be good for the birds, but he denied that it was optimal for flight cadets. This is what he said: “On many occasions I have praised flight cadets for clean execution of some aerobatic maneuver, and the next time they try the same maneuver they usually do worse. On the other hand, I have often screamed into a cadet’s earphone for bad execution, and in general he does better on his next try. So please don’t tell us that reward works and punishment does not, because the opposite is the case” …

What he had observed is known as regression to the mean, which in that case was due to random fluctuations in the quality of performance. Naturally, he praised only a cadet whose performance was far better than average. But the cadet was probably just lucky on that particular attempt and therefore likely to deteriorate regardless of whether or not he was praised. Similarly, the instructor would shout into a cadet’s earphones only when the cadet’s performance was unusually bad and therefore likely to improve regardless of what the instructor did. The instructor had attached a causal interpretation to the inevitable fluctuations of a random process …

I had stumbled onto a significant fact of the human condition: the feedback to which life exposes us is perverse. Because we tend to be nice to other people when they please us and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty …

It took Francis Galton several years to figure out that correlation and regression are not two concepts – they are different perspectives on the same concept: whenever the correlation between two scores is imperfect, there will be regression to the mean …

Causal explanations will be evoked when regression is detected, but they will be wrong because the truth is that regression to the mean has an explanation but does not have a cause.
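The mechanism Kahneman describes is easy to check with a few lines of simulation. In the minimal sketch below (my own illustration, not Kahneman’s: each attempt is a stable skill plus independent luck, and the thresholds and sample size are arbitrary), feedback has no effect whatsoever, yet praised cadets appear to get worse and screamed-at cadets appear to improve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each attempt = stable skill + independent luck; feedback changes nothing.
n = 100_000
skill = rng.normal(0, 1, n)
attempt1 = skill + rng.normal(0, 1, n)
attempt2 = skill + rng.normal(0, 1, n)

praised  = attempt1 > 1.5    # unusually good first attempt -> instructor praises
screamed = attempt1 < -1.5   # unusually bad first attempt  -> instructor screams

change = attempt2 - attempt1
print("mean change after praise:", change[praised].mean())    # negative: they 'deteriorate'
print("mean change after scream:", change[screamed].mean())   # positive: they 'improve'
```

In this toy model the correlation between two attempts is 0.5, so the expected second attempt is only half as extreme as the first — which is exactly Galton’s point above: imperfect correlation and regression to the mean are the same thing seen from two angles.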

So, you think statistics is boring? Well, you’re wrong!

23 June, 2015 at 09:17 | Posted in Statistics & Econometrics | 1 Comment

 

The way forward — discard 90% of the data!

21 June, 2015 at 20:02 | Posted in Statistics & Econometrics | Leave a comment

Could it be better to discard 90% of the reported research? Surprisingly, the answer to this statistical paradox is yes. This paper has shown how publication selection can greatly distort the research record and its conventional summary statistics. Using both Monte Carlo simulations and actual research examples, we show how a simple estimator, which uses only 10 percent of the reported research, reduces publication bias and improves efficiency over conventional summary statistics that use all the reported research.

The average of the most precise 10 percent, ‘Top10,’ of the reported estimates of a given empirical phenomenon is often better than conventional summary estimators because of its heavy reliance on the reported estimate’s precision (i.e., the inverse of the estimate’s standard error). When estimates are chosen, in part, for their statistical significance, studies cursed with imprecise estimates have to engage in more intense selection from among alternative statistical techniques, models, data sets, and measures to produce the larger estimate that statistical significance demands. Thus, imprecise estimates will contain larger biases.

Studies that have access to more data will tend to be more precise, and hence less biased. At the level of the original empirical research, the statistician’s motto, “the more data the better,” holds because more data typically produce more precise estimates. It is only at the meta-level of integrating, summarizing, and interpreting an entire area of empirical research (meta-analysis) that the removal of 90% of the data might actually improve our empirical knowledge. Even when the authors of these larger and more precise studies actively select for statistical significance in the desired direction, smaller, yet significant, estimates will tend to be reported. Thus, precise studies will, on average, be less biased and thereby possess greater scientific quality, ceteris paribus.

We hope that the statistical paradox identified in this paper refocuses the empirical sciences upon precision. Precision should be universally adopted as one criterion of research quality, regardless of other statistical outcomes.

T.D. Stanley, Stephen B. Jarrell, and Hristos Doucouliagos
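For concreteness, here is a minimal sketch of the ‘Top10’ idea quoted above: rank the reported estimates by precision (the inverse of the standard error) and average only the most precise 10 percent. The data-generating assumptions below (a true effect of zero, a one-sided significance filter on imprecise studies) are my own illustration, not the simulation design of Stanley, Jarrell and Doucouliagos.

```python
import numpy as np

def top10(estimates, std_errors, share=0.10):
    """Average of the most precise `share` of reported estimates (precision = 1/SE)."""
    estimates = np.asarray(estimates, dtype=float)
    std_errors = np.asarray(std_errors, dtype=float)
    k = max(1, int(np.ceil(share * len(estimates))))
    most_precise = np.argsort(std_errors)[:k]    # smallest standard errors first
    return estimates[most_precise].mean()

# Hypothetical literature: the true effect is 0 and every study is unbiased,
# but imprecise studies only get reported when they clear t > 1.96.
rng = np.random.default_rng(1)
se = rng.uniform(0.05, 1.0, 500)
est = rng.normal(0.0, se)                        # unbiased estimates around zero
reported = (est / se > 1.96) | (se < 0.15)       # precise studies report regardless
est, se = est[reported], se[reported]

print("naive mean of all reported estimates:", est.mean())      # pulled upwards
print("Top10 (most precise 10%):", top10(est, se))               # close to zero
```

Because the most precise studies need the least selection to reach significance, averaging only them strips out most of the publication bias that contaminates the naive mean.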

Econometric alchemy

15 June, 2015 at 17:31 | Posted in Statistics & Econometrics | Leave a comment

Thus we have “econometric modelling”, that activity of matching an incorrect version of [the parameter matrix] to an inadequate representation of [the data generating process], using insufficient and inaccurate data. The resulting compromise can be awkward, or it can be a useful approximation which encompasses previous results, throws light on economic theory and is sufficiently constant for prediction, forecasting and perhaps even policy. Simply writing down an “economic theory”, manipulating it to a “condensed form” and “calibrating” the resulting parameters using a pseudo-sophisticated estimator based on poor data which the model does not adequately describe constitutes a recipe for disaster, not for simulating gold! Its only link with alchemy is self-deception.

David Hendry

Berkson’s fallacy (wonkish)

8 June, 2015 at 17:09 | Posted in Statistics & Econometrics | Leave a comment

 

‘Lump to live’ — on the use of categorical models

30 May, 2015 at 10:34 | Posted in Statistics & Econometrics | 2 Comments


Great lecture, Scott!

Causal inference in economics — an elementary introduction

27 May, 2015 at 20:20 | Posted in Statistics & Econometrics | Leave a comment

 

Lyapunov functions and systems attaining equilibria

11 May, 2015 at 20:31 | Posted in Statistics & Econometrics | 1 Comment

 

Hypothesis testing and the importance of checking distribution assumptions

10 May, 2015 at 16:59 | Posted in Statistics & Econometrics | Leave a comment

 

Random walks model thinking

9 May, 2015 at 09:48 | Posted in Statistics & Econometrics | 1 Comment

 

