‘Modern’ economics — blah blah blah

23 May, 2017 at 16:37 | Posted in Statistics & Econometrics | 2 Comments

A key part of the solution to the identification problem that Lucas and Sargent (1979) seemed to offer was that mathematical deduction could pin down some parameters in a simultaneous system. But solving the identification problem means feeding facts with truth values that can be assessed, yet math cannot establish the truth value of a fact. Never has. Never will.

In practice, what math does is let macro-economists locate the FWUTVs [facts with unknown truth values] farther away from the discussion of identification … Relying on a micro-foundation lets an author say, “Assume A, assume B, … blah blah blah …. And so we have proven that P is true. Then the model is identified.” …

Distributional assumptions about error terms are a good place to bury things because hardly anyone pays attention to them. Moreover, if a critic does see that this is the identifying assumption, how can she win an argument about the true expected value of the level of aether? If the author can make up an imaginary variable, “because I say so” seems like a pretty convincing answer to any question about its properties.

Paul Romer

Yes, indeed, modern mainstream economics, and especially its mathematical-statistical operationalization in the form of econometrics, fails miserably over and over again. One reason it does is that the error term in the regression models used is thought of as representing the effect of the variables omitted from the model. The error term is treated as a ‘cover-all’ term for the omitted content, included to ‘save’ the assumed deterministic relation between the other random variables in the model. Error terms are usually assumed to be orthogonal to (uncorrelated with) the explanatory variables. But since the true errors are unobservable, the orthogonality assumption is impossible to test empirically. And without a justification of that assumption, there is as a rule nothing to ensure identifiability.
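This untestability is easy to demonstrate. Ordinary least squares makes the fitted residuals orthogonal to the included regressors by construction, so the residuals will always ‘pass’ an orthogonality check, whatever the true error term is doing. A minimal numpy sketch, with an invented data-generating process purely for illustration:

```python
# Hypothetical illustration: the true error u is deliberately correlated
# with the regressor x, so the OLS slope is biased, yet the fitted
# residuals still come out orthogonal to x by construction.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000

x = rng.normal(size=n)
u = 0.8 * x + rng.normal(size=n)   # 'error' correlated with x: assumption violated
y = 1.0 + 2.0 * x + u              # true slope is 2.0

X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta_hat

print(beta_hat[1])                       # ~2.8, far from the true 2.0
print(np.corrcoef(x, residuals)[0, 1])   # ~0: residuals look orthogonal regardless
```

The fitted residuals tell us nothing about the true errors, so the orthogonality assumption has to be defended on substantive grounds; it cannot be read off the regression output.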

In mainstream econometrics the error term is usually portrayed as representing the combined effect of the variables omitted from the model. What one does not say, in a way bordering on intellectual dishonesty, is that this interpretation only works when (1) the combined effect is independent of each and every variable included in the model, and (2) the expected value of the combined effect equals zero. And those conditions are almost never fulfilled in real-world settings!
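Both conditions are easy to watch fail in a simulation. In the hypothetical sketch below, an omitted variable z is folded into the error term: because z is correlated with the included x, condition (1) fails and the slope is biased; because z has a non-zero mean, condition (2) fails and the intercept is off as well. All the numbers are made up for illustration.

```python
# Omitted-variable bias sketch: z is left out of the regression and so
# ends up inside the 'error term', violating both textbook conditions.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

z = rng.normal(loc=0.5, size=n)    # omitted variable with E[z] = 0.5, not 0
x = 0.6 * z + rng.normal(size=n)   # x correlated with the omitted z
y = 1.0 + 2.0 * x + 3.0 * z + rng.normal(size=n)   # true slope 2.0, intercept 1.0

X = np.column_stack([np.ones(n), x])   # z omitted: it now lives in the error term
(intercept, slope), *_ = np.linalg.lstsq(X, y, rcond=None)

# Theory predicts slope -> 2.0 + 3.0 * cov(x, z) / var(x), about 3.3 here
print(slope)      # ~3.3, not the true 2.0: condition (1) violated
print(intercept)  # ~2.1, not the true 1.0: both violations leak into it
```

Nothing in the data warns the researcher that this has happened; the regression output looks perfectly ordinary.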

‘Modern’ mainstream economics is based on the belief that deductive-axiomatic modelling is a sufficient guide to truth. That belief is totally unfounded as long as no reasons are supplied for believing in the assumptions on which the model-based deductions and conclusions build. ‘Mathiness’ masquerading as science is often used by mainstream economists to hide the problematic character of the assumptions in their theories and models. Without showing the model assumptions to be realistic and relevant, that kind of economics produces, as Romer puts it, nothing but “blah blah blah.”


2 Comments

  1. Axioms, assumptions and definitions must all precede the modelling process. Then come deductions and analysis. This is the scientific method for developing a theory or hypothesis, and given the way we have to think as humans, we have no alternative but to adopt it. Past attempts that skip it are merely intuitive and, in science, valueless. Of course, if we are not being scientific, then …………….

  2. How does the “Solow residual” fit into this classification? I thought it was the amount of productivity growth the model could not account for (thus in effect an “error term”), but it certainly doesn’t have an expectation of zero.

