January 20, 2017 — a date that will live in infamy

19 January, 2017 at 18:26 | Posted in Politics & Society | 5 Comments

How do you grieve for a nation? I don’t know.

But one thing I do know is that January 20 will be one of the saddest days I and all my American friends have ever experienced.

That a country that has given us presidents like George Washington, Thomas Jefferson, Abraham Lincoln, and Franklin D. Roosevelt is going to be run by a witless clown like Donald Trump is an absolute disgrace.


Neoliberal a(u)ction

19 January, 2017 at 11:06 | Posted in Politics & Society | 5 Comments

 

Economic models are getting more and more sophisticated — and totally useless

18 January, 2017 at 18:27 | Posted in Economics | Leave a comment

 

Those of us in the economics community who are impolite enough to dare question the preferred methods and models applied in mainstream economics are as a rule met with disapproval. But although people seem to get very agitated and upset by the critique — just read the commentaries on this blog if you don’t believe me — defenders of “received theory” always say that the critique is “nothing new”, that they have always been “well aware” of the problems, and so on, and so on.

So, for the benefit of all mindless practitioners of mainstream economic modeling — who defend mainstream economics with arguments like “the speed with which macro has put finance at the center of its theories of the business cycle has been nothing less than stunning,” who, regarding the patently ridiculous representative-agent modeling, maintain that there “have been efforts to put heterogeneity into big DSGE-type models” but that these models “didn’t get quite as far, because this kind of thing is very technically difficult to model,” who, as for rational expectations, admit that “so far, macroeconomists are still very timid about abandoning this pillar of the Lucas/Prescott Revolution,” but that “there’s no clear alternative” — and who don’t want to be disturbed in their doings, eminent mathematical statistician David Freedman has put together a very practical list of vacuous responses to criticism that can be freely used to save your peace of mind:

We know all that. Nothing is perfect … The assumptions are reasonable. The assumptions don’t matter. The assumptions are conservative. You can’t prove the assumptions are wrong. The biases will cancel. We can model the biases. We’re only doing what everybody else does. Now we use more sophisticated techniques. If we don’t do it, someone else will. What would you do? The decision-maker has to be better off with us than without us … The models aren’t totally useless. You have to do the best you can with the data. You have to make assumptions in order to make progress. You have to give the models the benefit of the doubt. Where’s the harm?

Keynes’ critique of econometrics — as valid today as it was in 1939

17 January, 2017 at 16:40 | Posted in Statistics & Econometrics | 3 Comments

Renowned ‘error-statistician’ Aris Spanos maintains — in a comment on this blog a couple of weeks ago — that Keynes’ critique of econometrics, and of the reliability of inferences made when it is applied, “have been addressed or answered.”

One could, of course, say that, but the evaluation of the statement hinges completely on what we mean by a question or critique being ‘addressed’ or ‘answered’. As I will argue below, Keynes’ critique is still valid and unanswered in the sense that the problems he pointed at are still with us today and ‘unsolved.’ Ignoring them — the most common practice among applied econometricians — is not to solve them.

To apply statistical and mathematical methods to the real-world economy, the econometrician has to make some quite strong assumptions. In a review of Tinbergen’s econometric work — published in The Economic Journal in 1939 — Keynes gave a comprehensive critique, focussing on the limiting and unreal character of the assumptions that econometric analyses build on:

Completeness: Where Tinbergen attempts to specify and quantify which different factors influence the business cycle, Keynes maintains there has to be a complete list of all the relevant factors to avoid misspecification and spurious causal claims. Usually this problem is ‘solved’ by econometricians assuming that they somehow have a ‘correct’ model specification. Keynes is, to put it mildly, unconvinced:

It will be remembered that the seventy translators of the Septuagint were shut up in seventy separate rooms with the Hebrew text and brought out with them, when they emerged, seventy identical translations. Would the same miracle be vouchsafed if seventy multiple correlators were shut up with the same statistical material? And anyhow, I suppose, if each had a different economist perched on his a priori, that would make a difference to the outcome.

J. M. Keynes

Homogeneity: To make inductive inferences possible — and to be able to apply econometrics — the system we try to analyse has to have a large degree of ‘homogeneity.’ According to Keynes most social and economic systems — especially from the perspective of real historical time — lack that ‘homogeneity.’ As he had argued already in Treatise on Probability (ch. 22), it wasn’t always possible to take repeated samples from a fixed population when analysing real-world economies. In many cases there simply are no reasons at all to assume the samples to be homogenous. Lack of ‘homogeneity’ makes the principle of ‘limited independent variety’ non-applicable, and hence makes inductive inference, strictly seen, impossible, since one of its fundamental logical premisses is not satisfied. Without “much repetition and uniformity in our experience” there is no justification for placing “great confidence” in our inductions (TP ch. 8).

And then, of course, there is also the ‘reverse’ variability problem of non-excitation: factors that do not change significantly during the period analysed can still very well be extremely important causal factors.

Stability: Tinbergen assumes there is a stable spatio-temporal relationship between the variables his econometric models analyze. But as Keynes had argued already in his Treatise on Probability it was not really possible to make inductive generalisations based on correlations in one sample. As later studies of ‘regime shifts’ and ‘structural breaks’ have shown us, it is exceedingly difficult to find and establish the existence of stable econometric parameters for anything but rather short time series.
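The point is easy to see in a minimal simulation — my own sketch, with made-up parameters, nothing here is Keynes’s or Tinbergen’s. Let the slope of a relationship shift halfway through the sample, and the coefficient estimated on the full series is a blend that describes neither regime:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: the slope of y on x shifts halfway through the sample.
n = 200
x = rng.normal(size=n)
beta = np.where(np.arange(n) < n // 2, 0.5, 2.0)  # structural break at t = n/2
y = beta * x + rng.normal(scale=0.5, size=n)

def ols_slope(x, y):
    """OLS slope of y on x."""
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

print("full sample :", round(ols_slope(x, y), 2))                    # a blend of the regimes
print("first half  :", round(ols_slope(x[:n // 2], y[:n // 2]), 2))  # ~0.5
print("second half :", round(ols_slope(x[n // 2:], y[n // 2:]), 2))  # ~2.0
```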

Measurability: Tinbergen’s model assumes that all relevant factors are measurable. Keynes questions whether it is possible to adequately quantify and measure things like expectations and political and psychological factors. And more than anything, he questioned — on both epistemological and ontological grounds — whether it was always and everywhere possible to measure real-world uncertainty with the help of probabilistic risk measures. Thinking otherwise can, as Keynes wrote, “only lead to error and delusion.”

Independence: Tinbergen assumes that the variables he treats are independent (still a standard assumption in econometrics). Keynes argues that in such a complex, organic and evolutionary system as an economy, independence is a deeply unrealistic assumption to make. Building econometric models on that kind of simplistic and unrealistic assumptions risks producing nothing but spurious correlations and causalities. Real-world economies are organic systems for which the statistical methods used in econometrics are ill-suited, or even, strictly seen, inapplicable. Mechanical probabilistic models have little leverage when applied to non-atomic evolving organic systems — such as economies.
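The classic Granger–Newbold ‘spurious regression’ experiment makes the risk concrete. A minimal sketch (my own construction and parameter choices): regress one random walk on another, completely independent, one, and the conventional t-test finds a ‘significant’ relationship far more often than the nominal five per cent of the time:

```python
import numpy as np

rng = np.random.default_rng(0)

def spurious_t(n=100):
    """t-statistic from regressing one independent random walk on another."""
    y = np.cumsum(rng.normal(size=n))  # random walk 1
    x = np.cumsum(rng.normal(size=n))  # random walk 2, independent of the first
    X = np.column_stack([np.ones(n), x])       # design matrix with intercept
    b, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS estimates
    resid = y - X @ b
    s2 = resid @ resid / (n - 2)               # residual variance
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return b[1] / se

t_stats = np.array([spurious_t() for _ in range(2000)])
print("share with |t| > 1.96:", (np.abs(t_stats) > 1.96).mean())  # far above 0.05
```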

Building econometric models can’t be a goal in itself. Good econometric models are means that make it possible for us to infer things about the real-world systems they ‘represent.’ If we can’t show that the mechanisms or causes that we isolate and handle in our econometric models are ‘exportable’ to the real world, they are of limited value to our understanding, explanations or predictions of real-world economic systems.

The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be much less simple than the bare principle of uniformity. They appear to assume something much more like what mathematicians call the principle of the superposition of small effects, or, as I prefer to call it, in this connection, the atomic character of natural law. The system of the material universe must consist, if this kind of assumption is warranted, of bodies which we may term (without any implication as to their size being conveyed thereby) legal atoms, such that each of them exercises its own separate, independent, and invariable effect, a change of the total state being compounded of a number of separate changes each of which is solely due to a separate portion of the preceding state. We do not have an invariable relation between particular bodies, but nevertheless each has on the others its own separate and invariable effect, which does not change with changing circumstances, although, of course, the total effect may be changed to almost any extent if all the other accompanying causes are different. Each atom can, according to this theory, be treated as a separate cause and does not enter into different organic combinations in each of which it is regulated by different laws …

The scientist wishes, in fact, to assume that the occurrence of a phenomenon which has appeared as part of a more complex phenomenon, may be some reason for expecting it to be associated on another occasion with part of the same complex. Yet if different wholes were subject to laws qua wholes and not simply on account of and in proportion to the differences of their parts, knowledge of a part could not lead, it would seem, even to presumptive or probable knowledge as to its association with other parts. Given, on the other hand, a number of legally atomic units and the laws connecting them, it would be possible to deduce their effects pro tanto without an exhaustive knowledge of all the coexisting circumstances.

Linearity: To make his models tractable, Tinbergen assumes the relationships between the variables he studies to be linear. This is still standard procedure today, but as Keynes writes:

It is a very drastic and usually improbable postulate to suppose that all economic forces are of this character, producing independent changes in the phenomenon under investigation which are directly proportional to the changes in themselves; indeed, it is ridiculous.

To Keynes it was a ‘fallacy of reification’ to assume that all quantities are additive (an assumption closely linked to independence and linearity).

The unpopularity of the principle of organic unities shows very clearly how great is the danger of the assumption of unproved additive formulas. The fallacy, of which ignorance of organic unity is a particular instance, may perhaps be mathematically represented thus: suppose f(x) is the goodness of x and f(y) is the goodness of y. It is then assumed that the goodness of x and y together is f(x) + f(y) when it is clearly f(x + y) and only in special cases will it be true that f(x + y) = f(x) + f(y). It is plain that it is never legitimate to assume this property in the case of any given function without proof.

J. M. Keynes “Ethics in Relation to Conduct” (1903)
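A one-line numerical illustration of the fallacy (my example, not Keynes’s): take f(x) = x². Then

\[ f(1) + f(1) = 2 \;\neq\; f(1+1) = 4 , \]

so evaluating the parts and summing gives a different answer from evaluating the whole; the two coincide only for the special case of linear f.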

And as even one of the founding fathers of modern econometrics — Trygve Haavelmo — wrote:

What is the use of testing, say, the significance of regression coefficients, when maybe, the whole assumption of the linear regression equation is wrong?
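Haavelmo’s worry is easy to reproduce. A minimal sketch (my construction, with made-up numbers): generate data from a purely quadratic relationship and fit a straight line; the slope comes out near zero and ‘insignificant’, while x in fact almost completely determines y:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 500
x = rng.normal(size=n)
y = x**2 + rng.normal(scale=0.5, size=n)  # true relation is purely quadratic

# OLS of y on x with an intercept: the linear slope is ~0 and 'insignificant'.
X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b
se = np.sqrt((resid @ resid / (n - 2)) * np.linalg.inv(X.T @ X)[1, 1])

print("linear slope t-stat:", round(b[1] / se, 2))                    # close to 0
print("corr(x^2, y)       :", round(np.corrcoef(x**2, y)[0, 1], 2))   # close to 1
```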

Real-world social systems are usually not governed by stable causal mechanisms or capacities. The kinds of ‘laws’ and relations that econometrics has established are laws and relations about entities in models that presuppose causal mechanisms and variables — and the relationships between them — to be linear, additive, homogenous, stable, invariant and atomistic. But when causal mechanisms operate in the real world, they only do it in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. Since statisticians and econometricians — as far as I can see — haven’t been able to convincingly warrant their assumptions of homogeneity, stability, invariance, independence and additivity as being ontologically isomorphic to real-world economic systems, Keynes’ critique is still valid. As long as — as Keynes writes in a letter to Frisch in 1935 — “nothing emerges at the end which has not been introduced expressly or tacitly at the beginning,” I remain doubtful of the scientific aspirations of econometrics.

In his critique of Tinbergen, Keynes points us to the fundamental logical, epistemological and ontological problems of applying statistical methods to a basically unpredictable, uncertain, complex, unstable, interdependent, and ever-changing social reality. Methods designed to analyse repeated sampling in controlled experiments under fixed conditions are not easily extended to an organic and non-atomistic world where time and history play decisive roles.

Econometric modeling should never be a substitute for thinking. From that perspective it is really depressing to see how much of Keynes’ critique of the pioneering econometrics in the 1930s-1940s is still relevant today.

The general line you take is interesting and useful. It is, of course, not exactly comparable with mine. I was raising the logical difficulties. You say in effect that, if one was to take these seriously, one would give up the ghost in the first lap, but that the method, used judiciously as an aid to more theoretical enquiries and as a means of suggesting possibilities and probabilities rather than anything else, taken with enough grains of salt and applied with superlative common sense, won’t do much harm. I should quite agree with that. That is how the method ought to be used.

Keynes, letter to E.J. Broster, December 19, 1939

Calvo pricing — a ‘New Keynesian’ fairytale

16 January, 2017 at 18:50 | Posted in Economics | Leave a comment

Thus your standard New Keynesian model will use Calvo pricing and model the current inflation rate as tightly coupled to the present value of expected future output gaps. Is this a requirement anyone really wants to put on the model intended to help us understand the world that actually exists out there? Thus your standard New Keynesian model will calculate the expected path of consumption as the solution to some Euler equation plus an intertemporal budget constraint, with current wealth and the projected real interest rate path as the only factors that matter. This is fine if you want to demonstrate that the model can produce macroeconomic pathologies. But is it a not-stupid thing to do if you want your model to fit reality?

I remember attending the first lecture in Tom Sargent’s evening macroeconomics class back when I was an undergraduate: a very smart man from whom I have learned an enormous amount, and well deserving of his Nobel Prize. But…

He said … we were going to build a rigorous, micro founded model of the demand for money: We would assume that everyone lived for two periods, worked in the first period when they were young and sold what they produced to the old, held money as they aged, and then when they were old used their money to buy the goods newly produced by the new generation of young. Tom called this “microfoundations” and thought it gave powerful insights into the demand for money that you could not get from money-in-the-utility-function models.

I thought that it was a just-so story, and that whatever insights it purchased for you were probably not things you really wanted to buy. I thought it was dangerous to presume that you understood something because you had “microfoundations” when those microfoundations were wrong. After all, Ptolemaic astronomy had microfoundations: Mercury moved more rapidly than Saturn because the Angel of Mercury beat his wings more rapidly than the Angel of Saturn and because Mercury was lighter than Saturn…

Brad DeLong

Brad DeLong is of course absolutely right here, and one could only wish that other ‘New Keynesian’ macroeconomists would take a similar critical approach to their own modeling endeavours …
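For readers who want the jargon unpacked, the ‘tight coupling’ DeLong refers to is the New Keynesian Phillips curve that Calvo pricing delivers: solved forward, it makes current inflation a discounted sum of all expected future output gaps, while the consumption path follows from an intertemporal Euler equation. In standard textbook notation (not DeLong’s own):

\[ \pi_t = \beta\,\mathbb{E}_t[\pi_{t+1}] + \kappa x_t \quad\Longrightarrow\quad \pi_t = \kappa \sum_{k=0}^{\infty} \beta^k\,\mathbb{E}_t[x_{t+k}], \qquad u'(c_t) = \beta\,\mathbb{E}_t\bigl[(1+r_t)\,u'(c_{t+1})\bigr], \]

where \( \pi_t \) is inflation, \( x_t \) the output gap, \( \beta \) the discount factor, \( \kappa \) a composite of the Calvo price-stickiness parameter, and \( r_t \) the real interest rate.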

On the non-validity of incremental validity

16 January, 2017 at 17:04 | Posted in Statistics & Econometrics | 1 Comment

A common goal of statistical analysis in the social sciences is to draw inferences about the relative contributions of different variables to some outcome variable. When regressing academic performance, political affiliation, or vocabulary growth on other variables, researchers often wish to determine which variables matter to the prediction and which do not—typically by considering whether each variable’s contribution remains statistically significant after statistically controlling for other predictors. When a predictor variable in a multiple regression has a coefficient that differs significantly from zero, researchers typically conclude that the variable makes a “unique” contribution to the outcome. And because measured variables are typically viewed as proxies for latent constructs of substantive interest—for example, two cognitive ability measures might be taken to index spatial versus verbal ability—it is natural to generalize the operational conclusion to the latent variable level; that is, to conclude that the latent construct measured by a given predictor variable itself has incremental validity in predicting the outcome, over and above other latent constructs that were examined.

Incremental validity claims pervade the social and biomedical sciences. In some fields, these claims are often explicit … More commonly, however, incremental validity claims are implicit—as when researchers claim that they have statistically “controlled” or “adjusted” for putative confounds—a practice that is exceedingly common in fields ranging from epidemiology to econometrics to behavioral neuroscience … The sheer ubiquity of such appeals might well give one the impression that such claims are unobjectionable, and if anything, represent a foundational tool for drawing meaningful scientific inferences.

Unfortunately, incremental validity claims can be deeply problematic. As we demonstrate below, even small amounts of error in measured predictor variables can result in extremely poorly calibrated Type 1 error probabilities. This basic problem has been discussed in a number of literatures—most extensively, in epidemiology and biostatistics, where concerns about incremental validity claims are often discussed under the heading of residual confounding, but also in fields ranging from psychology to education to econometrics. The common thread is that measurement unreliability and model misspecification will often have a deleterious and large effect on parameter estimates (and associated error rates) when covariates are entered into regression-based models. Consequently, under realistic assumptions, it can be shown that a large proportion of incremental validity claims in many disciplines are likely to be false …

In any given analysis, there is a simple fact of the matter as to whether or not the unique contribution of one or more variables in a regression is statistically significant when controlling for other variables; what room is there for inferential error? Trouble arises, however, when researchers behave as if statistical conclusions obtained at the level of observed measures can be automatically generalized to the level of latent constructs — a near-ubiquitous move, given that most scientists are not interested in prediction purely for prediction’s sake, and typically choose their measures precisely so as to stand in for latent constructs of interest. That is, researchers typically do not care to show that, say, school vouchers are associated with improved academic performance after controlling for a specific survey item asking about respondents’ income bracket; rather, the goal is to show that the vouchers may improve performance after accounting for the general construct of income (or, more generally, socioeconomic status).

Jacob Westfall & Tal Yarkoni

Lindeberg-Levy CLT (student stuff)

15 January, 2017 at 19:47 | Posted in Statistics & Econometrics | Leave a comment

 
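For reference, the theorem itself: if X₁, X₂, … are i.i.d. random variables with mean μ and finite variance σ², then the centred and scaled sample mean converges in distribution to a normal,

\[ \sqrt{n}\,\bigl(\bar{X}_n - \mu\bigr) \xrightarrow{\;d\;} N\!\bigl(0,\sigma^2\bigr), \qquad \bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i . \]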

A philosophical look at economics

15 January, 2017 at 13:57 | Posted in Economics | 2 Comments

Owing to the elegance of explanations in natural science, scientists in other disciplines are likely to be tempted to emulate this success. While that is a good thing in the sense of striving to observe the four criteria of coherence, correspondence, practicality and economy, it is not a good thing if scientists doing life or social science become unmindful of the limitations imposed by the subject matter of their discipline. The result will be application of inappropriate methodology and exaggerated claims.

Science is generally divided into natural, life, and social science based on subject matter. Science in general aims at causal understanding of a universe in which natural, life and social sciences are aspects of various phenomena.

Consilience is also a requirement in doing science. Scientific theories are expected to corroborate, and not contradict, each other, in that science aims at a general explanation of “reality.”

As a philosopher looking at economics as an amateur, it appears to me that many economists are careless about applying the above criteria and therefore overstate their claims and likely overestimate their knowledge as being scientific instead of speculative, and objective rather than interested.

This is even before getting into measurement and historical issues. There really needs to be more attention paid to the philosophy of science and the philosophy of social science in economics, and much more work is needed in the philosophy (foundations) of economics, which is underdeveloped since so few people have contributed to it. Indeed, a lot of what passes for philosophy of economics now is mostly ideology. Yet conventional economists claim that methodological questions are settled. NOT!

Tom Hickey

Econometric fundamentalism

15 January, 2017 at 09:23 | Posted in Statistics & Econometrics | 2 Comments

The wide conviction of the superiority of the methods of the science has converted the econometric community largely to a group of fundamentalist guards of mathematical rigour. It is often the case that mathematical rigour is held as the dominant goal and the criterion for research topic choice as well as research evaluation, so much so that the relevance of the research to business cycles is reduced to empirical illustrations. To that extent, probabilistic formalization has trapped econometric business cycle research in the pursuit of means at the expense of ends.


Once the formalization attempts have gone significantly astray from what is needed for analysing and forecasting the multi-faceted characteristics of business cycles, the research community should hopefully make appropriate ‘error corrections’ of its overestimation of the power of a priori postulated models as well as its underestimation of the importance of the historical approach, or the ‘art’ dimension of business cycle research.

Duo Qin, A History of Econometrics (OUP 2013)

And one day there will be no more frontier (personal)

14 January, 2017 at 15:10 | Posted in Varia | 1 Comment


In loving memory of my brother, Peter ‘Uncas’ Pålsson.
