Was ist Kausalität? (What is causality?)
31 Jan, 2022 at 16:30 | Posted in Theory of Science & Methodology | Comments Off on Was ist Kausalität?
In search of identification — instrumental variables
30 Jan, 2022 at 10:21 | Posted in Statistics & Econometrics | Comments Off on In search of identification — instrumental variables
We need relevance and validity. How realistic is validity, anyway? We ideally want our instrument to behave just like randomization in an experiment. But in the real world, how likely is that to actually happen? Or, if it’s an IV that requires control variables to be valid, how confident can we be that the controls really do everything we need them to?
In the long-ago times, researchers were happy to use instruments without thinking too hard about validity. If you go back to the 1970s or 1980s you can find people using things like parental education as an instrument for your own (surely your parents’ education can’t possibly affect your outcomes except through your own education!). It was the wild west out there…
But these days, go to any seminar where an instrumental variables paper is presented and you’ll hear no end of worries and arguments about whether the instrument is valid. And as time goes on, it seems like people have gotten more and more difficult to convince when it comes to validity. This focus on validity is good, but sometimes comes at the expense of thinking about other IV considerations, like monotonicity (we’ll get there) or even basic stuff like how good the data is.
There’s good reason to be concerned! Not only is it hard to justify that there exists a variable strongly related to treatment that somehow isn’t at all related to all the sources of hard-to-control-for back doors that the treatment had in the first place, we also have plenty of history of instruments that we thought sounded pretty good that turned out not to work so well.
Nick Huntington-Klein’s new book on how to use observational data to make causal inferences is superbly accessible. Highly recommended reading for anyone interested in causal inference in economics and social science!
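To make the relevance and validity requirements concrete, here is a minimal simulation sketch (my own illustration, not from the book; all names and numbers are made up) in which an instrument that is valid by construction recovers a treatment effect that naive OLS gets wrong:

```python
import numpy as np

# Illustrative simulation (not from Huntington-Klein's book): an unobserved
# confounder u biases OLS, while an instrument z, valid here by construction,
# recovers the true effect of the treatment x on the outcome y.
rng = np.random.default_rng(0)
n = 100_000
true_effect = 1.0

u = rng.normal(size=n)                        # unobserved confounder (back door)
z = rng.normal(size=n)                        # instrument: relevant, independent of u
x = 0.5 * z + u + rng.normal(size=n)          # treatment driven by z and by u
y = true_effect * x + 2.0 * u + rng.normal(size=n)  # outcome also hit by u

ols = np.cov(x, y)[0, 1] / np.var(x)          # naive OLS slope, biased upward by u
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]  # IV (Wald) estimate with one instrument

print(f"OLS estimate: {ols:.2f}")   # roughly 1.9, far from the true effect of 1.0
print(f"IV estimate:  {iv:.2f}")    # close to 1.0
```

The fragility is the point: the instrument works here only because the simulation makes z independent of the confounder by fiat, which is precisely the assumption that can never be verified in observational data.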
Anwar Shaikh on alternatives to mainstream economics
30 Jan, 2022 at 09:21 | Posted in Economics | 3 Comments
Does economics — really — explain everything?
29 Jan, 2022 at 17:49 | Posted in Economics | 1 Comment
Fred Lee on ‘non-knowledge’ mainstream economics
28 Jan, 2022 at 16:38 | Posted in Economics | 1 Comment
The methodological underpinning of neoclassical microeconomics is open to criticism. The methodological approach of neoclassical economics is based on a pre-vision of supply and demand and/or a Walrasian general equilibrium, all combined with scarcity and constrained maximization. Accepting this vision as a matter of faith, neoclassical economists construct axiomatic-based arguments via a deductivist methodology (with or without the use of mathematics) to articulate this pre-vision. There is no attempt to establish that the pre-vision has any connection to, or is grounded in, the actual capitalist economy it purports to explain. Hence the method of constructing theory is not tied to or informed by the real world, which means that the axioms qua assumptions used are not chosen because of their realism, or because they are in some other way grounded in reality, but solely because they contribute to articulating the pre-vision. With a methodology unconcerned with the real world, the theories derived from it are theoretically vacuous and hence not really explanations. They are in fact non-knowledge. Consequently, the methodology of neoclassical economics is not just wrong, it is also misleading in that it cannot inherently provide any understanding of how the real world works or even predict outcomes in the real world.
Fred was, together with Nai Pew Ong, Bob Pollin and Axel Leijonhufvud, one of those who made a visit to the University of California such a great experience for a young economics scholarship holder back at the beginning of the 1980s. Yours truly especially remembers our long and intense discussions on Sraffa and Neo-Ricardianism. It is now more than seven years since Fred passed away. I truly miss this open-minded and good-hearted heterodox economist.
Songs from a world apart
28 Jan, 2022 at 14:20 | Posted in Varia | Comments Off on Songs from a world apart
The Holy Grail of Science
28 Jan, 2022 at 14:01 | Posted in Theory of Science & Methodology | 1 Comment

Traditionally, philosophers have focused mostly on the logical template of inference. The paradigm case has been deductive inference, which is topic-neutral and context-insensitive. The study of deductive rules has engendered the search for the Holy Grail: syntactic and topic-neutral accounts of all prima facie reasonable inferential rules. The search has hoped to find rules that are transparent and algorithmic, and whose following will just be a matter of grasping their logical form. Part of the search for the Holy Grail has been to show that the so-called scientific method can be formalised in a topic-neutral way. We are all familiar with Carnap’s inductive logic, or Popper’s deductivism or the Bayesian account of scientific method.
There is no Holy Grail to be found. There are many reasons for this pessimistic conclusion. First, it is questionable that deductive rules are rules of inference. Second, deductive logic is about updating one’s belief corpus in a consistent manner and not about what one has reasons to believe simpliciter. Third, as Duhem was the first to note, the so-called scientific method is far from algorithmic and logically transparent. Fourth, all attempts to advance coherent and counterexample-free abstract accounts of scientific method have failed. All competing accounts seem to capture some facets of scientific method, but none can tell the full story. Fifth, though the new Dogma, Bayesianism, aims to offer a logical template (Bayes’s theorem plus conditionalisation on the evidence) that captures the essential features of non-deductive inference, it is betrayed by its topic-neutrality. It supplements deductive coherence with the logical demand for probabilistic coherence among one’s degrees of belief. But this extended sense of coherence is (almost) silent on what an agent must infer or believe.
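For readers who want the 'logical template' spelled out, the core Bayesian machinery referred to here is just Bayes's theorem plus conditionalisation on the evidence (notation mine):

```latex
% Bayes's theorem for a hypothesis H and evidence E
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}

% Conditionalisation: on learning E, the new degree of belief in H becomes
P_{\mathrm{new}}(H) = P_{\mathrm{old}}(H \mid E)
```

Nothing in these two formulas says where the priors and likelihoods should come from, which is why the template remains (almost) silent on what an agent must actually infer or believe.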
Does drinking cause you to become a man?
26 Jan, 2022 at 19:16 | Posted in Statistics & Econometrics | 2 Comments

Breaking news! Using advanced multiple nonlinear regression models similar to those in recent news stories on alcohol and dairy, and more than 3.6M observations from 1997 through 2012, I have found that drinking more causes people to turn into men!
Across people drinking 0-7 drinks per day, each drink per day causes the drinker’s probability of being a man to increase by 10.02 percentage points (z=302.2, p<0.0001). Need I say, profound implications for public health policy follow. The Lancet here I come!
Econometric identification sure is difficult …
Great tweet!
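A minimal sketch (my own, with purely illustrative numbers) of how a regression of this kind manufactures an absurdly precise 'effect' when the causal arrow in fact runs the other way, from sex to drinking:

```python
import numpy as np

# Illustrative only: sex determines drinking here (men drink more on average),
# not the other way around, yet regressing "is a man" on drinks per day
# produces a large and extremely precise "causal effect" of drinking.
rng = np.random.default_rng(1)
n = 3_600_000                                           # mimicking the 3.6M observations above

man = rng.integers(0, 2, size=n).astype(float)          # fixed trait
drinks = np.clip(rng.poisson(lam=1 + 2 * man), 0, 7)    # 0-7 drinks per day

X = np.column_stack([np.ones(n), drinks])               # intercept + drinks
beta, *_ = np.linalg.lstsq(X, man, rcond=None)
print(f"Each extra drink per day 'raises' P(man) by {beta[1] * 100:.1f} percentage points")
```

The coefficient is estimated with enormous precision; what fails is identification, not arithmetic.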
Laplace’s rule of succession and Bayesian priors
26 Jan, 2022 at 15:37 | Posted in Statistics & Econometrics | 5 Comments

After their first night in paradise, and having seen the sun rise in the morning, Adam and Eve were wondering whether they would experience another sunrise. Given the rather restricted sample of sunrises experienced, what could they expect? According to Laplace’s rule of succession, the probability of an event E happening after it has occurred n times is p(E|n) = (n+1)/(n+2).
The probabilities can be calculated using Bayes’ rule, but to get the calculations going, Adam and Eve must have an a priori probability (a base rate) to start with. The Bayesian rule of thumb is to simply assume that all outcomes are equally likely. Applying this rule, Adam and Eve’s probabilities become 1/2, 2/3, 3/4 …
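For completeness, a short derivation sketch (standard textbook material, not in the original post): with a uniform prior over the unknown sunrise probability p, Bayes’ rule gives

```latex
P(E_{n+1} \mid n \text{ sunrises})
  = \frac{\int_0^1 p \cdot p^{\,n}\, dp}{\int_0^1 p^{\,n}\, dp}
  = \frac{1/(n+2)}{1/(n+1)}
  = \frac{n+1}{n+2},
```

which for n = 0, 1, 2 gives exactly the 1/2, 2/3, 3/4 sequence above. Everything hinges on the uniform prior, which is exactly where the trouble starts.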
Now this might seem rather straightforward, but as e.g. Keynes (1921) already noted in his Treatise on Probability, there might be a problem here. The problem has to do with the prior probability and where it is assumed to come from. Is the appeal of the principle of insufficient reason — the principle of indifference — really warranted?
Assume there is a certain quantity of liquid containing wine and water mixed so that the ratio of wine to water (r) is between 1/3 and 3/1. What is then the probability that r ≤ 2? The principle of insufficient reason means that we have to treat all r-values as equiprobable, assigning a uniform probability distribution between 1/3 and 3/1, which gives the probability of r ≤ 2 = [(2-1/3)/(3-1/3)] = 5/8.
But to say r ≤ 2 is equivalent to saying that 1/r ≥ 1/2. Applying the principle now, however, gives the probability of 1/r ≥ 1/2 = [(3-1/2)/(3-1/3)] = 15/16. So we seem to get two different answers that both follow from the same application of the principle of insufficient reason. Given this unsolved paradox, we have reason to stick with Keynes and be skeptical of Bayesianism.
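A quick numerical check of the paradox (my own sketch): put a uniform distribution on r, then on 1/r, and ask for the probability of the same event.

```python
import numpy as np

# Illustrative check of the wine/water paradox: the event "r <= 2" is the same
# as "1/r >= 1/2", but a uniform prior on r and a uniform prior on 1/r assign
# it different probabilities.
rng = np.random.default_rng(42)
m = 1_000_000

r = rng.uniform(1/3, 3, size=m)   # principle of indifference applied to r
s = rng.uniform(1/3, 3, size=m)   # principle of indifference applied to 1/r instead

print(f"P(r <= 2), r uniform:       {np.mean(r <= 2):.3f}   (exact 5/8  = 0.625)")
print(f"P(1/r >= 1/2), 1/r uniform: {np.mean(s >= 0.5):.3f}   (exact 15/16 = 0.9375)")
```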
Beta distribution (student stuff)
25 Jan, 2022 at 09:15 | Posted in Statistics & Econometrics | Comments Off on Beta distribution (student stuff)
He ain’t heavy, he’s my brother
24 Jan, 2022 at 15:29 | Posted in Varia | 1 Comment
In loving memory of my brother Peter.
Twenty years have passed.
People say time heals all wounds.
I wish that was true.
But some wounds never heal — you just learn to live with the scars.
But in dreams,
I can hear your name.
And in dreams,
We will meet again.

When the seas and mountains fall
And we come to end of days,
In the dark I hear a call
Calling me there
I will go there
And back again.
Models and the need to validate assumptions
24 Jan, 2022 at 09:09 | Posted in Economics | 1 Comment

Piketty argues that the higher income share of wealth-owners is due to an increase in the capital-output ratio resulting from a high rate of capital accumulation. The evidence suggests just the contrary. The capital-output ratio, as conventionally measured, has either fallen or been constant in recent decades. The apparent increase in the capital-output ratio identified by Piketty is a valuation effect reflecting a disproportionate increase in the market value of certain real assets. A more plausible explanation for the increased income share of wealth-owners is an unduly low rate of investment in real capital.
Say we have a diehard neoclassical model (assuming the production function is homogeneous of degree one and unlimited substitutability), such as the standard Cobb-Douglas production function (with A a given productivity parameter, and k the ratio of capital stock to labour, K/L):

y = Ak^α,

with a constant investment share λ out of output y and a constant depreciation rate δ of the 'capital per worker' k. The rate of accumulation of k is then

Δk = λy − δk = λAk^α − δk.

In steady state (denoted by *) we have λAk*^α = δk*, giving λ/δ = k*/y* and k* = (λA/δ)^(1/(1−α)). Putting this value of k* into the production function gives the steady-state output per worker

y* = Ak*^α = A^(1/(1−α)) (λ/δ)^(α/(1−α)).

Assuming exogenous Harrod-neutral technological progress that increases y at a growth rate g (with a zero labour growth rate, and with y and k now redefined as y/A and k/A respectively, so that the production function reads y = k^α), we get

dk/dt = λy − (g + δ)k,

which in the Cobb-Douglas case gives dk/dt = λk^α − (g + δ)k, with steady-state value k* = (λ/(g + δ))^(1/(1−α)) and capital-output ratio k*/y* = k*/k*^α = λ/(g + δ). Using Piketty's preferred model, with output and capital given net of depreciation, the final expression instead becomes k*/y* = λ/(g + λδ).

Now what Piketty predicts is that g will fall and that this will increase the capital-output ratio. Say we have δ = 0.03, λ = 0.1 and g = 0.03 initially. This gives a capital-output ratio of around 3. If g falls to 0.01, it rises to around 7.7. We reach analogous results if we use a basic CES production function with an elasticity of substitution σ > 1. With σ = 1.5, the capital share rises from 0.2 to 0.36 if the wealth-income ratio goes from 2.5 to 5, which according to Piketty is what has actually happened in rich countries during the last forty years.
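Running the numbers from the set-up above (a minimal sketch; the parameter values are the ones given in the text):

```python
# Steady-state capital-output ratio in the simple Cobb-Douglas set-up above.
# Gross formulation:          k*/y* = lambda / (g + delta)
# Piketty's net formulation:  k*/y* = lambda / (g + lambda * delta)

def capital_output_ratio(lam, g, delta, net=False):
    return lam / (g + lam * delta) if net else lam / (g + delta)

lam, delta = 0.1, 0.03
for g in (0.03, 0.01):
    gross = capital_output_ratio(lam, g, delta)
    net = capital_output_ratio(lam, g, delta, net=True)
    print(f"g = {g:.2f}:  gross {gross:.2f},  net {net:.2f}")

# In the net formulation the ratio rises from about 3.0 to about 7.7 when g
# falls from 0.03 to 0.01, which is the mechanism behind Piketty's prediction.
```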
Being able to show that you can get these results using one or another of the available standard neoclassical growth models is of course, from a realist point of view, of limited value. As usual, the really interesting question is how well the assumptions you make and the numerical values you put into the model specification accord with reality.
Professor Piketty chose a theoretical framework that simultaneously allowed him to produce catchy numerical predictions, in tune with his empirical findings, while soaring like an eagle above the ‘messy’ debates of political economists shunned by their own profession’s mainstream and condemned diligently to inquire, in pristine isolation, into capitalism’s radical indeterminacy. The fact that, to do this, he had to adopt axioms that are both grossly unrealistic and logically incoherent must have seemed to him a small price to pay.
DAG thinking (student stuff)
21 Jan, 2022 at 10:30 | Posted in Statistics & Econometrics | 2 Comments
Mainstream economics in denial
20 Jan, 2022 at 11:08 | Posted in Economics | 1 Comment

We’d gathered at Downing College, Cambridge, to discuss the economic crisis, although the quotidian misery of that topic seemed a world away from the honeyed quads and endowment plush of this place.
Equally incongruous were the speakers. The Cambridge economist Victoria Bateman looked as if saturated fat wouldn’t melt in her mouth, yet demolished her colleagues. They’d been stupidly cocky before the crash – remember the 2003 boast from Nobel prizewinner Robert Lucas that the “central problem of depression-prevention has been solved”? – and had learned no lessons since. Yet they remained the seers of choice for prime ministers and presidents. She ended: “If you want to hang anyone for the crisis, hang me – and my fellow economists.”
What followed was angry agreement. On the night before the latest growth figures, no one in this 100-strong hall used the word “recovery” unless it was to be sarcastic. Instead, audience members – middle-aged, smartly dressed and doubtless sizably mortgaged – took it in turn to attack bankers, politicians and, yes, economists. They’d created the mess everyone else was paying for, yet they’d suffered no retribution …
Yet look around at most of the major economics degree courses and neoclassical economics – that theory that treats humans as walking calculators, all-knowing and always out for themselves, and markets as inevitably returning to stability – remains in charge. Why? In a word: denial. The high priests of economics refuse to recognise the world has changed.
In his new book, Never Let a Serious Crisis Go to Waste, the US economist Philip Mirowski recounts how a colleague at his university was asked by students in spring 2009 to talk about the crisis. The world was apparently collapsing around them, and what better forum to discuss this in than a macroeconomics class. The response? “The students were curtly informed that it wasn’t on the syllabus, and there was nothing about it in the assigned textbook, and the instructor therefore did not wish to diverge from the set lesson plan. And he didn’t.”