A quick refresher on mathematical induction (student stuff)

15 Jun, 2013 at 10:28 | Posted in Statistics & Econometrics | Comments Off on A quick refresher on mathematical induction (student stuff)

 

The Swedish school system – curriculum alignment and the skilful art of juking the stats

15 Jun, 2013 at 09:24 | Posted in Education & School | Comments Off on The Swedish school system – curriculum alignment and the skilful art of juking the stats

 

 
When being visited by international colleagues, discussions often turn to questions of how our different national educational systems operate. One thing that for sure gobsmacks my learned friends is when yours truly mentions that the national tests in Sweden are kept secret after they have been held, but that teachers can be in possession of them long before their pupils sit them, and that the grading of the pupils’ tests is done by their own teachers …

Likt en sländas spröda vinge ögat skälver

14 Jun, 2013 at 23:45 | Posted in Varia | Comments Off on Likt en sländas spröda vinge ögat skälver

 

Yours truly on a roll

14 Jun, 2013 at 14:22 | Posted in Varia | 6 Comments

Yours truly launched this blog two years ago. The number of visitors has increased steadily. From having only a couple of hundred visits per month at the start, I now have almost 70,000 visits per month. A blog is surely not a beauty contest, but given the rather “wonkish” character of the blog – with posts mostly on economic theory, statistics, econometrics, theory of science and methodology – it’s rather gobsmacking that so many are interested and take the time to read and comment on it. I am – of course – truly awed, honoured and delighted!

What is neoclassical economics – a rejoinder to Noahpinion

14 Jun, 2013 at 11:07 | Posted in Economics | 22 Comments

Noah Smith – on his blog Noahpinion – writes that when yours truly and other heterodox economists criticize neoclassical economics

the term “neoclassical” gets slung around quite a lot, usually as a perjorative (sic!) … The idea is that “neoclassical” econ is the dominant paradigm, and that the “heterodox” schools are competing paradigms that lost out, and were, to use Kuhn’s terminology, “simply read out of the profession…and subsequently ignored.”

He then goes on to define neoclassical economics with the help of Wikipedia:

“Neoclassical economics is a term variously used for approaches to economics focusing on the determination of prices, outputs, and income distributions in markets through supply and demand, often mediated through a hypothesized maximization of utility by income-constrained individuals and of profits by cost-constrained firms employing available information and factors of production, in accordance with rational choice theory.”

OK, makes sense. Assumption of individual rationality, utility maximization, and supply/demand. One or more of these terms probably describes most of mainstream economic theory.

The basic problem with this definition of neoclassical economics – basically arguing that the differentia specifica of neoclassical economics is its use of demand and supply, utility maximization and rational choice – is that it doesn’t get things quite right. As we all know, there is an endless list of mainstream models that more or less distance themselves from one or another of these characteristics. So the heart of neoclassical economic theory lies elsewhere.

The essence of neoclassical economic theory is its exclusive use of a deductivist Euclidean methodology. A methodology that is more or less imposed as constituting economics, and, usually, without a shred of argument.

The theories and models that neoclassical economists construct describe imaginary worlds, using a combination of formal sign systems such as mathematics and ordinary language. The descriptions made are extremely thin and to a large degree disconnected from the specific contexts of the target system that one (usually) wants to (partially) represent. This is not by chance. These closed formalistic-mathematical theories and models are constructed for the purpose of being able to deliver purportedly rigorous deductions that may somehow be exportable to the target system. By analyzing a few causal factors in their “laboratories”, they hope they can perform “thought experiments” and observe how these factors operate on their own and without impediments or confounders.

Unfortunately, this is not so. The reason is that economic causes never act in a socio-economic vacuum. Causes have to be set in a contextual structure to be able to operate. This structure has to take some form or other, but instead of incorporating structures that are true to the target system, the settings made in economic models are based on formalistic mathematical tractability. In the models they appear as unrealistic assumptions, usually playing a decisive role in making the deductive machinery deliver “precise” and “rigorous” results. This, of course, makes exporting to real-world target systems problematic, since these models – as part of a deductivist covering-law tradition in economics – are thought to deliver general and far-reaching conclusions that are externally valid. But how can we be sure the lessons learned in these theories and models have external validity, when they are based on highly specific unrealistic assumptions? As a rule, the more specific and concrete the structures, the less generalizable the results.

Admitting that we in principle can move from (partial) falsehoods in theories and models to truth in real-world target systems does not take us very far, unless a thorough explication of the relation between theory, model and real-world target system is made. If models assume representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, the warrants for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged are obviously non-justifiable. To have a deductive warrant for things happening in a closed model is no guarantee that they are preserved when applied to an open real-world target system.

Henry Louis Mencken once wrote that “there is always an easy solution to every human problem – neat, plausible and wrong.” And neoclassical economics has indeed been wrong. Its main result, so far, has been to demonstrate the futility of trying to build a satisfactory bridge between formalistic-axiomatic deductivist models and real-world target systems. Assuming, for example, perfect knowledge, instant market clearing and approximating aggregate behaviour with unrealistically heroic assumptions of representative actors just will not do. The assumptions made surreptitiously eliminate the very phenomena we want to study: uncertainty, disequilibrium, structural instability and problems of aggregation and coordination between different individuals and groups.

The punch line is that most of the problems that neoclassical economics is wrestling with issue from its attempts at formalistic modeling of social phenomena per se. Reducing microeconomics to refinements of hyper-rational Bayesian deductivist models is not a viable way forward. It will only sentence the most interesting real-world economic problems to irrelevance. And as someone has so wisely remarked, murder is unfortunately the only way to reduce biology to chemistry – reducing macroeconomics to Walrasian general equilibrium microeconomics basically means committing the same crime.

If scientific progress in economics – as Robert Lucas and other latter-day neoclassical economists seem to think – lies in our ability to tell “better and better stories”, without considering the realm of imagination and ideas a retreat from real-world target systems, one would of course expect our economics journals to be filled with articles supporting the stories with empirical evidence. However, contrary to Noah Smith, I would argue that the journals still show a striking and embarrassing paucity of empirical studies that (try to) substantiate these theoretical claims. Equally amazing is how little one has to say about the relationship between the models and real-world target systems. It is as though explicit discussion, argumentation and justification on the subject were thought not to be required. Neoclassical economic theory is obviously navigating in dire straits.

The latest financial-economic crisis has definitely shown the shortcomings of neoclassical economic theory. What went wrong, according to Paul Krugman, was basically that economists “mistook beauty, clad in impressive-looking mathematics, for truth.” This is certainly true as far as it goes. But it doesn’t go deep enough. Mathematics is just a means towards the goal – modeling the economy as a closed deductivist system.

If the ultimate criterion of success for a deductivist system is the extent to which it predicts and coheres with (parts of) reality, modern neoclassical economics seems to be a hopeless misallocation of scientific resources. To focus scientific endeavours on proving things in models is a gross misapprehension of what an economic theory ought to be about. Deductivist models and methods disconnected from reality are not relevant for predicting, explaining or understanding real-world economic target systems. These systems do not conform to the restricted closed-system structure that the neoclassical modeling strategy presupposes.

Neoclassical economic theory still today consists mainly in investigating economic models. It has long since given up on the real world and contents itself with proving things about thought-up worlds. Empirical evidence only plays a minor role in neoclassical economic theory, where models largely function as substitutes for empirical evidence.

What is wrong with neoclassical economics is not that it employs models per se, but that it employs poor models. They are poor because they do not bridge to the real world target system in which we live. But “facts kick”, and hopefully humbled by the manifest failure of its theoretical pretences, the one-sided, almost religious, insistence on mathematical deductivist modeling as the only scientific activity worthy of pursuing in economics will give way to methodological pluralism based on ontological considerations rather than formalistic tractability.

What stops economists exploring new ideas

13 Jun, 2013 at 17:10 | Posted in Economics | 2 Comments

There are plenty of economists who will happily admit the limits of their discipline, and be nominally open to the idea of other theories. However, I find that when pushed on this, they reveal that they simply cannot think any other way than roughly along the lines of neoclassical economics. My hypothesis is that this is because economists’ approach has a ‘neat and tidy’ feel to it: people are ‘well-behaved’, markets tend to clear, people are, on average, right about things, and so forth. Therefore, economists’ immediate reaction to criticisms is “if not our approach, then what? It would be modelling anarchy!” …
 
 
The economist’s mentality extends up to the highest echelons of economics modelling, and culminates in the ‘DSGE or die’ approach, described well on Noah Smith’s blog by Roger Farmer:

“If one takes the more normal use of disequilibrium to mean agents trading at non-Walrasian prices, … I do not think we should revisit that agenda. Just as in classical and new-Keynesian models where there is a unique equilibrium, the concept of disequilibrium in multiple equilibrium models is an irrelevant distraction.”

When questioned about his approach, Farmer would probably suggest that if we do not assume markets tend to clear, and that agents are, on average, correct, then what exactly do we assume? A harsh evaluation would be to suggest this is really an argument from personal incredulity. There is simply no need to assume markets tend to clear to build a theory – John Maynard Keynes showed us as much in The General Theory, a book economists seem to have a hard time understanding precisely because it doesn’t fit their approach. Furthermore, the physical sciences have shown us that systems can be chaotic but model-able, and even follow recognisable paths …

Ultimately, the only thing stopping economists exploring new ideas is economists. There is a wide breadth of non-equilibrium, non-market clearing and non-rational modelling going on. Economists have a stock of reasons that these are wrong: the Lucas Critique, Milton Friedman’s methodology, the ‘as if‘ argument and so forth. Yet they often fail to listen to the counterarguments to these points and simply use them to defer to their preferred approach. If economists really want to broaden the scope of the discipline rather than merely tweaking it around the edges, they must be prepared to understand how alternative approaches work, and why they can be valid. Otherwise they will continue to give the impression – right or wrong – of ivory tower intellectuals, completely out of touch with reality and closed off from new ideas.

Unlearning Economics
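The ‘chaotic but model-able’ point in the quoted post deserves a short illustration. Here is a minimal sketch of my own (not part of the original post) using the textbook example of deterministic chaos, the logistic map: a one-line, fully specified model that nevertheless exhibits sensitive dependence on initial conditions at parameter r = 4.

```python
# The logistic map x_{n+1} = r * x_n * (1 - x_n) is fully deterministic
# and trivially model-able, yet chaotic at r = 4: two trajectories that
# start a hair apart soon bear no resemblance to each other.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)  # perturb the start by one billionth

early = abs(a[5] - b[5])                                # still tiny
late = max(abs(x - y) for x, y in zip(a[30:], b[30:]))  # order one
print(f"divergence after 5 steps: {early:.2e}, after 30+ steps: {late:.2f}")
```

The moral matches the quoted argument: determinism and predictive modelling do not require assuming that the system settles into a well-behaved equilibrium.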

Chebyshev’s Inequality Theorem (student stuff)

12 Jun, 2013 at 16:21 | Posted in Statistics & Econometrics | Comments Off on Chebyshev’s Inequality Theorem (student stuff)

Chebyshev’s Inequality Theorem – named after the Russian mathematician Pafnuty Chebyshev – states that for any population (or sample), at most 1/k² of the distribution’s values can be more than k standard deviations away from the mean. The beauty of the theorem is that although we may not know the exact distribution of the data – e.g. whether it is normally distributed – we may still say with certitude (since the theorem holds universally) that there are bounds on probabilities!
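A quick numerical sanity check of my own (not part of the original post): since the bound holds for any distribution, we can draw from a decidedly non-normal distribution – here an exponential – and verify that the fraction of values lying more than k standard deviations from the mean never exceeds 1/k².

```python
import random

# Empirical check of Chebyshev's inequality: for ANY data set, at most
# 1/k^2 of the values lie more than k standard deviations from the mean.
# We deliberately use a skewed, non-normal distribution (exponential).
random.seed(42)
data = [random.expovariate(1.0) for _ in range(100_000)]

n = len(data)
mean = sum(data) / n
sd = (sum((x - mean) ** 2 for x in data) / n) ** 0.5

for k in (2, 3, 4):
    outside = sum(1 for x in data if abs(x - mean) > k * sd) / n
    bound = 1 / k ** 2
    print(f"k={k}: fraction outside = {outside:.4f}, Chebyshev bound = {bound:.4f}")
    assert outside <= bound  # guaranteed by the theorem
```

Note that the actual tail fractions come out far below the Chebyshev bounds – the theorem trades tightness for complete generality.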

A quick refresher on Cumulative Distribution Functions (student stuff)

11 Jun, 2013 at 09:44 | Posted in Statistics & Econometrics | Comments Off on A quick refresher on Cumulative Distribution Functions (student stuff)

 

Unemployment benefits and speed limits

10 Jun, 2013 at 19:53 | Posted in Economics | Comments Off on Unemployment benefits and speed limits

One way to think about this is to say that unemployment benefits may, perhaps, reduce the economy’s speed limit, if we think of speed as inversely related to unemployment. And this suggests an analogy. Imagine that you’re driving along a stretch of highway where the legal speed limit is 55 miles an hour. Unfortunately, however, you’re caught in a traffic jam, making an average of just 15 miles an hour. And the guy next to you says, “I blame those bureaucrats at the highway authority — if only they would raise the speed limit to 65, we’d be going 10 miles an hour faster.”

Dumb, right? Well, so is the claim that unemployment benefits are causing today’s high unemployment.

Paul Krugman

Fun with statistics

10 Jun, 2013 at 15:25 | Posted in Statistics & Econometrics | 1 Comment

Yours truly gave a PhD course in statistics for students in education and sports this semester. And between teaching them all about Chebyshev’s Theorem, Beta Distributions, Moment-Generating Functions and the Neyman-Pearson Lemma, I tried to remind them that statistics can actually also be fun …
 

Award the 2013 Nobel Peace Prize to whistleblower Bradley Manning

9 Jun, 2013 at 19:13 | Posted in Politics & Society | Comments Off on Award the 2013 Nobel Peace Prize to whistleblower Bradley Manning

If you witnessed war crimes, if you saw incredible things, awful things, things that belonged in the public domain and not in some server stored in a dark room in Washington, what would you do?
 

Austerity policies – total horseshit

9 Jun, 2013 at 18:36 | Posted in Economics | Comments Off on Austerity policies – total horseshit

 

Microfoundations – neither laws, nor true

8 Jun, 2013 at 17:27 | Posted in Economics | 2 Comments

Oxford professor Simon Wren-Lewis doesn’t agree with Paul Krugman’s statement that

the whole microfoundations crusade is based on one predictive success some 35 years ago; there have been no significant payoffs since.

Why does Wren-Lewis disagree? He writes:

I think the two most important microfoundation led innovations in macro have been intertemporal consumption and rational expectations. I have already talked about the former in an earlier post … [s]o let me focus on rational expectations …  [T]he adoption of rational expectations was not the result of some previous empirical failure. Instead it represented, as Lucas said, a consistency axiom …

I think macroeconomics today is much better than it was 40 years ago as a result of the microfoundations approach. I also argued in my previous post that a microfoundations purist position – that this is the only valid way to do macro – is a mistake. The interesting questions are in between. Can the microfoundations approach embrace all kinds of heterogeneity, or will such models lose their attractiveness in their complexity? Does sticking with simple, representative agent macro impart some kind of bias? Does a microfoundations approach discourage investigation of the more ‘difficult’ but more important issues? Might both these questions suggest a link between too simple a micro based view and a failure to understand what was going on before the financial crash? Are alternatives to microfoundations modelling methodologically coherent? Is empirical evidence ever going to be strong and clear enough to trump internal consistency? These are difficult and often quite subtle questions that any simplistic for and against microfoundations debate will just obscure.

On this argumentation I would like to add the following comments:

(1) The fact that Lucas introduced rational expectations as a consistency axiom is not really an argument for why we should accept it as an acceptable assumption in a theory or model purporting to explain real macroeconomic processes (see e.g. Robert Lucas, rational expectations, and the understanding of business cycles).

(2) “Now virtually any empirical claim in macro is contestable,” Wren-Lewis writes. Yes, but so is virtually any claim in micro (see e.g. When the model is the message – modern neoclassical economics).

(3) To the two questions “Can the microfoundations approach embrace all kinds of heterogeneity, or will such models lose their attractiveness in their complexity?” and “Does sticking with simple, representative agent macro impart some kind of bias?” I would unequivocally answer yes (I have given the reasons why e.g. in David Levine is totally wrong on the rational expectations hypothesis, so I will not repeat the argumentation here).

(4) “Are alternatives to microfoundations modelling methodologically coherent?” Well, I don’t know. But one thing I do know is that the kind of microfoundationalist macroeconomics that New Classical economists in the vein of Lucas and Sargent and the so-called New Keynesian economists in the vein of Mankiw et consortes are pursuing is not methodologically coherent (as I have argued e.g. in What is (wrong with) economic theory?). And that ought to be rather embarrassing for those ilks of macroeconomists to whom axiomatics and deductivity are the hallmark of science tout court.

So in the Wren-Lewis – Krugman discussion on microfoundations I think Krugman is closer to the truth with his remark that

what we call “microfoundations” are not like physical laws. Heck, they’re not even true.

A quick refresher on Probability Density Functions (student stuff)

8 Jun, 2013 at 09:31 | Posted in Statistics & Econometrics | Comments Off on A quick refresher on Probability Density Functions (student stuff)

 

Krugman – more Wicksell than Keynes

6 Jun, 2013 at 20:15 | Posted in Economics | 4 Comments

In a recent blogpost Paul Krugman comes back to his idea that it would be great if the Fed stimulated inflationary expectations so that investments would increase. I don’t have any problem with this idea per se, but I don’t think it’s of the stature that Krugman seems to think. But although I have written extensively on Knut Wicksell and consider him the greatest Swedish economist ever, I definitely – since Krugman portrays himself as “sorta-kinda Keynesian” – have to question his invocation of Knut Wicksell for his ideas on the “natural” rate of interest. Krugman writes (emphasis added):

Start with the very simplest view of how Fed policy affects the economy: the Fed sets short-term interest rates, and other things equal a lower rate leads to higher output; the “natural rate” of interest … is the rate at which output equals potential, that is, at which there are neither inflationary nor deflationary pressures …

What does this tell us? First of all, that there is nothing “artificial” or “unnatural” about low interest rates; they’re low because demand is low, and the Fed is responding appropriately. If anything, the “unnatural” situation is that rates are too high, because they’re constrained by the zero lower bound (rates can’t go below zero, except for some minor technical bobbles, because people can always just hold cash).

Second, the Fed’s inability to get rates as low as they should be justifies a search for policies that can fill this policy gap. Fiscal stimulus is one such policy; unconventional monetary policies of various kinds are another. Actually, the natural policy — natural in a Wicksellian sense, and also the one that in terms of standard economics should produce the least distortion — would be a credible commitment to higher inflation.

Now consider what Keynes himself wrote in General Theory:

In my Treatise on Money I defined what purported to be a unique rate of interest, which I called the natural rate of interest – namely, the rate of interest which, in the terminology of my Treatise, preserved equality between the rate of saving (as there defined) and the rate of investment. I believed this to be a development and clarification of Wicksell’s ‘natural rate of interest’, which was, according to him, the rate which would preserve the stability of some, not quite clearly specified, price-level.

I had, however, overlooked the fact that in any given society there is, on this definition, a different natural rate of interest for each hypothetical level of employment. And, similarly, for every rate of interest there is a level of employment for which that rate is the ‘natural’ rate, in the sense that the system will be in equilibrium with that rate of interest and that level of employment. Thus it was a mistake to speak of the natural rate of interest or to suggest that the above definition would yield a unique value for the rate of interest irrespective of the level of employment. I had not then understood that, in certain conditions, the system could be in equilibrium with less than full employment.

I am now no longer of the opinion that the [Wicksellian] concept of a ‘natural’ rate of interest, which previously seemed to me a most promising idea, has anything very useful or significant to contribute to our analysis. It is merely the rate of interest which will preserve the status quo; and, in general, we have no predominant interest in the status quo as such.


Blog at WordPress.com.