What stops economists exploring new ideas

13 June 2013 at 17:10 | Published in Economics | 2 comments

There are plenty of economists who will happily admit the limits of their discipline and be nominally open to the idea of other theories. However, I find that when pushed on this, they reveal that they simply cannot think in any way other than roughly along the lines of neoclassical economics. My hypothesis is that this is because the economists’ approach has a ‘neat and tidy’ feel to it: people are ‘well-behaved’; markets tend to clear; people are, on average, right about things; and so forth. Therefore, economists’ immediate reaction to criticisms is “if not our approach, then what? It would be modelling anarchy!” …
 
 
The economists’ mentality extends to the highest echelons of economic modelling, culminating in the ‘DSGE or die’ approach, described well on Noah Smith’s blog by Roger Farmer:

”If one takes the more normal use of disequilibrium to mean agents trading at non-Walrasian prices, … I do not think we should revisit that agenda. Just as in classical and new-Keynesian models where there is a unique equilibrium, the concept of disequilibrium in multiple equilibrium models is an irrelevant distraction.”

When questioned about his approach, Farmer would probably ask: if we do not assume that markets tend to clear and that agents are, on average, correct, then what exactly do we assume? A harsh evaluation would be to suggest that this is really an argument from personal incredulity. There is simply no need to assume that markets tend to clear to build a theory – John Maynard Keynes showed us as much in The General Theory, a book economists seem to have a hard time understanding precisely because it doesn’t fit their approach. Furthermore, the physical sciences have shown us that systems can be chaotic but model-able, and even follow recognisable paths …

Ultimately, the only thing stopping economists exploring new ideas is economists. There is a broad range of non-equilibrium, non-market-clearing and non-rational modelling going on. Economists have a stock of reasons why these are wrong: the Lucas Critique, Milton Friedman’s methodology, the ‘as if’ argument and so forth. Yet they often fail to listen to the counterarguments to these points and simply use them to fall back on their preferred approach. If economists really want to broaden the scope of the discipline rather than merely tweaking it around the edges, they must be prepared to understand how alternative approaches work, and why they can be valid. Otherwise they will continue to give the impression – right or wrong – of ivory-tower intellectuals, completely out of touch with reality and closed off from new ideas.

Unlearning Economics

Chebyshev’s Inequality Theorem (student stuff)

12 June 2013 at 16:21 | Published in Statistics & Econometrics | Comments off on Chebyshev’s Inequality Theorem (student stuff)

Chebyshev’s Inequality Theorem – named after the Russian mathematician Pafnuty Chebyshev – states that for a population (or sample), at most 1/k² of the distribution’s values can be more than k standard deviations away from the mean. The beauty of the theorem is that although we may not know the exact distribution of the data – e.g. whether it is normally distributed – we may still say with certitude (since the theorem holds universally) that there are bounds on probabilities!
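
To see what the bound buys us, here is a minimal Python sketch (my own illustration, not part of the original post) comparing the empirical share of observations lying more than k standard deviations from the mean with Chebyshev’s bound 1/k², on a deliberately skewed, non-normal sample:

    import numpy as np

    # Minimal sketch (not from the post): check Chebyshev's bound
    # P(|X - mu| > k*sigma) <= 1/k^2 on a skewed, clearly non-normal sample.
    rng = np.random.default_rng(1)                      # arbitrary seed
    sample = rng.exponential(scale=2.0, size=100_000)   # exponential data

    mu, sigma = sample.mean(), sample.std()

    for k in (1.5, 2, 3, 4):
        empirical = np.mean(np.abs(sample - mu) > k * sigma)
        print(f"k = {k}: empirical tail share {empirical:.4f} "
              f"<= Chebyshev bound {1 / k**2:.4f}")

The bound is loose, but it holds no matter what distribution we feed in – which is exactly the point of the theorem.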

A quick refresher on Cumulative Distribution Functions (student stuff)

11 June 2013 at 09:44 | Published in Statistics & Econometrics | Comments off on A quick refresher on Cumulative Distribution Functions (student stuff)

 

Unemployment benefits and speed limits

10 June 2013 at 19:53 | Published in Economics | Comments off on Unemployment benefits and speed limits

One way to think about this is to say that unemployment benefits may, perhaps, reduce the economy’s speed limit, if we think of speed as inversely related to unemployment. And this suggests an analogy.

Imagine that you’re driving along a stretch of highway where the legal speed limit is 55 miles an hour. Unfortunately, however, you’re caught in a traffic jam, making an average of just 15 miles an hour. And the guy next to you says, “I blame those bureaucrats at the highway authority — if only they would raise the speed limit to 65, we’d be going 10 miles an hour faster.”

Dumb, right? Well, so is the claim that unemployment benefits are causing today’s high unemployment.

Paul Krugman

Fun with statistics

10 June 2013 at 15:25 | Published in Statistics & Econometrics | 1 comment

Yours truly gave a PhD course in statistics for students in education and sports this semester. And between teaching them all about Chebyshev’s Theorem, Beta Distributions, Moment-Generating Functions and the Neyman-Pearson Lemma, I tried to remind them that statistics can actually also be fun …
 

Award the 2013 Nobel Peace Prize to whistleblower Bradley Manning

9 June 2013 at 19:13 | Published in Politics & Society | Comments off on Award the 2013 Nobel Peace Prize to whistleblower Bradley Manning

If you witnessed war crimes, if you saw incredible things, awful things, things that belonged in the public domain and not in some server stored in a dark room in Washington, what would you do?
 

Austerity policies – total horseshit

9 June 2013 at 18:36 | Published in Economics | Comments off on Austerity policies – total horseshit

 

Microfoundations – neither laws, nor true

8 June 2013 at 17:27 | Published in Economics | 2 comments

Oxford professor Simon Wren-Lewis doesn’t agree with Paul Krugman’s statement that

the whole microfoundations crusade is based on one predictive success some 35 years ago; there have been no significant payoffs since.

Why does Wren-Lewis disagree? He writes:

I think the two most important microfoundation led innovations in macro have been intertemporal consumption and rational expectations. I have already talked about the former in an earlier post … [s]o let me focus on rational expectations …  [T]he adoption of rational expectations was not the result of some previous empirical failure. Instead it represented, as Lucas said, a consistency axiom …

I think macroeconomics today is much better than it was 40 years ago as a result of the microfoundations approach. I also argued in my previous post that a microfoundations purist position – that this is the only valid way to do macro – is a mistake. The interesting questions are in between. Can the microfoundations approach embrace all kinds of heterogeneity, or will such models lose their attractiveness in their complexity? Does sticking with simple, representative agent macro impart some kind of bias? Does a microfoundations approach discourage investigation of the more ‘difficult’ but more important issues? Might both these questions suggest a link between too simple a micro based view and a failure to understand what was going on before the financial crash? Are alternatives to microfoundations modelling methodologically coherent? Is empirical evidence ever going to be strong and clear enough to trump internal consistency? These are difficult and often quite subtle questions that any simplistic for and against microfoundations debate will just obscure.

To this line of argument I would like to add the following comments:

(1) The fact that Lucas introduced rational expectations as a consistency axiom is not really an argument for why we should accept it as an assumption in a theory or model purporting to explain real macroeconomic processes (see e.g. Robert Lucas, rational expectations, and the understanding of business cycles).

(2) ”Now virtually any empirical claim in macro is contestable,” Wren-Lewis writes. Yes, but so too is virtually any claim in micro (see e.g. When the model is the message – modern neoclassical economics).

(3) To the two questions ”Can the microfoundations approach embrace all kinds of heterogeneity, or will such models lose their attractiveness in their complexity?” and ”Does sticking with simple, representative agent macro impart some kind of bias?” I would unequivocally answer yes (I have given the reasons why e.g. in David Levine is totally wrong on the rational expectations hypothesis, so I will not repeat the argumentation here).

(4) ”Are alternatives to microfoundations modelling methodologically coherent?” Well, I don’t know. But one thing I do know is that the kind of microfoundationalist macroeconomics that New Classical economists in the vein of Lucas and Sargent and the so-called New Keynesian economists in the vein of Mankiw et consortes are pursuing is not methodologically coherent (as I have argued e.g. in What is (wrong with) economic theory?). And that ought to be rather embarrassing for those ilks of macroeconomists to whom axiomatics and deductivity are the hallmarks of science tout court.

So in the Wren-Lewis – Krugman discussion on microfoundations I think Krugman is closer to the truth with his remark that

what we call “microfoundations” are not like physical laws. Heck, they’re not even true.

A quick refresher on Probability Density Functions (student stuff)

8 June 2013 at 09:31 | Published in Statistics & Econometrics | Comments off on A quick refresher on Probability Density Functions (student stuff)

 

Krugman – more Wicksell than Keynes

6 June 2013 at 20:15 | Published in Economics | 4 comments

In a recent blog post Paul Krugman comes back to his idea that it would be great if the Fed stimulated inflationary expectations so that investment would increase. I don’t have any problem with this idea per se, but I don’t think it’s of the stature that Krugman seems to think. And although I have written extensively on Knut Wicksell and consider him the greatest Swedish economist ever, I definitely have to question – since Krugman portrays himself as ”sorta-kinda Keynesian” – his invocation of Wicksell in support of his ideas on the ”natural” rate of interest. Krugman writes (emphasis added):

Start with the very simplest view of how Fed policy affects the economy: the Fed sets short-term interest rates, and other things equal a lower rate leads to higher output; the “natural rate” of interest … is the rate at which output equals potential, that is, at which there are neither inflationary nor deflationary pressures …

What does this tell us? First of all, that there is nothing “artificial” or “unnatural” about low interest rates; they’re low because demand is low, and the Fed is responding appropriately. If anything, the “unnatural” situation is that rates are too high, because they’re constrained by the zero lower bound (rates can’t go below zero, except for some minor technical bobbles, because people can always just hold cash).

Second, the Fed’s inability to get rates as low as they should be justifies a search for policies that can fill this policy gap. Fiscal stimulus is one such policy; unconventional monetary policies of various kinds are another. Actually, the natural policy — natural in a Wicksellian sense, and also the one that in terms of standard economics should produce the least distortion — would be a credible commitment to higher inflation.

Now consider what Keynes himself wrote in The General Theory:

In my Treatise on Money I defined what purported to be a unique rate of interest, which I called the natural rate of interest – namely, the rate of interest which, in the terminology of my Treatise, preserved equality between the rate of saving (as there defined) and the rate of investment. I believed this to be a development and clarification of Wicksell’s ‘natural rate of interest’, which was, according to him, the rate which would preserve the stability of some, not quite clearly specified, price-level.

I had, however, overlooked the fact that in any given society there is, on this definition, a different natural rate of interest for each hypothetical level of employment. And, similarly, for every rate of interest there is a level of employment for which that rate is the ‘natural’ rate, in the sense that the system will be in equilibrium with that rate of interest and that level of employment. Thus it was a mistake to speak of the natural rate of interest or to suggest that the above definition would yield a unique value for the rate of interest irrespective of the level of employment. I had not then understood that, in certain conditions, the system could be in equilibrium with less than full employment.

I am now no longer of the opinion that the [Wicksellian] concept of a ‘natural’ rate of interest, which previously seemed to me a most promising idea, has anything very useful or significant to contribute to our analysis. It is merely the rate of interest which will preserve the status quo; and, in general, we have no predominant interest in the status quo as such.

History repeats itself, first as tragedy, second as farce

6 June 2013 at 18:51 | Published in Economics, Politics & Society | 4 comments

These are the days: I stopped reading ‘The Economic Consequences of the Peace’ to read the IMF report on Greece. Did anything change (emphasis added)?

The IMF on Greece, 2013:

One way to make the debt outlook more sustainable would have been to attempt to restructure the debt from the beginning. However, PSI was not part of the original program. This was in contrast with the Fund program in Uruguay in 2002 and Jamaica in 2011 where PSI was announced upfront … Yet in Greece, on the eve of the program, the authorities dismissed debt restructuring as a “red herring” that was off the table for the Greek government and had not been proposed by the Fund … In fact, debt restructuring had been considered by the parties to the negotiations but had been ruled out by the euro area … Some Eurozone partners emphasized moral hazard arguments against restructuring. A rescue package for Greece that incorporated debt restructuring would likely have difficulty being approved, as would be necessary, by all the euro area parliaments … Nonetheless, many commentators considered debt restructuring to be inevitable. With debt restructuring off the table, Greece faced two alternatives: default immediately, or move ahead as if debt restructuring could be avoided. The latter strategy was adopted, but in the event, this only served to delay debt restructuring and allowed many private creditors to escape.

Keynes, ‘The Economic Consequences of the Peace’, about the negotiations in 1919:

As soon as it was admitted that it was in fact impossible to make Germany pay the expenses of both sides, and that the unloading of their liabilities upon the enemy was not practicable, the position of the Ministers of Finance of France and Italy became untenable. Thus a scientific consideration of Germany’s capacity to pay was from the outset out of court. The expectations which the exigencies of politics had made it necessary to raise were so very remote from the truth that a slight distortion of figures was no use, and it was necessary to ignore the facts entirely. The resulting unveracity was fundamental. On a basis of so much falsehood it became impossible to erect any constructive financial policy which was workable. For this reason amongst others, a magnanimous financial policy was essential.

Real-World Economics Review Blog

Barkley Rosser on the crisis predictions debate

6 June 2013 at 09:59 | Published in Economics | 1 comment

I probably should not dredge around in this further, but there has been in the last week or so a major outpouring of discussion about ”who predicted the crisis,” with a lot of wrangling and some less than pleasant comments being strewn about. I got dragged into it and should probably leave it alone, but more keeps coming, and I also think there might be a link to developments here at this blog.

Anyway, the main volley, and still the main center of this discussion, was set off by Noah Smith … He laid out three conditions for having really predicted it fully – actually four. The main three were to have called the housing bubble, then to have called the broader financial market collapse deriving from the collapse of the housing bubble, and then to have called that there would be a deep recession with a long stagnation afterwards. Oh, and to really qualify for him, one needed to do this with a fully specified model based on data. He basically argued that nobody fully satisfied all these criteria, although he makes lots of favorable remarks about Dean Baker, accurately crediting him, I think, with having first called the housing bubble back in 2002 and with arguing that its collapse would lead to a recession – though Dean did not call a broader financial collapse or how deep the recession would be, and he did not have a fully specified data-based model for his forecasts, although he did use data. I basically agree with this.

Now most of the rest of the post, after throwing out a few other names (not including me), focused on Steve Keen, who has quite prominently claimed to have called the crisis. Noah basically jumped all over Steve, even confessing to not liking him personally based on his tweets. He admits that Steve had an interesting Minsky-based model in 1995, which I happen to be a fan of, but argued that it was not based on data and had various characteristics (such as lots of cyclicity) that made it not all that good for actually forecasting the crisis in the way he prescribed. At least some of this is true, although I commented on the thread that I thought he was overdoing the harshness of his criticisms of Steve, particularly the personal ones. I have always found Steve to be personable and lots of fun in person, even though he definitely argues hard, thereby annoying many. I also noted that Steve has been the victim of a purge at the University of Western Sydney, although I did not fully spell it out there. But he has. His department was eliminated as of the end of March, and there is little question that this was done specifically to get rid of him. One can dump on Steve Keen all one wants, but he is currently unemployed as a result of a vendetta by orthodox types in the Australian establishment against him, which is a scandal as far as I am concerned …

This led me to enter the fray, talking about how I had made predictions on the old Maxspeak about the housing bubble, but that these were not available due to the archives being sealed, although they were initially inspired by Dean Baker and later supported by data from Shiller in his Irrational Exuberance, 2nd edition, Chap. 2 (did not mention that, but it is true). I also noted that though I did not blog on it early, I gave talks, in March and December 2007, about the link between the housing bubble and global financial markets, forecasting a major crash (those are the dates of the speeches; I did not pick a time for the crash), with this insight coming from a Sept. 2006 speech by Timothy Geithner in Hong Kong, of all people. I then called that the crash was coming soon in a post here that was … based on a model of mine with Antonio Palestrini and Mauro Gallegati, published in Macroeconomic Dynamics in 2011, although the first draft was out in 2005. Somehow, nobody was interested in publishing an ABM of Minsky dynamics prior to the crash. It was this model that Jamie cited when he mentioned me.

More specifically, that model was of the three different kinds of crashes that can come out of a bubble, identified initially by Minsky and picked up by Charles Kindleberger in his classic _Manias, Panics, and Crashes_: a sudden fall from the peak; a gradual decline from the peak with no hard crash; and one with a ”period of financial distress”, wherein there is a gradual decline for a while after the peak, followed by a hard crash – by far the most common historical pattern according to Kindleberger. I said in July 2008 that the global financial markets were in such a period of distress, which had started in the summer of 2007, varying slightly depending on the market, and that it looked like a hard crash – the Minsky Moment – was probably coming pretty soon, which indeed happened in mid-September after the failure of Lehman Brothers.

I have since, with Gallegati and Marina Rosser, published a paper in 2012 in the Journal of Economic Issues noting how other bubbles of the period fit into this. Housing looked like the gradual decline, roughly paralleling its rise, something one would expect from real estate, particularly residential real estate, where people resist selling their homes when prices are falling. Oil, by contrast, followed the sudden-crash scenario in 2008, falling from $147 per barrel in July to around $30 in November; commodities are often more likely to follow this scenario than the other two. I admitted in my comment at Noah’s that I did not anywhere forecast the depth of the recession or the length of the recovery.

Then, deep in the debate, Robert Vienneau linked to a paper from a couple of years ago by Jamie Galbraith entitled ”Who Are These Economists Anyway?” I do not have a solid link to it, but it is linked to in another blogpost out yesterday by Lars Syll, http://rwer.wordpress.com/2013/06/04/bashing-crises-predictions . Jamie’s piece mentions another set of candidates for ”who predicted it”, including Marxists Patrick Bond and Robert Brennan, Keynesians such as Wynne Godley and people at the Levy Institute, Minsky non-linearians, including the late Peter Albin, me, and Ping Chen, and then ”the new criminologists” who focus on institutions and fraud, Gary Dymski and Bill Black, whom he saw as following his late father’s ideas. (I don’t think he mentioned Steve, but maybe I just did not read closely enough.)

This would be the most recent post by Lars Syll, which on the one hand I appreciate, but on the other I think he does not quite have things right. He bashes Noah for ”bashing the predictors” and then names five people: Dean Baker, Dirk Bezemer, Nouriel Roubini, me, and Steve Keen. Now, Noah certainly did bash Steve, but I do not think this characterizes what he said about the rest of us. He simply linked to Bezemer’s paper, but did not really comment on it at length other than to argue that none of those listed by Bezemer fulfilled his own criteria, even if all of them got at least parts of it right. He is not all that unfavorable to Roubini, although he thinks that his mechanism for the recession was off, involving a crash of the dollar, when just the opposite happened at the Minsky Moment, with Bernanke doing a Fed save by buying up $600 billion of eurotrash to prop up the collapsing euro, which was quietly rolled over during the next six months into MBSs (Noah did not spell out all those last details; I did). Dean Baker he actually said good things about, mostly, even if Dean did not have a full model. And he really did not comment on my post other than to say that Dean was one of those who did not constantly beat his own drum. I am doing so here, although I have not done so super-duper often in the past. As it is, I have to say that Lars overdid how hard Noah came down on all of us, although Noah definitely came down hard on Steve Keen big time, and promised in some comments to come down hard on Peter Schiff some other time.

I shall close this by noting some quotes that Lars pulled from the end of Jamie’s paper – quotes Jamie had originally written back in 2000, doing his own ”claiming”, I suppose :-). Just a few, and I think he is right on this. He speaks of ”a kind of Politburo of correct economic thinking” that rules the profession. ”They predict disaster where none occurs. They deny the possibility of events that then happen.” But, ”No one of them loses face, in the club, for having been wrong. No one is disinvited from presenting papers at later annual meetings. And still less is anyone from the outside invited in.” And this remains pretty much true to this day.

Barkley Rosser

Truth as replicability (wonkish)

6 June 2013 at 09:30 | Published in Statistics & Econometrics | Comments off on Truth as replicability (wonkish)

Much of statistical practice is an effort to reduce or deny variation and uncertainty. The reduction is done through standardization, replication, and other practices of experimental design, with the idea being to isolate and stabilize the quantity being estimated and then average over many cases. Even so, however, uncertainty persists, and statistical hypothesis testing is in many ways an endeavor to deny this, by reporting binary accept/reject decisions.

Classical statistical methods produce binary statements, but there is no reason to assume that the world works that way. Expressions such as Type 1 error, Type 2 error, false positive, and so on, are based on a model in which the world is divided into real and non-real effects. To put it another way, I understand the general scientific distinction of real vs. non-real effects but I do not think this maps well into the mathematical distinction of θ=0 vs. θ≠0. Yes, there are some unambiguously true effects and some that are arguably zero, but I would guess that the challenge in most current research in psychology is not that effects are zero but that they vary from person to person and in different contexts.

But if we do not want to characterize science as the search for true positives, how should we statistically model the process of scientific publication and discovery? An empirical approach is to identify scientific truth with replicability; hence, the goal of an experimental or observational scientist is to discover effects that replicate in future studies.

The replicability standard seems to be reasonable. Unfortunately … researchers in psychology (and, presumably, in other fields as well) seem to have no problem replicating and getting statistical significance, over and over again, even in the absence of any real effects of the size claimed by the researchers …

As a student many years ago, I heard about opportunistic stopping rules, the file drawer problem, and other reasons why nominal p-values do not actually represent the true probability that observed data are more extreme than what would be expected by chance. My impression was that these problems represented a minor adjustment and not a major reappraisal of the scientific process. After all, given what we know about scientists’ desire to communicate their efforts, it was hard to imagine that there were file drawers bulging with unpublished results.

More recently, though, there has been a growing sense that psychology, biomedicine, and other fields are being overwhelmed with errors (consider, for example, the generally positive reaction to the paper of Ioannidis, 2005). In two recent series of papers, Gregory Francis and Uri Simonsohn and collaborators have demonstrated too-good-to-be-true patterns of p-values in published papers, indicating that these results should not be taken at face value.

Andrew Gelman
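
Gelman’s point about opportunistic stopping rules is easy to reproduce. Here is a minimal simulation sketch of my own (not taken from Gelman): the null hypothesis is true in every run, yet peeking at the data after each new batch of observations and stopping at the first p < 0.05 pushes the share of ”significant” results well above the nominal five per cent:

    import numpy as np
    from scipy import stats

    # Sketch of optional stopping under a true null: the data are pure noise,
    # but we run a t-test after every batch and stop as soon as p < 0.05.
    rng = np.random.default_rng(42)          # arbitrary seed

    def peeking_run(max_n=200, batch=10, alpha=0.05):
        data = np.empty(0)
        while data.size < max_n:
            data = np.append(data, rng.normal(0.0, 1.0, size=batch))
            _, p = stats.ttest_1samp(data, popmean=0.0)
            if p < alpha:
                return True                  # "significant" despite a true null
        return False

    n_sims = 2_000
    rate = sum(peeking_run() for _ in range(n_sims)) / n_sims
    print(f"share of nominally significant runs: {rate:.3f}")
    # With roughly twenty peeks per run, this lands well above the nominal 0.05.

That the nominal p-value no longer means what it claims to mean is, of course, precisely Gelman’s point.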

John Maynard Keynes

5 June 2013 at 11:19 | Published in Varia | Comments off on John Maynard Keynes

Today is the 130th birthday of the great economist:
 
 
John Maynard Keynes (5 June 1883 – 21 April 1946)

Paul Krugman – a Bastard Keynesian

5 June 2013 at 08:15 | Published in Economics | Comments off on Paul Krugman – a Bastard Keynesian

Some decades ago the British economist Joan Robinson – one of John Maynard Keynes’ most brilliant students, who helped him with the original draft of his General Theory – half-jokingly referred to some of her colleagues as “Bastard Keynesians”. These colleagues were mostly American Keynesians, but there were a few British Bastard Keynesians too – such as John Hicks, who invented the now famous IS-LM diagram. What Robinson was trying to say was that these so-called Keynesians were fatherless in the sense that they should not be recognised as legitimately belonging to the Keynesian family. The Bastard Keynesians, in turn, generally assumed that this criticism implied some sort of Keynesian fundamentalism on the part of the British school. They seemed to assume that Robinson and her colleagues were just being obscurantist snobs.

Such a misinterpretation exists to this day. The second- and third-generation Bastard Keynesians – which include many of those who generally collect under the title “New Keynesian” – have reinforced this criticism. Paul Krugman, for example, in response to criticisms that he was misrepresenting the work of Keynes and his follower Hyman Minsky, wrote:

”So, first of all, my basic reaction to discussions about What Minsky Really Meant — and, similarly, to discussions about What Keynes Really Meant — is, I Don’t Care. I mean, intellectual history is a fine endeavor. But for working economists the reason to read old books is for insight, not authority; if something Keynes or Minsky said helps crystallize an idea in your mind — and there’s a lot of that in both men’s writing — that’s really good, but if where you take the idea is very different from what the great man said somewhere else in his book, so what? This is economics, not Talmudic scholarship.”

This is a classic misrepresentation of those who accuse Krugman and his ilk of Bastard Keynesianism. When people accuse Krugman and others of distorting the work of others, it is not because of some sort of sacredness of the original text, but because Bastard Keynesianism is racked with internal inconsistencies that its adherents cannot recognise because, blinded as they are by their neoclassical prejudices, they never get beyond a shallow reading of actual Keynesian economics. What is more, these inconsistencies are not simply some obscure doctrinal or theoretical nuance that only matters to hard-core theorists; rather, they generate concrete policy responses that may well cause a great deal of trouble and, quite possibly, discredit Keynesian economics itself should they be implemented and fail spectacularly.

Philip Pilkington

Predicting crises presupposes a theory where they are possible

3 June 2013 at 21:37 | Published in Economics | 4 comments

Really, Noah Smith? The best you can do is go after Steve Keen for failing to successfully predict when, where, and how the crash of 2007-08 would break out?

Now, maybe Keen deserves a bit of stick for proclaiming more “loudly and confidently than just about anyone else on the planet” that he predicted the global financial crisis. Perhaps that’s a bit brash.

But mainstream economists are the ones who dominate economic discourse. And they’re the ones who claim the scientificity of their approach to economic analysis is based not on the realism of their assumptions but on the predictive power of their models. And, finally, they’re the ones who, with few exceptions … failed to predict the more recent crisis.

At least Keen and other heterodox economists use theories that contain the possibility of crises occurring based on the endogenous tendencies of capitalist development … Mainstream economists don’t even admit of that possibility, although Smith has shown that at least a few of them have been able to successfully recalibrate one of their models (by adding financial frictions) and then to have successfully predicted the crisis—AFTER THE FACT.

Well, that simply doesn’t cut it. Either admit that mainstream economics is a failure because it didn’t successfully predict the crisis or give up on the idea that predictive power is one of the key criteria of economics, which has served as an excuse for attempting to demonstrate that what mainstream economists are doing is science and what the rest of us are doing is non-science. You just can’t have it both ways.

David F. Ruccio

Minsky – still – shows us the way out of the crises

2 June 2013 at 21:31 | Published in Economics | Comments off on Minsky – still – shows us the way out of the crises

 
Although Hyman P. Minsky (1919–1996) is best known for his ideas about financial instability, he was equally concerned with the question of how to create a stable economy that puts an end to poverty for all who are willing and able to work. This collection of Minsky’s writing spans almost three decades of his published and previously unpublished work on the necessity of combating poverty through full employment policies—through job creation, not welfare.

Bashing crises predictions

2 June 2013 at 18:59 | Published in Economics | 9 comments

Noah Smith has a post up on his blog questioning whether people like Dean Baker, Dirk Bezemer, Nouriel Roubini, Barkley Rosser and in particular Steve Keen really – in any essential meaning of the word – ”predicted” the latest financial-economic crisis, the one we are still living through (that mainstream economists didn’t, we know). It makes me think of (wonder why …) what James K. Galbraith wrote a couple of years ago in The NEA Higher Education Journal:

Leading active members of today’s economics profession… have formed themselves into a kind of Politburo for correct economic thinking. As a general rule—as one might generally expect from a gentleman’s club—this has placed them on the wrong side of every important policy issue, and not just recently but for decades. They predict disaster where none occurs. They deny the possibility of events that then happen. … They oppose the most basic, decent and sensible reforms, while offering placebos instead. They are always surprised when something untoward (like a recession) actually occurs. And when finally they sense that some position cannot be sustained, they do not reexamine their ideas. They do not consider the possibility of a flaw in logic or theory. Rather, they simply change the subject. No one loses face, in this club, for having been wrong. No one is disinvited from presenting papers at later annual meetings. And still less is anyone from the outside invited in.

This remains the essential problem. As I have documented—and only in part— there is a considerable, rich, promising body of economics, theory and evidence, entirely suited to the study of the real economy and its enormous problems. This work is significant in ways in which the entire corpus of mainstream economics—including recent fashions like the new “behavioral economics”— simply is not. But where is it inside the economics profession? Essentially, nowhere.
It is therefore pointless to continue with conversations centered on the conventional economics. The urgent need is instead to expand the academic space and the public visibility of ongoing work that is of actual value when faced with the many deep problems of economic life in our time. It is to make possible careers in those areas, and for people with those perspectives, that have been proven worthy by events. This is—obviously—not a matter to be entrusted to the economics departments themselves. It is an imperative, instead, for university administrators, for funding agencies, for foundations, and for students and perhaps their parents. The point is not to argue endlessly with Tweedledum and Tweedledee. The point is to move past them toward the garden that must be out there, that in fact is out there, somewhere.

JB Education and the free-school betrayal

2 June 2013 at 00:17 | Published in Education & School | 5 comments

In Sweden in 2013 we let free-school corporations with substandard operations extract sky-high profits – profits that the Swedish state gladly lets these corporations take out of our tax-financed school vouchers. JB Education is just one in a long line of clever welfare plunderers. These free-school corporations are, on the whole, more profitable than the business sector as a whole, but once the plundering is done they hand the problems and the pupils over to the much-maligned public sector.

Unfortunately, this scandalous mismanagement of our tax money is by no means anything new, and unsurprisingly recent years have also seen a steady stream of demands for more control, tougher scrutiny and inspections.

But wait a minute! When the system shift in schooling and welfare was launched in the 1990s, wasn’t one of the arguments frequently advanced for the privatisations precisely that we would be spared the costs of bureaucratic logic in the form of regulations, controls and follow-ups? Competition – that panacea of market fundamentalism – was supposed to make operations more efficient and raise the quality of the services. Market logic would force out the ”bureaucratic” and unwieldy public providers, leaving only the good companies that ”freedom of choice” had made possible.

And now that the Panglossian privatisation wet dream turns out to be a nightmare, the very things we wanted to get rid of – regulations and ”bureaucratic” supervision and control – are supposed to be the solution?

It beggars belief!

For if the packages of measures now being proposed are to be implemented, one has to wonder what becomes of that efficiency gain. Controls, contract specifications, inspections and so on cost money, so how much of a surplus do the privatisations really yield once these costs are also entered into the cost-benefit analysis? And how much is that ”freedom of choice” worth when we see, time and again, that it merely results in operations where profits are generated through cost-cutting and lower quality?

There is an obvious danger in basing remuneration systems on simple objective measures when what we actually want to reward has several complex dimensions – for example, payment per discharged patient, teacher pay linked to grades, and the like. Education and care services often have this multi-task character, and then incentive contracts and commissions frequently do not work. In such cases ”bureaucracies” can be more fit for purpose than markets.

Efficient use of resources can never be a goal in itself. It can, however, be a necessary means for reaching the goals we have set. Our common welfare is therefore fundamentally not just a question of economic efficiency, but also of our conceptions of a dignified life, justice and equal treatment.

So the fundamental question is not whether tax-financed free schools and care companies should be allowed to extract profits, or whether tougher measures in the form of control and inspection are needed. The fundamental question is whether the logic of the market and privatisation should be allowed to govern our welfare institutions – should tax-financed schooling and welfare be governed by democracy and politics, or by the market?

We now have good reason to believe that commercially run free schools fuel various forms of ethnic and social segregation, strikingly often have low teacher density and poor school results, and fundamentally fail pupils with weak resources. That these operations should be rewarded by being allowed to extract profits from our tax money is deeply offensive.

In a society characterised by equality, solidarity and democracy, it ought to be self-evident that tax-financed schools should not be run for profit.

The recently presented proposals from the Friskolekommittén (the Free School Committee) are therefore not enough. As Jonas Vlachos so rightly writes:

By and large, then, the Friskolekommittén’s proposals mean that things can carry on as before. Unfortunately, in my view, the committee chose to disregard the possibility of regulating away the incentives for dubious behaviour that unlimited profit extraction brings with it. Instead, the detailed regulation of school activities that I have discussed here is likely to continue.

A quick refresher on the Central Limit Theorem (student stuff)

1 June 2013 at 09:04 | Published in Statistics & Econometrics | 1 comment

 
