It’s high time to bury Milton Friedman’s natural rate hypothesis
31 Jan, 2018 at 17:23 | Posted in Economics | 1 Comment
Fifty years ago Milton Friedman wrote an (in)famous article arguing that (1) the natural rate of unemployment was independent of monetary policy and that (2) trying to keep the unemployment rate below the natural rate would only give rise to higher and higher inflation.
The hypothesis has always been controversial, and much theoretical and empirical work has questioned the real-world relevance of the ideas that unemployment really is independent of monetary policy and that there is no long-run trade-off between inflation and unemployment.
In the latest issue of Journal of Economic Perspectives there are three articles — by Greg Mankiw/Ricardo Reis, Robert Hall/Tom Sargent, and Olivier Blanchard — on Friedman’s natural rate hypothesis.
The first two articles are of the nowadays common Chicago-New Keynesian mumbo jumbo ilk and will not be further commented on here.
Although Blanchard has his doubts — after having played around with a ‘toy model’ and looked at the data — he lands on the following advice:
Where does this leave us? It would be good to have a sense of … the specific channels at work. The empirical part of this paper has shown that we are still far from it. Thus, the general advice must be that central banks should keep the natural rate hypothesis (extended to mean positive but low values of b and a) as their baseline, but keep an open mind and put some weight on the alternatives. For example, given the evidence on labor force participation and on the stickiness of inflation expectations presented earlier, I believe that there is a strong case, although not an overwhelming case, to allow U.S. output to exceed potential for some time, so as to reintegrate some of the workers who left the labor force during the last ten years.
My own view on the subject is that the natural rate hypothesis does not hold water simply because the relations it describes have never actually existed.
The only thing that amazes yours truly is that although this is pretty ‘common knowledge,’ so-called ‘New Keynesian’ macroeconomists still today use it — and its cousin the Phillips curve — as a fundamental building block in their models. Why? Because without it ‘New Keynesians’ have to give up their (again and again empirically falsified) neoclassical view of the long-run neutrality of money and the simplistic idea of inflation as an excess-demand phenomenon.
The natural rate hypothesis approach (NRH) is not only of theoretical interest. Far from it.
The real damage done is that policymakers who make decisions based on NRH models systematically implement austerity measures and kill off economic expansion. The unnecessary and costly unemployment that this self-inflicted and flawed illusion brings about is something its New Classical and ‘New Keynesian’ advocates should always be held accountable for.
If the [NRH] and rational expectations are both true simultaneously, a plot of decade averages of inflation against unemployment should reveal a vertical line at the natural rate of unemployment … This prediction fails dramatically.
There is no tendency for the points to lie around a vertical line and, if anything, the long-run Phillips curve is upward sloping, and closer to being horizontal than vertical. Since it is unlikely that expectations are systematically biased over decades, I conclude that the [NRH] is false …
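The decade-average test quoted above is easy to sketch. The annual series below are illustrative stand-ins, not actual US data, so only the mechanics of the test are shown: average inflation and unemployment within each decade, then check whether the decade points line up vertically (as the NRH predicts) or trace out a sloped line.

```python
# Sketch of the decade-average test of the NRH described in the quote.
# The annual series below are hypothetical, not actual US data.

def decade_averages(years, values):
    """Average a yearly series within each calendar decade."""
    buckets = {}
    for y, v in zip(years, values):
        buckets.setdefault(y // 10 * 10, []).append(v)
    return {d: sum(vs) / len(vs) for d, vs in sorted(buckets.items())}

def ols_slope(xs, ys):
    """OLS slope of y on x across the decade points."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

years = list(range(1950, 2010))
# Hypothetical series: unemployment drifting upward, inflation loosely following.
unemployment = [4 + 0.05 * (y - 1950) + (y % 7) * 0.3 for y in years]
inflation = [2 + 0.04 * (y - 1950) + (y % 5) * 0.2 for y in years]

u_dec = decade_averages(years, unemployment)
pi_dec = decade_averages(years, inflation)

# Under the NRH the decade points should cluster on a vertical line
# (unemployment means roughly constant across decades). With drifting
# series like these, the fitted line is instead clearly sloped.
slope = ols_slope(list(u_dec.values()), list(pi_dec.values()))
print(len(u_dec), round(slope, 2))
```

Run on actual decade averages of CPI inflation and the unemployment rate, the same two functions reproduce the pattern the quote describes: no vertical line at a natural rate.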
It is definitely time to bury the natural rate hypothesis!
Trump’s bogus claims on the economy
31 Jan, 2018 at 10:07 | Posted in Politics & Society | 1 Comment
The state and the politicians have failed all the country’s schoolchildren
30 Jan, 2018 at 19:53 | Posted in Education & School | 1 Comment
It has long been known and pointed out that many of the degree programmes offered at the country’s colleges and universities today have meagre fare to live on. The result follows accordingly: few teacher-led lectures in record-large student groups.
On top of this comes the explosion of new student groups going on to university studies. In one sense this is clearly gratifying. Today we have as many doctoral students in our education system as we had upper-secondary pupils in the 1950s! But this educational expansion has unfortunately largely come at the price of reduced opportunities for students to live up to the competence requirements of higher education. Many programmes have yielded and lowered their standards.
Unfortunately, the students we receive at universities and colleges are on the whole ever less well equipped for their studies. The restructuring of the school system in the form of decentralisation, deregulation and management by objectives has, contrary to political promises, not delivered. The imposed professionalisation of the teaching profession has rather resulted in deprofessionalisation, as resources have shrunk while tasks and responsibilities have grown.
In step with the expansion of post-secondary education, a corresponding contraction of knowledge among large groups of students has taken place. The school policy that has led to this situation hits hardest those it claims to protect: those with little or no ‘cultural capital’ brought from home.
Perhaps these trends and problems are especially evident in the part of our university and college system devoted to teacher training.
Teacher-training students are today recruited to an ever greater extent from homes unaccustomed to higher study. Their grades and scores on the national university aptitude test have also been falling for a long time. While the recruitment of teacher-training students with strong academic records has thus become harder, ever higher demands have been placed on the academic level of teacher education. How, with tighter resources, we are to resolve the dilemma of higher demands on increasingly low-performing students is difficult to see.
According to the Swedish Higher Education Authority’s own statistics, teacher-training programmes consistently attract few applicants, on average barely more than one per place, which is naturally reflected in the admission requirements. This also means that students with higher grades lack challenges and turn to other programmes.
Teachers’ relative wages have been falling for a long time. Fifty years ago a primary-school teacher earned on average almost as much as an engineer. Today an average compulsory-school teacher’s salary is 65 per cent of a graduate engineer’s. Fifty years ago a grammar-school teacher earned on average 35 per cent more than an engineer. Today an average upper-secondary teacher’s salary is 75 per cent of a graduate engineer’s.
The general level of teachers’ pay must rise. But this is possible only if the municipalities’ bookkeeper attitude to the school becomes a thing of the past and the state is also prepared to invest in what in the long run yields higher growth and welfare in a knowledge society: knowledge! No one can read modern research on education policy without realising how witless the school policy of recent decades has been on these fundamentals. The school’s problems cannot, at bottom, be solved without raising teachers’ relative wages and giving them decent working conditions. That cannot be achieved under municipal governance. History is a deterrent.
It is really remarkable that the decline in teachers’ pay has been allowed to continue unchecked for so long. Few measures are likely to have a greater long-term return than investing in attracting capable teachers who can pass knowledge on to coming generations.
Here, of course, we also have one of the main reasons for the problems Swedish schools are wrestling with today. Why would high-performing students, other than exceptionally, choose a programme leading into a profession characterised today by low pay and an absence of all prestige and status? Before the municipalisation of the schools one might perhaps have argued that teachers’ working conditions partly compensated for these shortcomings. But now that four out of ten working teachers, because of low pay and poor working conditions, are considering leaving for jobs outside the school system, no such countervailing compensations remain.
In this situation strong medicine is needed. And then, unfortunately, measures such as the introduction of teacher certification and more upper-secondary lecturers, laudable in themselves, are not enough. The reason is simply that these measures do not touch the fundamental problems I have raised here.
Let us speak, and speak plainly! What we can now see of the consequences of municipalisation and the free schools ought to lead to demands that the state take greater responsibility for Swedish schools. If renationalisation is hard to swallow, we should at least be able to return to the system of earmarked state funding for schools that existed until 1993. Give teachers better pay and working conditions, and better students will also apply to the teacher-training programmes. Only then can we get a school that is top of the class.
Kjell-Olof Feldt’s and the finance ministry’s economistic perspective, together with Ingvar Carlsson’s and Göran Persson’s vision of a municipalised school, helped push through a decision in the 1980s that several Social Democrats today admit has caused a long Swedish labour-movement tradition of building an equitable school to capsize completely.
If I remember correctly, a certain finance minister sat in parliament thirty-five years ago rhyming on the theme of ‘a damned piece of shit’ that had been ‘heaved all the way here’. The same can be said of the municipalisation of Swedish schools. It is time to call a halt. The municipalised school has been heaved far enough!
The problem with charter schools? They don’t work!
30 Jan, 2018 at 16:58 | Posted in Education & School | Comments Off on The problem with charter schools? They don’t work!
Trump’s Nixon Moment getting closer
30 Jan, 2018 at 13:33 | Posted in Politics & Society | Comments Off on Trump’s Nixon Moment getting closer
What is (wrong with) neoclassical economics?
30 Jan, 2018 at 10:30 | Posted in Economics | 1 Comment
If you want to know what “neoclassical economics” is and turn to Wikipedia, you are told that
neoclassical economics is a term variously used for approaches to economics focusing on the determination of prices, outputs, and income distributions in markets through supply and demand, often mediated through a hypothesized maximization of utility by income-constrained individuals and of profits by cost-constrained firms employing available information and factors of production, in accordance with rational choice theory.
The basic problem with this definition of neoclassical economics — arguing that the differentia specifica of neoclassical economics is its use of demand and supply, utility maximization and rational choice — is that it doesn’t get things quite right. As we all know, there is an endless list of mainstream models that more or less distance themselves from one or the other of these characteristics. So the heart of neoclassical economic theory lies elsewhere.
The essence of neoclassical economic theory is its almost exclusive use of a deductivist methodology, a methodology that is applied with hardly any argument to justify its relevance.
The theories and models that neoclassical economists construct describe imaginary worlds, using a combination of formal sign systems such as mathematics and ordinary language. The descriptions made are extremely thin and to a large degree disconnected from the specific contexts of the target system that one (usually) wants to (partially) represent. This is not by chance. These closed formalistic-mathematical theories and models are constructed to deliver purportedly rigorous deductions that may somehow be exportable to the target system. By analyzing a few causal factors in their “laboratories,” neoclassical economists hope to perform “thought experiments” and observe how these factors operate on their own, without impediments or confounders.
Unfortunately, this is not so. The reason is that economic causes never act in a socio-economic vacuum. Causes have to be set in a contextual structure in order to operate, and this structure has to take some form or other. But instead of incorporating structures true to the target system, the settings in economic models are based on formalistic mathematical tractability. In the models they appear as unrealistic assumptions, usually playing a decisive role in getting the deductive machinery to deliver “precise” and “rigorous” results.
This, of course, makes exporting to real-world target systems problematic, since these models, as part of a deductivist covering-law tradition in economics, are thought to deliver general and far-reaching conclusions that are externally valid. But how can we be sure the lessons learned in these theories and models have external validity, when they are based on highly specific unrealistic assumptions? As a rule, the more specific and concrete the structures, the less generalisable the results. Admitting that we can in principle move from (partial) falsehoods in theories and models to truth in real-world target systems does not take us very far, unless a thorough explication of the relation between theory, model and real-world target system is given. If models assume representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, the warrant for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged over is obviously unjustifiable. A deductive warrant for things happening in a closed model is no guarantee that they are preserved when applied to an open real-world target system.
Henry Louis Mencken once wrote that “there is always an easy solution to every human problem – neat, plausible and wrong.” And neoclassical economics has indeed been wrong. Its main result, so far, has been to demonstrate the futility of trying to build a satisfactory bridge between formalistic-axiomatic deductivist models and real-world target systems. Assuming, for example, perfect knowledge, instant market clearing, and unrealistically heroic representative-actor approximations of aggregate behaviour just will not do. The assumptions made surreptitiously eliminate the very phenomena we want to study: uncertainty, disequilibrium, structural instability, and problems of aggregation and coordination between different individuals and groups.
The punch line is that most of the problems that neoclassical economics wrestles with issue from its attempt at formalistic modeling per se of social phenomena. Reducing microeconomics to refinements of hyper-rational Bayesian deductivist models is not a viable way forward. It will only sentence the most interesting real-world economic problems to irrelevance. And as someone has wisely remarked, murder is unfortunately the only way to reduce biology to chemistry; reducing macroeconomics to Walrasian general equilibrium microeconomics means committing the same crime.
If scientific progress in economics, as Robert Lucas and other latter-day neoclassical economists seem to think, lies in our ability to tell “better and better stories” without considering the realm of imagination and ideas a retreat from real-world target systems, one would of course expect our economics journals to be filled with articles supporting the stories with empirical evidence. However, the journals still show a striking and embarrassing paucity of empirical studies that (try to) substantiate these theoretical claims. Equally amazing is how little is said about the relationship between the model and real-world target systems. It is as though explicit discussion, argumentation and justification on the subject were not thought to be required. Neoclassical economic theory is obviously navigating in dire straits.
If the ultimate criterion of success of a deductivist system is the extent to which it predicts and coheres with (parts of) reality, modern neoclassical economics seems to be a hopeless misallocation of scientific resources. To focus scientific endeavours on proving things in models is a gross misapprehension of what an economic theory ought to be about. Deductivist models and methods disconnected from reality are not relevant for predicting, explaining or understanding real-world economic target systems. These systems do not conform to the restricted closed-system structure the neoclassical modeling strategy presupposes.
Neoclassical economic theory still today consists mainly in investigating economic models. It long ago gave up on the real world and contents itself with proving things about made-up worlds. Empirical evidence plays only a minor role in neoclassical economic theory, where models largely function as substitutes for empirical evidence.
What is wrong with neoclassical economics is not that it employs models per se, but that it employs poor models. They are poor because they do not bridge to the real world target system in which we live. Hopefully humbled by the manifest failure of its theoretical pretences, the one-sided, almost religious, insistence on mathematical deductivist modeling as the only scientific activity worthy of pursuing in economics will give way to methodological pluralism based on ontological considerations rather than formalistic tractability.
Minimum wage à la Germany
30 Jan, 2018 at 09:16 | Posted in Economics | Comments Off on Minimum wage à la Germany
Despite the marked wage increases that continued in the low-wage sector in 2016, there is still a problem with widespread circumvention of the statutory minimum wage. If the hourly wage is calculated on the basis of contractual working hours and paid overtime in the previous month, 9.8% of employees do not receive the minimum wage even though they are entitled to it. In absolute numbers this corresponds to 2.7 million dependent employees …
The minimum wage is violated especially often in the hotel and restaurant trade, where about 38% of (entitled) employees do not receive it, followed by retail with about 20% minimum-wage circumventions (Fig. 2). The minimum wage is also denied to about 43% of those employed in private households (e.g. mini-jobs for housekeeping or babysitting), where compliance is probably among the hardest to monitor. Minimum-wage circumventions are, however, not confined to the service sector. In the food industry, for example, the rate is above average at about 17%. Even in the metalworking industry, largely covered by collective agreements, about 7% of employees do not receive the minimum wage to which they are legally entitled.
Truth and knowledge
28 Jan, 2018 at 11:50 | Posted in Varia | 1 Comment
The relation of knowledge to power is one not only of servility but of truth. Much knowledge, if out of proportion to the disposition of forces, is invalid, however formally correct it may be. If an émigré doctor says: ‘For me … is a pathological case’, his pronouncement may ultimately be confirmed by clinical findings, but its incongruity with the objective calamity visited on the world in the name of that paranoiac renders the diagnosis ridiculous, mere professional preening. Perhaps … is ‘in-himself’ a pathological case, but certainly not ‘for-him’. The vanity and poverty of many of the declarations against … by émigrés is connected with this. People thinking in the forms of free, detached, disinterested appraisal were unable to accommodate within those forms the experience of violence which in reality annuls such thinking. The almost insoluble task is to let neither the power of others, nor our own powerlessness, stupefy us.
Why are women paid less than men?
28 Jan, 2018 at 00:22 | Posted in Economics | Comments Off on Why are women paid less than men?
On probabilism and statistics
27 Jan, 2018 at 16:51 | Posted in Statistics & Econometrics | 1 Comment
‘Mr Brown has exactly two children. At least one of them is a boy. What is the probability that the other is a girl?’ What could be simpler than that? After all, the other child either is or is not a girl. I regularly use this example on the statistics courses I give to life scientists working in the pharmaceutical industry. They all agree that the probability is one-half.
So they are all wrong. I haven’t said that the older child is a boy. The child I mentioned, the boy, could be the older or the younger child. This means that Mr Brown can have one of three possible combinations of two children: both boys, elder boy and younger girl, elder girl and younger boy, the fourth combination of two girls being excluded by what I have stated. But of the three combinations, in two cases the other child is a girl so that the requisite probability is 2/3 …
This example is typical of many simple paradoxes in probability: the answer is easy to explain but nobody believes the explanation. However, the solution I have given is correct.
Or is it? That was spoken like a probabilist. A probabilist is a sort of mathematician. He or she deals with artificial examples and logical connections but feels no obligation to say anything about the real world. My demonstration, however, relied on the assumption that the three combinations boy–boy, boy–girl and girl–boy are equally likely, and this may not be true. The difference between a statistician and a probabilist is that the latter will define the problem so that this is true, whereas the former will consider whether it is true and obtain data to test its truth.
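The probabilist’s counting argument above can be checked by brute enumeration, under exactly the assumption being questioned: that the four birth orders are equally likely.

```python
from fractions import Fraction
from itertools import product

# Enumerate the four (assumed equally likely) two-child families,
# listed as (elder, younger): BB, BG, GB, GG.
families = list(product("BG", repeat=2))

# Condition on "at least one boy", then count how often the other child is a girl.
at_least_one_boy = [f for f in families if "B" in f]
other_is_girl = [f for f in at_least_one_boy if "G" in f]

p = Fraction(len(other_is_girl), len(at_least_one_boy))
print(p)  # 2/3
```

The statistician’s move is to replace the uniform `families` list with empirically observed frequencies of the four combinations, at which point the answer need no longer be 2/3.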
Statistical reasoning certainly seems paradoxical to most people.
Take for example the well-known Simpson’s paradox.
From a theoretical perspective, Simpson’s paradox importantly shows that causality can never be reduced to a question of statistics or probabilities, unless you are — miraculously — able to keep constant all other factors that influence the probability of the outcome studied.
To understand causality we always have to relate it to a specific causal structure. Statistical correlations are never enough. No structure, no causality.
Simpson’s paradox is an interesting paradox in itself, but it can also highlight a deficiency in the traditional econometric approach towards causality. Say you have 1000 observations on men and an equal amount of observations on women applying for admission to university studies, and that 70% of men are admitted, but only 30% of women. Running a logistic regression to find out the odds ratios (and probabilities) for men and women on admission, females seem to be in a less favourable position (‘discriminated’ against) compared to males (male odds are 2.33, female odds are 0.43, giving an odds ratio of 5.44). But once we find out that males and females apply to different departments we may well get a Simpson’s paradox result where males turn out to be ‘discriminated’ against (say 800 male apply for economics studies (680 admitted) and 200 for physics studies (20 admitted), and 100 female apply for economics studies (90 admitted) and 900 for physics studies (210 admitted) — giving odds ratios of 0.62 and 0.37).
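The aggregate and department-level odds ratios in the example above can be verified directly from the admission counts given in the text. (Computed exactly, the economics ratio comes out at about 0.63 rather than the text’s rounded 0.62.)

```python
def odds_ratio(adm_m, n_m, adm_f, n_f):
    """Male-to-female odds ratio of admission, from admitted counts and totals."""
    odds_m = adm_m / (n_m - adm_m)
    odds_f = adm_f / (n_f - adm_f)
    return odds_m / odds_f

# Aggregate: 700 of 1000 men admitted, 300 of 1000 women.
aggregate = odds_ratio(700, 1000, 300, 1000)   # ~5.44: women look disadvantaged

# By department: men 680/800 in economics, 20/200 in physics;
# women 90/100 in economics, 210/900 in physics.
economics = odds_ratio(680, 800, 90, 100)      # ~0.63: men look disadvantaged
physics = odds_ratio(20, 200, 210, 900)        # ~0.37: men look disadvantaged

print(round(aggregate, 2), round(economics, 2), round(physics, 2))
```

The reversal appears because women disproportionately apply to the department (physics) with the lower overall admission rate; conditioning on department flips the sign of the apparent effect.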
Econometric patterns should never be seen as anything else than possible clues to follow. From a critical realist perspective it is obvious that behind observable data there are real structures and mechanisms operating, things that are — if we really want to understand, explain and (possibly) predict things in the real world — more important to get hold of than to simply correlate and regress observable variables.
Math cannot establish the truth value of a fact. Never has. Never will.
A night at the Roxbury
26 Jan, 2018 at 16:48 | Posted in Varia | Comments Off on A night at the Roxbury
Jordan Peterson lecturing an incompetent journalist on gender inequality
26 Jan, 2018 at 06:56 | Posted in Politics & Society | Comments Off on Jordan Peterson lecturing an incompetent journalist on gender inequalityWhen discussing things it is definitely good to know what you are talking about. A good rule that certainly also applies to journalists …
Paul Romer leaves the World Bank
25 Jan, 2018 at 11:00 | Posted in Economics | 1 Comment
Outspoken chief economist Paul Romer is leaving the World Bank “effective immediately” after just 15 months on the job, the bank’s president told staff on Wednesday in an internal announcement seen by the Financial Times.
Mr Romer, one of the US’s most celebrated economists, had been engaged in a running battle with staff economists at the bank almost since his high-profile arrival in October 2016. Areas of dispute have included everything from Mr Romer’s diktats on grammar and brevity in reports to serious questions about methodology …
In a message to staff sent on Wednesday, Jim Yong Kim, the World Bank’s president, said Mr Romer had told him he had decided to step down and return to his position as a professor at New York University.
Romer is one of the pioneers of endogenous growth theory and has for years now been mentioned in discussions of possible candidates for the ‘Nobel Prize’ in economics.
After having delivered the following attack on the mainstream economics establishment, however, his chances of receiving that prize haven’t exactly increased …
A key part of the solution to the identification problem that Lucas and Sargent (1979) seemed to offer was that mathematical deduction could pin down some parameters in a simultaneous system. But solving the identification problem means feeding facts with truth values that can be assessed, yet math cannot establish the truth value of a fact. Never has. Never will.
In practice, what math does is let macroeconomists locate the FWUTVs [facts with unknown truth values] farther away from the discussion of identification … Relying on a micro-foundation lets an author say, “Assume A, assume B, … blah blah blah …. And so we have proven that P is true. Then the model is identified.” …
Distributional assumptions about error terms are a good place to bury things because hardly anyone pays attention to them. Moreover, if a critic does see that this is the identifying assumption, how can she win an argument about the true expected value of the level of aether? If the author can make up an imaginary variable, “because I say so” seems like a pretty convincing answer to any question about its properties.