Statistical inference

30 January, 2014 at 15:40 | Posted in Statistics & Econometrics | Comments Off on Statistical inference

Sampling distributions are the key to understanding inferential statistics. Once you’ve grasped how we use sampling distributions for making hypothesis testing possible, well, then you’ve understood the most important part of the logic of statistical inference — and the rest is really just a piece of cake!
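
To make that logic concrete, here is a minimal simulation sketch in Python (all numbers invented for illustration): we repeatedly draw samples from a hypothetical null population, collect the sample means into a sampling distribution, and then ask how surprising an observed mean would be if the null hypothesis were true.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical population under the null hypothesis: mean 100, sd 15.
null_mean, sd, n = 100.0, 15.0, 25

# Build the sampling distribution of the mean by repeated sampling.
sample_means = np.array([rng.normal(null_mean, sd, size=n).mean()
                         for _ in range(100_000)])

print("simulated standard error :", round(sample_means.std(), 2))
print("theoretical sd / sqrt(n) :", round(sd / np.sqrt(n), 2))

# Hypothesis-test logic: how unusual would an observed sample mean of 106 be
# if the null hypothesis were true?
observed_mean = 106.0
p_value = np.mean(sample_means >= observed_mean)
print("one-sided p-value (simulated):", round(p_value, 4))
```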


We Praise Thee

30 January, 2014 at 14:36 | Posted in Varia | Comments Off on We Praise Thee

 

Preferences that make economic models explode

30 January, 2014 at 08:45 | Posted in Economics | 4 Comments

Commenting on experiments performed by experimental economist David Eil — experiments showing framing effects that flip time preferences — Noah Smith writes:

Now, here’s the thing…it gets worse … I’ve heard whispers that a number of researchers have done experiments in which choices can be re-framed in order to obtain the dreaded negative time preferences, where people actually care more about the future than the present! Negative time preferences would cause most of our economic models to explode, and if these preferences can be created with simple re-framing, then it bodes ill for the entire project of trying to model individuals’ choices over time.

This matters a lot for finance research. One of the big questions facing finance researchers is why asset prices bounce around so much. The two most common answers are A) time-varying risk premia, and B) behavioral “sentiment”. But Eil’s result, and other results like it, could be bad news for both efficient-market theory and behavioral finance. Because if aggregate preferences themselves are unstable due to a host of different framing effects, then time-varying risk premia can’t be modeled in any intelligible way, nor can behavioral sentiment be measured. In other words, the behavior of asset prices may truly be inexplicable (since we can’t observe all the multitude of things that might cause framing effects).

It’s a scary thought to contemplate, but to dismiss the results of experiments like Eil’s would be a mistake! It may turn out that the whole way modern economics models human behavior is good only in some situations, and not in others.

Bad news indeed. But hardly new.

In neoclassical theory preferences are standardly expressed in the form of a utility function. But although the expected utility theory has been known for a long time to be both theoretically and descriptively inadequate, neoclassical economists all over the world gladly continue to use it, as though its deficiencies were unknown or unheard of.

What most of them try to do in the face of the obvious theoretical and behavioural inadequacies of expected utility theory is to marginally mend it. But that cannot be the right attitude when facing scientific anomalies. When models are plainly wrong, you’d better replace them! As Matthew Rabin and Richard Thaler have it in Risk Aversion:

It is time for economists to recognize that expected utility is an ex-hypothesis, so that we can concentrate our energies on the important task of developing better descriptive models of choice under uncertainty.

In his modern classic Risk Aversion and Expected-Utility Theory: A Calibration Theorem, Matthew Rabin writes:

Using expected-utility theory, economists model risk aversion as arising solely because the utility function over wealth is concave. This diminishing-marginal-utility-of-wealth theory of risk aversion is psychologically intuitive, and surely helps explain some of our aversion to large-scale risk: We dislike vast uncertainty in lifetime wealth because a dollar that helps us avoid poverty is more valuable than a dollar that helps us become very rich.

Yet this theory also implies that people are approximately risk neutral when stakes are small … While most economists understand this formal limit result, fewer appreciate that the approximate risk-neutrality prediction holds not just for negligible stakes, but for quite sizable and economically important stakes. Economists often invoke expected-utility theory to explain substantial (observed or posited) risk aversion over stakes where the theory actually predicts virtual risk neutrality. While not broadly appreciated, the inability of expected-utility theory to provide a plausible account of risk aversion over modest stakes has become oral tradition among some subsets of researchers, and has been illustrated in writing in a variety of different contexts using standard utility functions …

Expected-utility theory is manifestly not close to the right explanation of risk attitudes over modest stakes. Moreover, when the specific structure of expected-utility theory is used to analyze situations involving modest stakes — such as in research that assumes that large-stake and modest-stake risk attitudes derive from the same utility-for-wealth function — it can be very misleading.
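
A rough numerical illustration of Rabin's point, using a standard CRRA utility function with a curvature parameter and wealth level chosen only for the sketch: expected utility values a small 50-50 gamble at almost exactly its expected value, i.e. it predicts near risk neutrality over modest stakes, while sizeable risk premia only appear once the stakes become a large share of wealth.

```python
def crra_utility(wealth, gamma=2.0):
    """CRRA utility over wealth (this sketch assumes gamma != 1)."""
    return wealth ** (1 - gamma) / (1 - gamma)

def gamble_value(wealth, stake, gamma=2.0):
    """Certainty equivalent (in money) of a 50-50 gamble of +/- stake."""
    expected_u = (0.5 * crra_utility(wealth + stake, gamma)
                  + 0.5 * crra_utility(wealth - stake, gamma))
    # Invert the utility function to find the certain wealth with the same utility.
    certain_wealth = ((1 - gamma) * expected_u) ** (1 / (1 - gamma))
    return certain_wealth - wealth

wealth = 50_000.0          # illustrative wealth level, not an estimate
for stake in (100, 1_000, 10_000):
    print(f"50-50 gamble of +/- {stake:>6}: worth {gamble_value(wealth, stake):9.2f} "
          "(risk-neutral value 0.00)")
```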

In a similar vein, Daniel Kahneman writes — in Thinking, Fast and Slow — that expected utility theory is seriously flawed since it doesn’t take into consideration the basic fact that people’s choices are influenced by changes in their wealth. Where standard microeconomic theory assumes that preferences are stable over time, Kahneman and other behavioural economists have shown again and again that preferences aren’t fixed, but vary with different reference points. How can a theory that doesn’t allow for people having different reference points from which they consider their options have an almost axiomatic status within economic theory?

The mystery is how a conception of the utility of outcomes that is vulnerable to such obvious counterexamples survived for so long. I can explain it only by a weakness of the scholarly mind … I call it theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking it is extraordinarily difficult to notice its flaws … You give the theory the benefit of the doubt, trusting the community of experts who have accepted it … But they did not pursue the idea to the point of saying, “This theory is seriously wrong because it ignores the fact that utility depends on the history of one’s wealth, not only present wealth.”
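
A minimal sketch of the reference-dependence point, using a Kahneman-and-Tversky-style value function with the usual illustrative parameters (not estimates): the very same final wealth is valued as a gain from one reference point and as a much more painful loss from another, which is precisely what a fixed utility-of-wealth function cannot accommodate.

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory-style value of a gain or loss x measured from a reference
    point (alpha and lam are standard illustrative parameters, not estimates)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

final_wealth = 105_000
for reference in (100_000, 110_000):
    change = final_wealth - reference
    # Same final wealth, opposite subjective valuations, depending on the reference point.
    print(f"reference point {reference}: change {change:+}, "
          f"subjective value {value(change):+.1f}")
```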

The works of people like Rabin, Thaler and Kahneman show that expected utility theory is indeed transmogrifying truth. It’s an “ex-hypothesis” — or as Monty Python has it:

This parrot is no more! He has ceased to be! ‘E’s expired and gone to meet ‘is maker! ‘E’s a stiff! Bereft of life, ‘e rests in peace! If you hadn’t nailed ‘im to the perch ‘e’d be pushing up the daisies! ‘Is metabolic processes are now ‘istory! ‘E’s off the twig! ‘E’s kicked the bucket, ‘e’s shuffled off ‘is mortal coil, run down the curtain and joined the bleedin’ choir invisible!! THIS IS AN EX-PARROT!!

Encounters with R. A. Fisher

29 January, 2014 at 19:46 | Posted in Statistics & Econometrics | Comments Off on Encounters with R. A. Fisher

 

Is calibration really a scientific advance? I’ll be dipped!

29 January, 2014 at 12:47 | Posted in Economics | Comments Off on Is calibration really a scientific advance? I’ll be dipped!

Noah Smith had a post up yesterday lamenting Nobel laureate Ed Prescott:

The 2004 prize went partly to Ed Prescott, the inventor of Real Business Cycle theory. That theory assumes that monetary policy doesn’t have an effect on GDP. Since RBC theory came out in 1982, a number of different people have added “frictions” to the model to make it so that monetary policy does have real effects. But Prescott has stayed true to the absolutist view that no such effects exist. In an email to a New York Times reporter, he very recently wrote the following:

“It is an established scientific fact that monetary policy has had virtually no effect on output and employment in the U.S. since the formation of the Fed,” Professor Prescott, also on the faculty of Arizona State University, wrote in an email. Bond buying [by the Fed], he wrote, “is as effective in bringing prosperity as rain dancing is in bringing rain.”

Wow! Prescott definitely falls into the category of people whom Miles Kimball and I referred to as “purist” Freshwater macroeconomists. Prescott has made some…odd…claims in recent years, but these recent remarks were totally consistent with his prize-winning research.

Odd claims indeed. True. There are many kinds of useless economics held in high regard within the mainstream economics establishment today. Few – if any – deserve that regard less than the macroeconomic theory/method — mostly connected with Nobel laureates Finn Kydland, Robert Lucas, Edward Prescott and Thomas Sargent — called calibration.


Interviewed by Seppo Honkapohja and George Evans — in Macroeconomic Dynamics (2005, vol. 9) — Thomas Sargent answered the question whether calibration was an advance in macroeconomics:

In many ways, yes … The unstated case for calibration was that it was a way to continue the process of acquiring experience in matching rational expectations models to data by lowering our standards relative to maximum likelihood, and emphasizing those features of the data that our models could capture. Instead of trumpeting their failures in terms of dismal likelihood ratio statistics, celebrate the features that they could capture and focus attention on the next unexplained feature that ought to be explained. One can argue that this was a sensible response… a sequential plan of attack: let’s first devote resources to learning how to create a range of compelling equilibrium models to incorporate interesting mechanisms. We’ll be careful about the estimation in later years when we have mastered the modelling technology…
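
To see what "lowering our standards relative to maximum likelihood" can amount to in practice, here is a deliberately stylized sketch (an AR(1) used as a stand-in "model", fitted to artificial data): calibration picks the parameter that reproduces one chosen moment of the data and stops there, whereas a full statistical evaluation would also confront the model with the moments that were not targeted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up "economy": an AR(2) process the modeller does not know about.
T, phi1, phi2 = 5_000, 0.5, 0.35
y = np.zeros(T)
for t in range(2, T):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + rng.normal()

def acf(x, k):
    """Sample autocorrelation at lag k."""
    x = x - x.mean()
    return np.dot(x[:-k], x[k:]) / np.dot(x, x)

# "Model": an AR(1). Calibration in the sense Sargent describes: pick the single
# parameter rho so the model reproduces one chosen data moment (here the lag-1
# autocorrelation), and celebrate that match.
rho_calibrated = acf(y, 1)

# The calibrated model has implications for moments that were NOT targeted,
# e.g. it implies acf(2) = rho**2, and there the fit quietly falls apart.
print("targeted moment, acf(1)    : data", round(acf(y, 1), 3),
      " model", round(rho_calibrated, 3))
print("non-targeted moment, acf(2): data", round(acf(y, 2), 3),
      " model", round(rho_calibrated ** 2, 3))
```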

But is the Lucas-Kydland-Prescott-Sargent calibration really an advance?

Let’s see what two eminent econometricians have to say. In the Journal of Economic Perspectives (1996, vol. 10) Lars Peter Hansen and James J. Heckman write:

It is only under very special circumstances that a micro parameter such as the inter-temporal elasticity of substitution or even a marginal propensity to consume out of income can be ‘plugged into’ a representative consumer model to produce an empirically concordant aggregate model … What credibility should we attach to numbers produced from their ‘computational experiments’, and why should we use their ‘calibrated models’ as a basis for serious quantitative policy evaluation? … There is no filing cabinet full of robust micro estimates ready to use in calibrating dynamic stochastic equilibrium models … The justification for what is called ‘calibration’ is vague and confusing.

Error-probabilistic statistician Aris Spanos — in Error and Inference (Mayo & Spanos, 2010, p. 240) — is no less critical:

Given that “calibration” purposefully forsakes error probabilities and provides no way to assess the reliability of inference, how does one assess the adequacy of the calibrated model? …

The idea that it should suffice that a theory “is not obscenely at variance with the data” (Sargent, 1976, p. 233) is to disregard the work that statistical inference can perform in favor of some discretional subjective appraisal … it hardly recommends itself as an empirical methodology that lives up to the standards of scientific objectivity

And this is the verdict of Paul Krugman:

The point is that if you have a conceptual model of some aspect of the world, which you know is at best an approximation, it’s OK to see what that model would say if you tried to make it numerically realistic in some dimensions.

But doing this gives you very little help in deciding whether you are more or less on the right analytical track. I was going to say no help, but it is true that a calibration exercise is informative when it fails: if there’s no way to squeeze the relevant data into your model, or the calibrated model makes predictions that you know on other grounds are ludicrous, something was gained. But no way is calibration a substitute for actual econometrics that tests your view about how the world works.

In physics it may possibly not be straining credulity too much to model processes as ergodic – where time and history do not really matter – but in social and historical sciences it is obviously ridiculous. If societies and economies were ergodic worlds, why do econometricians fervently discuss things such as structural breaks and regime shifts? That they do is an indication of how unrealistic it is to treat open systems as analyzable with ergodic concepts.
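
A small simulation sketch of the point about regime shifts, with entirely artificial numbers: a relation estimated on the pre-break sample is exported to the post-break period as if the parameter were a stable, ahistorical constant, and the exercise fails.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

x = rng.normal(size=2 * n)
# A regime shift halfway through the sample: the "parameter" linking y to x changes.
beta = np.r_[np.full(n, 1.5), np.full(n, -0.5)]
y = beta * x + rng.normal(scale=0.5, size=2 * n)

# Estimate the relation on the first regime only...
b_pre = np.polyfit(x[:n], y[:n], 1)[0]
# ...and naively export it to the second regime, as if history did not matter.
pred_post = b_pre * x[n:]
rmse = np.sqrt(np.mean((y[n:] - pred_post) ** 2))

print("estimated slope, pre-break :", round(b_pre, 2))
print("true slope, post-break     :", -0.5)
print("out-of-sample RMSE         :", round(rmse, 2))
```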

The future is not reducible to a known set of prospects. It is not like sitting at the roulette table and calculating what the future outcomes of spinning the wheel will be. Reading Sargent and other calibrationists one comes to think of Robert Clower’s apt remark that

much economics is so far removed from anything that remotely resembles the real world that it’s often difficult for economists to take their own subject seriously.

Instead of assuming calibration and rational expectations to be right, one ought to confront the hypothesis with the available evidence. It is not enough to construct models. Anyone can construct models. To be seriously interesting, models have to come with an aim. They have to have an intended use. If the intention of calibration and rational expectations is to help us explain real economies, it has to be evaluated from that perspective. A model or hypothesis without a specific applicability is not really deserving of our interest.

To say, as Edward Prescott does, that

one can only test if some theory, whether it incorporates rational expectations or, for that matter, irrational expectations, is or is not consistent with observations

is not enough. Without strong evidence all kinds of absurd claims and nonsense may pretend to be science. We have to demand more of a justification than this rather watered-down version of “anything goes” when it comes to rationality postulates. If one proposes rational expectations one also has to support its underlying assumptions. No such support is given, which makes it rather puzzling how rational expectations has become the standard modeling assumption made in much of modern macroeconomics. Perhaps the reason is, as Paul Krugman has it, that economists often mistake

beauty, clad in impressive looking mathematics, for truth.

But I think Prescott’s view is also the reason why calibration economists are not particularly interested in empirical examinations of how real choices and decisions are made in real economies. In the hands of Lucas, Prescott and Sargent, rational expectations has been transformed from an – in principle – testable hypothesis to an irrefutable proposition. Irrefutable propositions may be comfortable – like religious convictions or ideological dogmas – but they are not science.

Or as Noah Smith puts it:

But OK, suppose for a moment – just imagine – that somewhere, on some other planet, there was a group of alien macroeconomists who made a bunch of theories that were completely wrong, and were not even close to anything that could actually describe the business cycles on that planet. And suppose that the hypothetical aliens kept comparing their nonsense theories to data, and they kept getting rejected by the data, but the aliens still found the nonsense theories very cool and very a priori convincing, and they kept at it, finding “puzzles”, estimating parameter values, making slightly different nonsense models, etc., in a neverending cycle of brilliant non-discovery.

Now tell me: In principle, how should those aliens tell the difference between their situation, and our own? That’s the question that I think we need to be asking, and that a number of people on the “periphery” of macro are now asking out loud.

The deregulated railway system has broken down — appoint a crisis commission!

28 January, 2014 at 12:41 | Posted in Politics & Society | 2 Comments

Yours truly has an article today — together with, among others, Jan Du Rietz, Sven Jernberg and Hans Albin Larsson — in Göteborgs-Posten on the sorry state of the Swedish railways:

The state has “invested” a couple of billion kronor in today’s money on closing down and tearing up lines, triangle tracks and passing loops, which has reduced the railway’s capacity and produced a rigid and disruption-sensitive traffic system. Together with often poorly maintained and unsuitable trains and insufficient redundancy in technical support systems, this has resulted in cancelled and delayed trains, which is costly for passengers, business and society. All these problems also mean that more and more customers are abandoning rail travel.

The creation of Trafikverket threatens to be the death blow for the railway. The agency has been given a task that is politically impossible for a public authority when it comes to choosing between road and rail. Nor is Trafikverket charged with restoring the railway system, and it also lacks the competence to do so.

A crisis commission with extraordinary powers, able to concentrate on the railway’s problems, is required. Among other things, the government should call in experts from well-functioning “railway countries” to carry out the necessary restructuring of the entire railway system.

Ode à la patrie

26 January, 2014 at 16:28 | Posted in Varia | 2 Comments

 

The failure of DSGE macroeconomics

24 January, 2014 at 10:56 | Posted in Economics | 5 Comments

As 2014 begins, it’s clear enough that any theory in which mass unemployment or (in the US case) withdrawal from the labour force can only occur in the short run is inconsistent with the evidence. Given that unions are weaker than they have been for a century or so, and that severe cuts to social welfare benefits have been imposed in most countries, the traditional rightwing explanation that labour market inflexibility [arising from minimum wage laws or unions] is the cause of unemployment appeals only to ideologues (who are, unfortunately, plentiful) …

After the Global Financial Crisis, it became clear that the concessions made by the New Keynesians were ill-advised in both theoretical and political terms. In theoretical terms, the DSGE models developed during the spurious “Great Moderation” were entirely inconsistent with the experience of the New Depression. The problem was not just a failure of prediction: the models simply did not allow for depressions that permanently shift the economy from its previous long term growth path. In political terms, it turned out that the seeming convergence with the New Classical school was an illusion. Faced with the need to respond to the New Depression, most of the New Classical school retreated to pre-Keynesian positions based on versions of Say’s Law (supply creates its own demand) that Say himself would have rejected, and advocated austerity policies in the face of overwhelming evidence that they were not working …

Relative to DSGE, the key point is that there is no unique long-run equilibrium growth path, determined by technology and preferences, to which the economy is bound to return. In particular, the loss of productive capacity, skills and so on in the current depression is, for all practical purposes, permanent. But if there is no exogenously determined (though maybe still stochastic) growth path for the economy, economic agents (workers and firms) can’t make the kind of long-term plans required of them in standard life-cycle models. They have to rely on heuristics and rules of thumb … This is, in my view, the most important point made by post-Keynesians and ignored by Old Old Keynesians.

John Quiggin/Crooked Timber

On DSGE and the art of using absolutely ridiculous modeling assumptions

23 January, 2014 at 23:08 | Posted in Economics, Theory of Science & Methodology | 4 Comments

Reading some of the comments — by Noah Smith, David Andolfatto and others — on my post Why Wall Street shorts economists and their DSGE models, I — as usual — get the feeling that mainstream economists, when facing anomalies, think that there is always some further “technical fix” that will get them out of the quagmire. But are these elaborations and amendments of something basically wrong really going to solve the problem? I doubt it. Acting like the baker’s apprentice who, having forgotten to add yeast to the dough, throws it into the oven afterwards, simply isn’t enough.

When the basic workhorse DSGE model is criticized for its inability to explain involuntary unemployment, some DSGE defenders maintain that later elaborations — e.g. newer search models — manage to do just that. I strongly disagree. One of the more conspicuous problems with those “solutions” is that they — as e.g. Pissarides’ “Loss of Skill during Unemployment and the Persistence of Unemployment Shocks” (QJE, 1992) — are as a rule constructed without seriously trying to warrant that the model-immanent assumptions and results are applicable in the real world. External validity is more or less a non-existent problematique, sacrificed on the altar of model derivations. This is not by chance. For how could one even imagine empirically testing Pissarides’ “model 1” assumptions (reality adequately represented by “two overlapping generations of fixed size”, “wages determined by Nash bargaining”, “actors maximizing expected utility”, “endogenous job openings” and “job matching describable by a probability distribution”) without coming to the conclusion that this is, in terms of realism and relevance, nothing but nonsense on stilts?
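
For readers who have not seen what assumptions of this kind look like when written down, here is a bare-bones sketch of two standard search-and-matching ingredients, a Cobb-Douglas matching function and a Nash-bargained wage, with placeholder parameter values of my own. Nothing in the code indicates how one would establish that real labour markets actually behave like this, which is exactly the external-validity problem raised above.

```python
# Two textbook search-and-matching ingredients (Pissarides-style), with
# placeholder parameters; nothing here is estimated from any real labour market.

def matches(u, v, A=0.6, alpha=0.5):
    """Cobb-Douglas matching function: hires produced from u unemployed and v vacancies."""
    return A * u ** alpha * v ** (1 - alpha)

def nash_wage(productivity, unemployment_benefit, tightness,
              bargaining_power=0.5, vacancy_cost=0.2):
    """Nash-bargaining wage equation from the basic matching model:
    w = beta * (p + c * theta) + (1 - beta) * z."""
    return (bargaining_power * (productivity + vacancy_cost * tightness)
            + (1 - bargaining_power) * unemployment_benefit)

u, v = 0.08, 0.05            # unemployment and vacancy rates (made up)
theta = v / u                # labour market tightness
print("job-finding rate :", round(matches(u, v) / u, 3))
print("bargained wage   :", round(nash_wage(1.0, 0.4, theta), 3))
```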

The whole strategy reminds me not a little of the following tale:

Time after time you hear people speaking in baffled terms about mathematical models that somehow didn’t warn us in time, that were too complicated to understand, and so on. If you have somehow missed such public displays of throwing the model (and quants) under the bus, stay tuned below for examples.
But this is far from the case – most of the really enormous failures of models are explained by people lying …
A common response to these problems is to call for those models to be revamped, to add features that will cover previously unforeseen issues, and generally speaking, to make them more complex.

For a person like myself, who gets paid to “fix the model,” it’s tempting to do just that, to assume the role of the hero who is going to set everything right with a few brilliant ideas and some excellent training data.

Unfortunately, reality is staring me in the face, and it’s telling me that we don’t need more complicated models.

If I go to the trouble of fixing up a model, say by adding counterparty risk considerations, then I’m implicitly assuming the problem with the existing models is that they’re being used honestly but aren’t mathematically up to the task.

If we replace okay models with more complicated models, as many people are suggesting we do, without first addressing the lying problem, it will only allow people to lie even more. This is because the complexity of a model itself is an obstacle to understanding its results, and more complex models allow more manipulation …

I used to work at Riskmetrics, where I saw first-hand how people lie with risk models. But that’s not the only thing I worked on. I also helped out building an analytical wealth management product. This software was sold to banks, and was used by professional “wealth managers” to help people (usually rich people, but not mega-rich people) plan for retirement.

We had a bunch of bells and whistles in the software to impress the clients – Monte Carlo simulations, fancy optimization tools, and more. But in the end, the banks and their wealth managers put in their own market assumptions when they used it. Specifically, they put in the forecast market growth for stocks, bonds, alternative investing, etc., as well as the assumed volatility of those categories and indeed the entire covariance matrix representing how correlated the market constituents are to each other.

The result is this: no matter how honest I would try to be with my modeling, I had no way of preventing the model from being misused and misleading to the clients. And it was indeed misused: wealth managers put in absolutely ridiculous assumptions of fantastic returns with vanishingly small risk.

Cathy O’Neil
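
O'Neil's point about user-supplied assumptions is easy to reproduce. Below is a toy version of the kind of Monte Carlo retirement projection she describes (my own sketch, not any vendor's code): feed in "fantastic returns with vanishingly small risk" and the machinery dutifully produces a rosy forecast, however honest the simulation code itself is.

```python
import numpy as np

def project_wealth(mean_return, volatility, years=30, start=100_000.0,
                   n_paths=20_000, seed=7):
    """Toy Monte Carlo projection of terminal wealth under lognormal annual returns."""
    rng = np.random.default_rng(seed)
    annual = rng.normal(mean_return, volatility, size=(n_paths, years))
    terminal = start * np.exp(annual.sum(axis=1))   # log-returns compound additively
    return np.median(terminal), np.mean(terminal < start)

# The simulation engine is identical in both runs; only the user-supplied
# market assumptions differ.
for label, mu, sigma in [("honest-ish assumptions", 0.04, 0.18),
                         ("fantasy assumptions   ", 0.09, 0.05)]:
    median, p_loss = project_wealth(mu, sigma)
    print(f"{label}: median terminal wealth {median:12,.0f}, "
          f"P(end below start) {p_loss:.3f}")
```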

Soros & the theory of reflexivity

23 January, 2014 at 10:17 | Posted in Theory of Science & Methodology | 2 Comments

The Journal of Economic Methodology has published a special issue on the theory of reflexivity developed by George Soros.

The issue includes a new article by Soros and responses and critiques from 18 leading scholars in economics and the history and philosophy of science.

The issue can be accessed free of charge here.

Why Wall Street shorts economists and their DSGE models

22 January, 2014 at 11:15 | Posted in Economics | 34 Comments

Blogger Noah Smith recently did an informal survey to find out if financial firms actually use the “dynamic stochastic general equilibrium” models that encapsulate the dominant thinking about how the economy works. The result? Some do pay a little attention, because they want to predict the actions of central banks that use the models. In their investing, however, very few Wall Street firms find the DSGE models useful …

This should come as no surprise to anyone who has looked closely at the models. Can an economy of hundreds of millions of individuals and tens of thousands of different firms be distilled into just one household and one firm, which rationally optimize their risk-adjusted discounted expected returns over an infinite future? There is no empirical support for the idea. Indeed, research suggests that the models perform very poorly …

Why does the profession want so desperately to hang on to the models? I see two possibilities. Maybe they do capture some deep understanding about how the economy works … More likely, economists find the models useful not in explaining reality, but in telling nice stories that fit with established traditions and fulfill the crucial goal of getting their work published in leading academic journals …

Knowledge really is power. I know of at least one financial firm in London that has a team of meteorologists running a bank of supercomputers to gain a small edge over others in identifying emerging weather patterns. Their models help them make good profits in the commodities markets. If economists’ DSGE models offered any insight into how economies work, they would be used in the same way. That they are not speaks volumes.

Mark Buchanan/Bloomberg

[h/t Jan Milch]

Splendid article!

The unsellability of DSGE — private-sector firms do not pay lots of money to use DSGE models — is a strong argument against DSGE. But it is not the most damning critique of it.

In the basic DSGE models the labour market is always cleared – responding to a changing interest rate, expected lifetime incomes, or real wages, the representative agent maximizes the utility function by varying her labour supply, money holding and consumption over time. Most importantly – if the real wage somehow deviates from its “equilibrium value,” the representative agent adjusts her labour supply, so that when the real wage is higher than its “equilibrium value,” labour supply is increased, and when the real wage is below its “equilibrium value,” labour supply is decreased.

In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence, unemployment is totally voluntary. To be unemployed is something one optimally chooses to be.
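
To see why the unemployment in these models is voluntary by construction, consider a stripped-down sketch of the representative agent's labour-supply choice (a static stand-in for the intertemporal version, with arbitrary parameters): hours worked simply track the real wage, so any fall in employment is the agent optimally choosing more leisure.

```python
from scipy.optimize import minimize_scalar

def optimal_hours(real_wage, chi=2.0, eta=2.0):
    """Hours chosen by a representative agent maximizing
    c - chi * h**(1+eta) / (1+eta) subject to c = real_wage * h
    (a static stand-in for the labour-supply block of a basic DSGE model)."""
    def negative_utility(h):
        consumption = real_wage * h
        return -(consumption - chi * h ** (1 + eta) / (1 + eta))
    return minimize_scalar(negative_utility, bounds=(0.0, 2.0), method="bounded").x

for w in (1.0, 0.8, 0.6):
    # Every reduction in hours here is the agent's own optimal choice:
    # "unemployment" in this world is chosen leisure, never involuntary.
    print(f"real wage {w:.1f} -> chosen hours {optimal_hours(w):.3f}")
```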

Although this picture of unemployment as a kind of self-chosen optimality strikes most people as utterly ridiculous, there are also, unfortunately, a lot of neoclassical economists out there who still think that price and wage rigidities are the prime movers behind unemployment. DSGE models basically explain variations in employment (and a fortiori output) by assuming that nominal wages are more flexible than prices – disregarding the lack of empirical evidence for this rather counterintuitive assumption.

Lowering nominal wages would not clear the labour market. Lowering wages – and possibly prices – could, perhaps, lower interest rates and increase investment. It would be much easier to achieve that effect by increasing the money supply. In any case, wage reductions were not seen as a general substitute for an expansionary monetary or fiscal policy. And even if potentially positive impacts of lowering wages exist, there are also weightier negative impacts – deteriorating management-union relations, expectations of ongoing wage cuts causing investment to be delayed, debt deflation et cetera.

The classical proposition that lowering wages would lower unemployment and ultimately take economies out of depressions was ill-founded and basically wrong. Flexible wages would probably only make things worse by leading to erratic price fluctuations. The basic explanation for unemployment is insufficient aggregate demand, and that is mostly determined outside the labour market.

Obviously it’s rather embarrassing that the kind of DSGE models “modern” macroeconomists use cannot incorporate such a basic fact of reality as involuntary unemployment. Of course, given that these are representative agent models, this should come as no surprise. The only kind of unemployment that can occur is voluntary, since it is only by adjusting their hours of work that these optimizing agents maximize their utility.

To me, this — the inability to explain involuntary unemployment — is the most damning critique of DSGE.

Added 23:00 GMT: Paul Davidson writes in a lovely comment on this article:

In explaining why Samuelson’s “old” neoclassical synthesis Keynesianism and New Keynesianism theories have nothing to do with Keynes’s General Theory of Employment, I have continually quoted Keynes [from page 257 of The General Theory] who wrote “For the Classical Theory has been accustomed to rest the supposedly self-adjusting character of the economic system on the assumed fluidity of money-wages; and when there is rigidity, to lay on this rigidity the blame for maladjustment … My difference from this theory is primarily a difference of analysis”. This is in a chapter entitled “Changes in Money Wages” where Keynes explains why changes in money wages cannot guarantee full employment.

When, in a published debate with Milton Friedman in the JPKE – later published as a book entitled MILTON FRIEDMAN’S MONETARY FRAMEWORK: A DEBATE WITH HIS CRITICS – I pointed out this chapter of the General Theory to Milton, his response was that Davidson refers to many chapters in the back of the General Theory that have some interesting and relevant comments – but they are not part of Keynes’s theory, while fixity of wages and prices is essential to understanding Keynes.

In a verbal discussion with Paul Samuelson many years ago, I pointed out this chapter to Samuelson. His response was he found the General Theory “unpalatable” but liked the policy implications and therefore he [Samuelson] merely assumed the General Theory was a Walrasian system with fixity of wages and prices!

Neoclassical figments of the imagination

22 January, 2014 at 09:41 | Posted in Economics | Comments Off on Neoclassical figments of the imagination

Someone is seen beating a dusty rug with a big stick.

From this one draws the conclusion that big sticks are well suited for cleaning dusty things, such as, say, windows. Most people would regard that conclusion as mad.

Why?

Presumably because we know enough about the nature of sticks and windows both to see the problem with that particular cleaning technique and to experiment with alternative ways of reaching the goal: a clean window.

The analogy above is presented by the British economist Tony Lawson. He continues: let us now imagine a situation in which the stick-cleans-window model is tested over and over again, producing shards of glass every time. What if the conclusion drawn is that one simply has to “try a little harder”?

Perhaps the wrong windows were used in the trials; perhaps success is just around the corner. Adherents of such a theory could undeniably be called dogmatists.

They would, in any case, not be allowed to serve as janitors for very long. But this, Lawson notes, is unfortunately a working analogy for the state of contemporary orthodox economics. Well, except for the part about the janitor getting sacked.

Lawson wrote this in the mid-1990s. Rather little has happened since then. That is, as far as concerns the premises of the economics that is taught at universities, that dominates the commanding heights of the academic discipline and that supplies the premises for political and administrative decisions.

In reality, quite a lot has happened. Among other things, recurrent financial and currency crises around the world, speculative “innovations” that helped inflate bubbles which have since burst and splattered hundreds of millions of people, and ongoing “crisis management” that pushes whole societies into long-term poverty and stagnation.

All this while leading economists have been writing out prescriptions built on fantasies about self-correcting markets. Stick meets window. Entire departments discuss swing techniques.

Ali Esbati/Magasinet Arena

Chant d’exil

21 January, 2014 at 13:14 | Posted in Varia | Comments Off on Chant d’exil

 

De döda skall inte tiga men tala

20 January, 2014 at 19:43 | Posted in Politics & Society, Varia | Comments Off on De döda skall inte tiga men tala
To Fadime Sahindal, born 2 April 1975 in Turkey, murdered 21 January 2002 in Sweden

In Sweden we have long uncritically embraced an unspecified and undefined multiculturalism. If by multiculturalism we mean that our society contains several different cultures, this poses no problem. Then we are all multiculturalists. But if by multiculturalism we mean that cultural belonging and identity also carry with them specific moral, ethical and political rights and obligations, we are talking about something entirely different. Then we are talking about normative multiculturalism. And accepting normative multiculturalism also means tolerating unacceptable intolerance, since normative multiculturalism implies that the rights of specific cultural groups may be accorded higher standing than the universal human rights of the individual citizen, and thereby indirectly becomes a defence of those groups’ (possible) intolerance. In a normatively multiculturalist society, institutions and rules can be used to restrict people’s freedom on the basis of unacceptable and intolerant cultural values.

Normative multiculturalism means that individuals are unacceptably reduced to passive members of culture-bearing or identity-bearing groups. But tolerance does not mean that we must take a value-relativist attitude towards identity and culture. Those who, by their actions in our society, show that they do not respect other people’s rights cannot expect us to be tolerant towards them.

If we want to safeguard the achievements of modern democratic society, we cannot embrace a normative multiculturalism. In a modern democratic society the rule of law must apply, and apply to everyone!

Towards those in our society who want to force others to live according to their own religious, cultural or ideological beliefs and taboos, society must be intolerant. Towards those who want to force society to adapt its laws and rules to their own religion’s, culture’s or group’s interpretations, society must be intolerant.


DE DÖDA

De döda skall icke tiga men tala.
Förskingrad plåga skall finna sin röst,
och när cellernas råttor och mördarnas kolvar
förvandlats till aska och urgammalt stoft
skall kometens parabel och stjärnornas vågspel
ännu vittna om dessa som föll mot sin mur:
tvagna i eld men inte förbrunna till glöd,
förtrampade slagna men utan ett sår på sin kropp,
och ögon som stirrat i fasa skall öppnas i frid,
och de döda skall icke tiga men tala.

Om de döda skall inte tigas men talas.
Fast stympade strypta i maktens cell,
glasartade beledda i cyniska väntrum
där döden har klistrat sin freds propaganda,
skall de vila länge i samvetets montrar.
balsamerade av sanning och tvagna i eld,
och de som redan har stupat skall icke brytas,
och den som tiggde nåd i ett ögonblicks glömska
skall resa sig och vittna om det som inte brytes,
för de döda skall inte tiga men tala.

Nej, de döda skall icke tiga men tala.
De som kände triumf på sin nacke skall höja sitt huvud,
och de som kvävdes av rök skall se klart,
de som pinades galna skall flöda som källor,
de som föll för sin motsats skall själva fälla,
de som dräptes med bly skall dräpa med eld,
de som vräktes av vågor skall själva bli storm.
Och de döda skall icke tiga men tala.

                                           Erik Lindegren

The pernicious impact of the widening wealth gap

20 January, 2014 at 18:19 | Posted in Politics & Society | 1 Comment

The 85 richest people on the planet have accumulated as much wealth between them as half of the world’s population, political and financial leaders have been warned ahead of their annual gathering in the Swiss resort of Davos.

The tiny elite of multibillionaires, who could fit into a double-decker bus, have piled up fortunes equivalent to the wealth of the world’s poorest 3.5bn people, according to a new analysis by Oxfam. The charity condemned the “pernicious” impact of the steadily growing gap between a small group of the super-rich and hundreds of millions of their fellow citizens, arguing it could trigger social unrest.

It released the research on the eve of the World Economic Forum, starting on Wednesday, which brings together many of the most influential figures in international trade, business, finance and politics including David Cameron and George Osborne. Disparities in income and wealth will be high on its agenda, along with driving up international health standards and mitigating the impact of climate change.

Oxfam said the world’s richest 85 people boast a collective worth of $1.7trn (£1trn). Top of the pile is Carlos Slim Helu, the Mexican telecommunications mogul, whose family’s net wealth is estimated by Forbes business magazine at $73bn. He is followed by Bill Gates, the Microsoft founder and philanthropist, whose worth is put at $67bn and is one of 31 Americans on the list.

Nigel Morris/The Independent

[h/t Jan Milch]

Walked-out Harvard economist and George Bush advisor Greg Mankiw wrote an article last year on the “just desert” of the one percent, arguing that a market economy is some kind of moral free zone where, if left undisturbed, people get what they “deserve.”

This should come as no surprise. Most neoclassical economists actually have a more or less Panglossian view on unfettered markets. Add to that a neoliberal philosophy of a Robert Nozick or a David Gauthier, and you get Mankiwian nonsense on the growing inequality.

A society where we allow the inequality of incomes and wealth to increase without bounds sooner or later implodes. The cement that keeps us together erodes, and in the end we are left only with people dipped in the ice-cold water of egoism and greed.

Top 30 Heterodox Economics Blogs

19 January, 2014 at 17:48 | Posted in Varia | 2 Comments


I. Post Keynesian Blogs
(1) Debt Deflation, Steve Keen
http://www.debtdeflation.com/blogs/

(2) Post Keynesian Economics Study Group
http://www.postkeynesian.net/
This is not strictly a blog, but it is a great resource!

(3) Real-World Economics Review Blog
http://rwer.wordpress.com/

(4) Naked Keynesianism
http://nakedkeynesianism.blogspot.com/

(5) Lars P. Syll’s Blog
https://larspsyll.wordpress.com/
Lars P. Syll’s blog is an excellent resource, and the posts are wide-ranging and frequent.

(6) Philip Pilkington, Fixing the Economists
http://fixingtheeconomists.wordpress.com/
Philip Pilkington (of Nakedcapitalism.com) has started blogging here again. A great blog.

(7) Thoughts on Economics, Robert Vienneau
http://robertvienneau.blogspot.com/
Robert Vienneau’s blog has lots of advanced posts on Post Keynesianism economic theory.

(8) Unlearningeconomics Blog
http://unlearningeconomics.wordpress.com/
I believe “Unlearningeconomics” has wound down the blog recently, which is a pity because it was a great blog.

(9) Social Democracy for the 21st Century
http://socialdemocracy21stcentury.blogspot.com/

(10) Ramanan, The Case For Concerted Action
http://www.concertedaction.com/

(11) Yanis Varoufakis, Thoughts for the Post-2008 World
http://yanisvaroufakis.eu/

(12) Dr. Thomas Palley, PhD. in Economics (Yale University)
http://www.thomaspalley.com/
Unfortunately, Thomas Palley only has new posts infrequently, but it is a good read.

(13) Debtonation.org, Ann Pettifor blog
http://www.debtonation.org/

II. Modern Monetary Theory (MMT)/Neochartalism 
(14) Billy Blog, Bill Mitchell
http://bilbo.economicoutlook.net/blog/

(15) New Economic Perspectives
http://neweconomicperspectives.org/

(16) Mike Norman Economics Blog
http://mikenormaneconomics.blogspot.com/

(17) Warren Mosler, The Center of the Universe
http://moslereconomics.com/

(18) Centre of Full Employment and Equity (CofFEE)
http://e1.newcastle.edu.au/coffee/

III. Other Heterodox Blogs and Resources
(19) Prime, Policy Research in Macroeconomics
http://www.primeeconomics.org/

(20) Michael Hudson
http://michael-hudson.com/

(21) New Economics Foundation
http://www.neweconomics.org/

(22) Heteconomist.com
http://heteconomist.com/

(23) Econospeak Blog
http://econospeak.blogspot.com/

(24) James Galbraith
http://utip.gov.utexas.edu/JG/publications.html

(25) Robert Skidelsky’s Official Website
http://www.skidelskyr.com/

(26) The Other Canon
http://www.othercanon.org/

(27) Levy Economics Institute of Bard College
http://www.levyinstitute.org/

(28) Multiplier Effect, Levy Economics Institute Blog
http://www.multiplier-effect.org/

(29) John Quiggin
http://johnquiggin.com/

(30) The Progressive Economics Forum
http://www.progressive-economics.ca/

Lord Keynes

The Minsky moment of the Swedish housing bubble

17 January, 2014 at 20:20 | Posted in Economics | 1 Comment

The Swedish housing market and the Swedish economy have become much more fragile because of recent (post-1999) developments in the Swedish housing market. Houses have increasingly become assetized. Home ownership as well as mortgage indebtedness of Swedish households has increased, while the house price level (about +125%) increased dramatically compared with either the wage level (about +45% for a 1.33-jobs, two-children family, courtesy of Eurostat) or the consumer price level (about +27%). This means that more households will experience larger declines in net worth when prices go down, while the probability of such an occurrence has increased exactly because of the high level of prices and the increase in indebtedness of households. Which means that the remarks of Lars Svensson that the Swedish housing market is sound because house prices can be explained by fundamentals are beside the point. It’s not about a bubble, yes or no. It’s all about fragility …

And fragility has increased. A ‘Minsky moment’ can change the relation between fundamentals (incomes, the price level, interest rates, demographics, taxes) and the amount which households are allowed to lend overnight …

When, after such a moment, house price decreases lower the perceived collateral value of other houses, this can easily lead to a deflationary house price spiral which will have larger consequences when mortgage debt levels of households are higher and more households have mortgage debt. The Netherlands post-Lehman are a perfect example … Dutch economists had an excuse. They had probably not yet read Minsky, while the Reinhart and Rogoff ‘this time is different’ book, which spells out the inherent historical instability of our monetary system, still had to appear. But Svensson has no such excuse. And surely not, as such events have happened before, even as recently as around 1990, in Sweden.

Merijn Knibbe
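
A quick back-of-the-envelope reading of the figures Knibbe cites (house prices up about 125 per cent, wages about 45 per cent, consumer prices about 27 per cent) shows how far prices have outrun the incomes that have to service the mortgages:

```python
# Rough arithmetic on the approximate post-1999 changes quoted above.
house_prices, wages, cpi = 2.25, 1.45, 1.27   # growth factors for +125%, +45%, +27%

print("house prices relative to wages  :", round(house_prices / wages, 2))   # ~1.55
print("real (CPI-deflated) house prices:", round(house_prices / cpi, 2))     # ~1.77
# i.e. roughly a 55% rise in the price-to-wage ratio and a 77% rise in real
# house prices, which is what the fragility argument turns on.
```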

On limiting model assumptions in econometrics (wonkish)

17 January, 2014 at 11:05 | Posted in Statistics & Econometrics | 3 Comments

In Andrew Gelman’s and Jennifer Hill’s Data Analysis Using Regression and Multilevel/Hierarchical Models, the authors list the assumptions of the linear regression model. At the top of the list are validity and additivity/linearity, followed by different assumptions pertaining to error characteristics.

Yours truly can’t but concur, especially on the “decreasing order of importance” of the assumptions. But then, of course, one really has to wonder why econometrics textbooks — almost invariably — turn this order of importance upside down and don’t have more thorough discussions of the overriding importance of Gelman/Hill’s first two points …

Since econometrics doesn’t content itself with only making “optimal predictions,” but also aspires to explain things in terms of causes and effects, econometricians need loads of assumptions — and most important of these are validity and additivity.
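
A small simulated illustration of why the top of that list matters most (all data invented for the purpose): when the true relation is non-additive, an additive linear regression still delivers tidy-looking coefficients and innocuous residual moments, while completely misrepresenting how the inputs actually combine.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
x1, x2 = rng.normal(size=n), rng.normal(size=n)

# True data-generating process is NOT additive in x1 and x2.
y = x1 + x2 + 1.5 * x1 * x2 + rng.normal(scale=1.0, size=n)

# Fit the usual additive linear regression y ~ x1 + x2.
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta

print("fitted additive coefficients:", np.round(beta[1:], 2))   # both close to 1
print("residual mean / skewness    :", round(residuals.mean(), 3),
      round(((residuals - residuals.mean()) ** 3).mean() / residuals.std() ** 3, 3))

# The additive model says the effect of x1 is the same everywhere; in truth it
# ranges from strongly negative to strongly positive depending on x2.
for x2_value in (-2.0, 0.0, 2.0):
    print(f"true marginal effect of x1 at x2 = {x2_value:+.0f}:", 1 + 1.5 * x2_value)
```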

Let me take the opportunity to cite one of my favourite introductory statistics textbooks on one further reason these assumptions are made — and why they ought to be much more argued for on both epistemological and ontological grounds when used (emphasis added):

In a hypothesis test … the sample comes from an unknown population. If the population is really unknown, it would suggest that we do not know the standard deviation, and therefore, we cannot calculate the standard error. To solve this dilemma, we have made an assumption. Specifically, we assume that the standard deviation for the unknown population (after treatment) is the same as it was for the population before treatment.

Actually this assumption is the consequence of a more general assumption that is part of many statistical procedures. The general assumption states that the effect of the treatment is to add a constant amount to … every score in the population … You should also note that this assumption is a theoretical ideal. In actual experiments, a treatment generally does not show a perfect and consistent additive effect.
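
A minimal simulation of the assumption described in the quote (all numbers invented): if the treatment really adds a constant to every score, the post-treatment standard deviation equals the pre-treatment one and the borrowed standard error is right; if the effect is, say, proportional instead, the borrowed standard error is simply wrong.

```python
import numpy as np

rng = np.random.default_rng(11)
population = rng.normal(loc=50, scale=10, size=100_000)   # scores before treatment

additive_effect = population + 5          # treatment adds a constant to every score
proportional_effect = population * 1.10   # a plausible non-additive alternative

print("sd before treatment        :", round(population.std(), 2))
print("sd, additive treatment     :", round(additive_effect.std(), 2))      # unchanged
print("sd, proportional treatment :", round(proportional_effect.std(), 2))  # inflated

# The hypothesis-test recipe in the quote borrows the pre-treatment sd to compute
# the standard error for the treated sample; that step is only justified in the
# additive case.
n = 25
print("standard error assumed by the test       :",
      round(population.std() / np.sqrt(n), 2))
print("actual standard error (proportional case):",
      round(proportional_effect.std() / np.sqrt(n), 2))
```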

A standard view among econometricians is that their models — and the causality they may help us to detect — are only in the mind. From a realist point of view, this is rather untenable. The reason we as scientists are interested in causality is that it’s a part of the way the world works. We represent the workings of causality in the real world by means of models, but that doesn’t mean that causality isn’t a fact pertaining to relations and structures that exist in the real world. If it was only “in the mind,” most of us couldn’t care less.

The econometricians’ nominalist-positivist view of science and models is the belief that science can only deal with observable regularity patterns of a more or less lawlike kind. Only data matters, and trying to (ontologically) go beyond observed data in search of the underlying real factors and relations that generate the data is not admissible. Everything has to take place in the econometric mind’s model, since the real factors and relations are, according to the econometric (epistemologically based) methodology, beyond reach, allegedly being both unobservable and unmeasurable. This also means that instead of treating the model-based findings as interesting clues for digging deeper into real structures and mechanisms, they are treated as the end points of the investigation.

The critique put forward here is in line with what mathematical statistician David Freedman writes in Statistical Models and Causal Inference (2010):

In my view, regression models are not a particularly good way of doing empirical work in the social sciences today, because the technique depends on knowledge that we do not have. Investigators who use the technique are not paying adequate attention to the connection – if any – between the models and the phenomena they are studying. Their conclusions may be valid for the computer code they have created, but the claims are hard to transfer from that microcosm to the larger world …

Given the limits to present knowledge, I doubt that models can be rescued by technical fixes. Arguments about the theoretical merit of regression or the asymptotic behavior of specification tests for picking one version of a model over another seem like the arguments about how to build desalination plants with cold fusion as the energy source. The concept may be admirable, the technical details may be fascinating, but thirsty people should look elsewhere …

Causal inference from observational data presents many difficulties, especially when underlying mechanisms are poorly understood. There is a natural desire to substitute intellectual capital for labor, and an equally natural preference for system and rigor over methods that seem more haphazard. These are possible explanations for the current popularity of statistical models.

Indeed, far-reaching claims have been made for the superiority of a quantitative template that depends on modeling – by those who manage to ignore the far-reaching assumptions behind the models. However, the assumptions often turn out to be unsupported by the data. If so, the rigor of advanced quantitative methods is a matter of appearance rather than substance.

Econometrics is basically a deductive method. Given the assumptions (such as manipulability, transitivity, separability, additivity, linearity etc) it delivers deductive inferences. The problem, of course, is that we will never completely know when the assumptions are right. Real target systems are seldom epistemically isomorphic to axiomatic-deductive models/systems, and even if they were, we still have to argue for the external validity of the conclusions reached from within these epistemically convenient models/systems. Causal evidence generated by statistical/econometric procedures like regression analysis may be valid in “closed” models, but what we usually are interested in, is causal evidence in the real target system we happen to live in.

Most advocates of econometrics and regression analysis want to have deductively automated answers to fundamental causal questions. Econometricians think – as David Hendry expressed it in Econometrics – alchemy or science? (1980) – they “have found their Philosophers’ Stone; it is called regression analysis and is used for transforming data into ‘significant results!'” But as David Freedman poignantly notes in Statistical Models: “Taking assumptions for granted is what makes statistical techniques into philosophers’ stones.” To apply “thin” methods we have to have “thick” background knowledge of what’s going on in the real world, and not in idealized models. Conclusions can only be as certain as their premises – and that also applies to the quest for causality in econometrics and regression analysis.

Without requirements of depth, explanations most often do not have practical significance. Only if we search for and find fundamental structural causes, can we hopefully also take effective measures to remedy problems like e.g. unemployment, poverty, discrimination and underdevelopment. A social science must try to establish what relations exist between different phenomena and the systematic forces that operate within the different realms of reality. If econometrics is to progress, it has to abandon its outdated nominalist-positivist view of science and the belief that science can only deal with observable regularity patterns of a more or less law-like kind. Scientific theories ought to do more than just describe event-regularities and patterns – they also have to analyze and describe the mechanisms, structures, and processes that give birth to these patterns and eventual regularities.

Limiting model assumptions in economic science always have to be closely examined: if we are to show that the mechanisms or causes we isolate and handle in our models are stable, in the sense that they do not change when we “export” them to our “target systems”, we have to be able to show that they do not hold only under ceteris paribus conditions, and hence are not merely of limited value for our understanding, explanations or predictions of real economic systems. As the always eminently quotable Keynes writes (emphasis added) in Treatise on Probability (1921):

The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be [that] the system of the material universe must consist of bodies … such that each of them exercises its own separate, independent, and invariable effect, a change of the total state being compounded of a number of separate changes each of which is solely due to a separate portion of the preceding state … Yet there might well be quite different laws for wholes of different degrees of complexity, and laws of connection between complexes which could not be stated in terms of laws connecting individual parts … If different wholes were subject to different laws qua wholes and not simply on account of and in proportion to the differences of their parts, knowledge of a part could not lead, it would seem, even to presumptive or probable knowledge as to its association with other parts … These considerations do not show us a way by which we can justify induction … [p. 427] No one supposes that a good induction can be arrived at merely by counting cases. The business of strengthening the argument chiefly consists in determining whether the alleged association is stable, when accompanying conditions are varied … [p. 468] In my judgment, the practical usefulness of those modes of inference … on which the boasted knowledge of modern science depends, can only exist … if the universe of phenomena does in fact present those peculiar characteristics of atomism and limited variety which appears more and more clearly as the ultimate result to which material science is tending.

Econometrics may be an informative tool for research. But if its practitioners do not investigate and make an effort to provide a justification for the credibility of the assumptions on which they erect their building, it will not fulfill its tasks. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics will continue to consider its ultimate argument as a mixture of rather unhelpful metaphors and metaphysics. Maintaining that economics is a science in the “true knowledge” business, yours truly remains a skeptic of the pretences and aspirations of econometrics. So far, I cannot really see that it has yielded very much in terms of relevant, interesting economic knowledge.

The marginal return on its ever higher technical sophistication in no way makes up for the lack of serious under-labouring of its deeper philosophical and methodological foundations that Keynes already complained about. The rather one-sided emphasis on usefulness and its concomitant instrumentalist justification cannot hide that neither Haavelmo, nor the legions of probabilistic econometricians following in his footsteps, give supportive evidence for their considering it “fruitful to believe” in the possibility of treating unique economic data as the observable results of random drawings from an imaginary sampling of an imaginary population. After having analyzed some of its ontological and epistemological foundations, I cannot but conclude that econometrics on the whole has not delivered “truth”. And I doubt if it has ever been the intention of its main protagonists.

Our admiration for technical virtuosity should not blind us to the fact that we have to have a cautious attitude towards probabilistic inferences in economic contexts. Science should help us penetrate to “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts” [Keynes 1971-89 vol XVII:427]. We should look out for causal relations, but econometrics can never be more than a starting point in that endeavour, since econometric (statistical) explanations are not explanations in terms of mechanisms, powers, capacities or causes. Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that there are other variables – of vital importance and although perhaps unobservable and non-additive, not necessarily epistemologically inaccessible – that were not considered for the model. Those that were can hence never be guaranteed to be more than potential causes, and not real causes. A rigorous application of econometric methods in economics really presupposes that the phenomena of our real world economies are ruled by stable causal relations between variables. A perusal of the leading econom(etr)ic journals shows that most econometricians still concentrate on fixed parameter models and that parameter-values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption one, however, has to convincingly establish that the targeted acting causes are stable and invariant so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.

Real world social systems are not governed by stable causal mechanisms or capacities. The kinds of “laws” and relations that econometrics has established are laws and relations about entities in models that presuppose causal mechanisms being atomistic and additive. When causal mechanisms operate in real world social target systems they only do it in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain they do it (as a rule) only because we engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately that also makes most of the achievements of econometrics – as most of contemporary endeavours of mainstream economic theoretical modeling – rather useless.

Forecasting and prediction — the illusion of control

16 January, 2014 at 19:23 | Posted in Economics, Statistics & Econometrics | 1 Comment

 

Yours truly on microfoundations in Real-World Economics Review

15 January, 2014 at 14:20 | Posted in Economics | 1 Comment

Yours truly has a paper on microfoundations — Micro versus Macro — in the latest issue of Real-World Economics Review (January 2014).

To read the other papers of the new issue of RWER — click here.

