Bayesianism and noninformative priors – lost causes (wonkish)

14 July, 2013 at 12:00 | Posted in Statistics & Econometrics, Theory of Science & Methodology | Comments Off on Bayesianism and noninformative priors – lost causes (wonkish)

I like to say that noninformative priors are the perpetual motion machines of statistics. Everyone wants one but they don’t exist.

By definition, a prior represents information. So it should come as no surprise that a prior cannot represent lack of information.

The first “noninformative prior” was of course the flat prior. The major flaw with this prior is lack of invariance: if it is flat in one parameterization it will not be flat in most other parameterizations. Flat priors have lots of other problems too. See my earlier post here.
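
A quick numerical illustration of the invariance problem (my addition, not Wasserman's): a prior that is flat for a Bernoulli parameter p is decidedly non-flat for the log-odds φ = log(p/(1 − p)).

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw from a "flat" prior on the Bernoulli parameter p ...
p = rng.uniform(0.0, 1.0, size=1_000_000)
# ... and express the very same prior in the log-odds parameterization.
phi = np.log(p / (1.0 - p))

# By the change-of-variables formula the induced density is
# f(phi) = e^phi / (1 + e^phi)^2 -- the logistic density, peaked at 0.
hist, edges = np.histogram(phi, bins=np.linspace(-6.0, 6.0, 25), density=True)
for left, h in zip(edges[:-1], hist):
    print(f"phi ~ {left:+4.1f} | {'#' * int(200 * h)}")
```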

The most famous noninformative prior (I’ll stop putting quotes around the phrase from now on) is Jeffreys prior which is proportional to the square root of the determinant of the Fisher information matrix. While this prior is invariant, it can still have undesirable properties …
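
For reference (my notation, not from the post), the definition is

```latex
\pi(\theta) \;\propto\; \sqrt{\det I(\theta)}, \qquad
I(\theta) \;=\; -\,\mathbb{E}\!\left[\frac{\partial^{2}\log f(X\mid\theta)}{\partial\theta\,\partial\theta^{\top}}\right],
```

which for the Bernoulli model gives I(p) = 1/(p(1 − p)) and hence π(p) ∝ p^(−1/2)(1 − p)^(−1/2), the Beta(1/2, 1/2) prior.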

A more fundamental question is: what does it mean for a prior to be noninformative? Of course, people have argued about this for many, many years. One definition, which has the virtue of being somewhat precise, is that a prior is noninformative if the 1 − α posterior regions have frequentist coverage equal (approximately) to 1 − α. These are sometimes called “matching priors.”
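
What “matching” means can be checked directly by simulation (again my sketch, not Wasserman's): for binomial data the Jeffreys prior is Beta(1/2, 1/2), and the frequentist coverage of its 95% posterior intervals comes out close to 95%.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, alpha, true_p = 30, 0.05, 0.3   # hypothetical sample size and true parameter

trials, covered = 20_000, 0
for _ in range(trials):
    k = rng.binomial(n, true_p)
    post = stats.beta(0.5 + k, 0.5 + n - k)            # posterior under Jeffreys
    lo, hi = post.ppf(alpha / 2), post.ppf(1 - alpha / 2)
    covered += lo <= true_p <= hi

print(f"coverage ~ {covered / trials:.3f} (target {1 - alpha})")
```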

In general, it is hard to construct matching priors, especially in high-dimensional complex models. But matching priors raise a fundamental question: if your goal is to match frequentist coverage, why bother with Bayes at all? Just use a frequentist confidence interval.

These days I think that most people agree that the virtue of Bayesian methods is that they give you a way to include prior information in a systematic way. There is no reason to formulate a “noninformative prior.”

On the other hand, in practice, we often deal with very complex, high-dimensional models. Can we really formulate a meaningful informative prior in such problems? And if we do, will anyone care about our inferences?

In 1996, I wrote a review paper with Rob Kass on noninformative priors (Kass, Robert E and Wasserman, Larry. (1996). The selection of prior distributions by formal rules. Journal of the American Statistical Association, 91, 1343-1370) … One of our conclusions was:

“We conclude that the problems raised by the research on priors chosen by formal rules are serious and may not be dismissed lightly: When sample sizes are small (relative to the number of parameters being estimated), it is dangerous to put faith in any default solution; but when asymptotics take over, Jeffreys’s rules and their variants remain reasonable choices.”

Looking at this almost twenty years later, the one thing that has changed is “the number of parameters being estimated,” which these days is often very, very large.

My conclusion: noninformative priors are a lost cause.

Larry Wasserman

 

 


WYSIATI and market failures

14 July, 2013 at 11:05 | Posted in Economics | Comments Off on WYSIATI and market failures

Jumping to conclusions on the basis of limited evidence is so important to an understanding of intuitive thinking … that I will use a cumbersome abbreviation for it: WYSIATI, which stands for what you see is all there is …

I have had several occasions to ask founders and participants in innovative start-ups a question: To what extent will the outcome of your effort depend on what you do in your firm? This is evidently an easy question; the answer comes quickly and in my small sample it has never been less than 80% … They are surely wrong: the outcome of a start-up depends as much on the achievements of its competitors and on changes in the markets as on its own efforts …

The familiar processes of WYSIATI and substitution produce both competition neglect and the above-average effect. The consequence of competition neglect is excess entry: more competitors enter the market than the market can profitably sustain, so their average outcome is a loss.

Daniel Kahneman

David Graeber, debts and jubilees

14 July, 2013 at 10:30 | Posted in Economics, Politics & Society | Comments Off on David Graeber, debts and jubilees

Debt can be both a prison and a dream factory … This dual nature of debt is also the point of departure for the anthropologist David Graeber's ambitious and much-discussed doorstopper on the subject, Skuld. De första 5000 åren [Debt: The First 5,000 Years] (trans. Joel Nordqvist; Daidalos, 537 pp.), which opens with an old American saying: if you owe a million, the bank owns you; if you owe a billion, you own the bank …

The institutions that have been created, such as the International Monetary Fund (IMF), exist to protect those who have lent out money, according to the principle that debts must be paid.

Crisis policy in Europe has been built on the same idea. When the IMF, together with the EU, has rescued countries, the help has consisted of yet more loans, conditioned on tough austerity demands – a combination that has turned the lifebuoy into a lead vest. The result is well known: social misery and popular discontent.

This is also what sets the present era apart from history. Debt crises are certainly nothing new. What is missing today is an effective mechanism for resolving them. Here history may have something to teach us. In Mesopotamia, during the new year ceremonies of 2350 BC, King Uruinimgina proclaimed a general debt amnesty in which outstanding loans were annulled …

When Hammurabi of Babylon wrote off debts in 1761 BC, the purpose was to ensure that the strong could not oppress the weak. The Bible, too, speaks of jubilee years every fifty years in which debts are forgiven – something that has also happened in modern times. Robert Kuttner writes in an article in the New York Review of Books that Germany, one of the driving forces behind today's European crisis policy, built its own economic prosperity on one of the most magnanimous debt write-downs ever: in 1948, fully 93 percent of the debt burden was dissolved. The Allies were anxious not to repeat the mistakes made after the First World War. Germany's debt level shrank from 675 percent (!) of GDP in 1939 to 12 percent in the early 1950s, laying the foundation for the post-war economic miracle …

The sacred principle that everyone must pay their debts has been exposed as a lie, he argues. As it turns out, not everyone has to pay their debts – only some of us do. It is a highly topical intervention in the debate. Many economists have called for ways of conjuring away parts of the large debts with the stroke of a pen instead of with austerity zeal.

When the IMF released a self-critical report in early June on its handling of the Greek debt crisis, one of the conclusions was that too much consideration had been given to the creditors' interests. The Greek debts should have been written down at a much earlier stage, since the debt burden was plainly unsustainable in any case. A seemingly newfound insight that is almost as old as the concept of debt itself.

Andres Cervenka

How free markets corrode your character

13 July, 2013 at 15:52 | Posted in Economics | Comments Off on How free markets corrode your character

For centuries, economists have argued that nothing beats a free market for efficiency. Unfettered competition leads to lower prices and better products, more innovation and greater choice. But market forces may also make people less ethical and more selfish.

There is no shortage of speculation on the topic by philosophers and social scientists. But the supply of empirical evidence on the topic has not kept up with the demand. Armin Falk and Nora Szech, two behavioural economists from Germany, have made a contribution.

In a series of experiments, a group of German volunteers were offered a choice between saving the life of a mouse or receiving a cash payment and having the mouse killed. When asked in private, 54 percent chose to forego 10 euros to keep the creature alive. But when the value of the murine life was negotiated by two volunteers in a marketplace, only a quarter of the people chose mercy. And when multiple buyers and sellers were around, the monetary value of a mouse’s life quickly fell far below 10 euros. For people deciding on their own, the experimenters had to offer 47 euros to induce a similar willingness to accept a mouse-death.

The economists used so-called “surplus” mice for their experiment, which would ordinarily have been killed because they cannot be used for research. They noted in an article entitled “Morals and Markets” in the journal Science that, “as a consequence of our experiment, many mice that would otherwise have been killed right away were allowed to live for roughly two years.”

The experimental economists drew the stark conclusion: “Market interaction displays a tendency to lower moral values, relative to individually stated preferences.” It seems that the impersonal market diminishes the feeling of personal responsibility. Guilt is divided when trading takes the place of explicit individual decisions. The observation of others engaging in ethically questionable behaviour might increase the willingness to follow suit.

The disconnect between conscience and price can be seen outside of the behavioural economists’ laboratory. The same people who are privately appalled by the horrific working conditions of textile workers in poor countries are quite happy to search for bargains on the latest fashions.

The mice saved in the Falk and Szech experiment lived an additional two years. The sociological lesson could last much longer. Even if free market forces can make activities such as education and health care more efficient, bargaining and competition bring with them a significant hidden moral cost.

Olaf Storbeck

The Almedalen nonsense

13 July, 2013 at 15:41 | Posted in Politics & Society | Comments Off on The Almedalen nonsense

I have recently returned from a trip to Germany, which in many ways is a proper country. Watching TV at the hotel, I am above all spared the cosy petit-bourgeois Gärdet crowd; instead, whatever the channel, I get a classic newsreader with a neutral expression …

Since German is a somewhat bigger language than Swedish, there are real newspapers with daily publication, for example Junge Welt. It can be bought in most ordinary newsstands, unimpeded by Sweden's peculiar distribution system, even though the paper is avowedly Marxist. The world suddenly becomes a little bigger than if you read Proletären, and vastly bigger than if you read the smug morning paper DN, which can barely see past the city limits without the help of TT and AP.

But above all: returning to the parochial banana monarchy of Sweden, I am blissfully unaware that anything has been going on on a peripheral island in the Baltic. I had actually managed to repress the silly market carnival of Almedalen. In my naivety I had imagined that the newsrooms would finally grasp that sending people on a paid mingling holiday to Gotland is worthless journalism – or rather, hardly journalism at all. I am of course being naive, since politicians, journalists and marketing people all share an interest in pretending that the carnival matters for ‘democracy’. In reality the whole crowd wants a free holiday and free canapés. Of the public expected to follow all this, 99% could not care less what happens in the market stalls, since it is so obvious that the whole thing resembles detergent commercials on TV4. Despite my fairly strong interest in politics, it would never occur to me to spend time on Aftonbladet's live broadcasts or similar drivel. Shut the thing down – it might even lower taxes a little, which as we know is the bourgeois parties' overriding goal.

I cannot understand why there was nothing about Almedalen in the Frankfurter Allgemeine either – most major events usually get a comment there …

barnilsson

On Milton Friedman’s “overcleverness”

13 July, 2013 at 12:57 | Posted in Economics, Theory of Science & Methodology | 2 Comments

 

Paul Diesing, “Hypothesis Testing and Data Interpretation: The Case of Milton Friedman,” Research in the History of Economic Thought and Methodology, 1985, vol. 3, pp. 61-69.

[h/t Jan Milch]

What we do in life echoes in eternity

13 July, 2013 at 12:29 | Posted in Education & School | Comments Off on What we do in life echoes in eternity

 

In her breathtakingly simple, moving and beautiful speech, Malala Yousafzai yesterday wrote herself into history. A more forthright plaidoyer for what really can change the world – empowering knowledge and education for all – has seldom been heard. Malala is living proof that not even the most heinous totalitarianism can defeat young people’s call for education and justice.

Why assuming ergodicity makes economics totally irrelevant

11 July, 2013 at 13:10 | Posted in Economics, Theory of Science & Methodology | 4 Comments

Suppose I offer you a simple gamble. Roll a die: if you get a six, you win $10; if not, you lose $1. The loss is more likely; the win brings more money. Willing to play?

The generally accepted way for deciding in such cases — developed originally by the French mathematician Blaise Pascal in the 17th century — is to think of probabilities. The outcome will always be a win or loss, but imagine playing millions of times. What will happen on average?

Clearly, you’ll lose $1 about five times out of six, and you’ll win $10 about one time out of six. Over many gambles, this averages out to about 83 cents per try. Hence, the gamble has a positive “expected” payoff and is worth it, even if the gain is trifling. Play a million times and you’re sure to win big.
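
Spelled out (my arithmetic, matching the 83 cents in the text):

```latex
\mathbb{E}[\text{payoff}]
  \;=\; \tfrac{1}{6}\cdot 10 \;+\; \tfrac{5}{6}\cdot(-1)
  \;=\; \tfrac{5}{6}
  \;\approx\; 0.83\ \text{dollars per play}.
```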

But here’s something odd. Suppose I offer precisely the same gamble, only scaled up. Roll a six and you now win not $10, but 10 times your total current wealth; if you roll anything else, you lose your entire wealth (including property, pensions and all possessions). Your expected profit is now far bigger — equal to 83 percent of your total current wealth. Still want to play?

It turns out that most people won’t take the latter bet, even though it will, on average, pay off handsomely. Why not? For most of us, putting everything on the line seems too risky. Intuitively, we understand that getting wiped out carries a brutal finality, curtailing future options and possibilities.

Economic theories generally ascribe such cautious behavior to psychology. Humans are “risk averse,” some of us more than others. But there’s a fundamental error in this way of thinking that still remains largely unappreciated — even though it casts a long and distorting shadow over everything from portfolio theory to macroeconomics and financial regulation. Economics, in following Pascal, still hasn’t faced up honestly to the problem of time.

Anyone who faces risky situations over time — and that’s essentially everyone — needs to handle those risks well, on average, over time, with one thing happening after the next. The seductive genius of the concept of probability is that it removes this history aspect, and estimates the average payoff by thinking of a single gamble alone, with two outcomes. It imagines the world splitting with specific probabilities into parallel universes, one thing happening in each. The expected value doesn’t reflect an average over time, but over possible outcomes considered outside of time …

Especially whenever downside risks get large, real outcomes averaged through time are much worse than the expected value would predict. Even in the absence of risk aversion, there can be sound mathematical reasons for being unwilling to take on gambles (or projects), despite wildly positive expected payoffs …

So what? Well, the assumption of the equality of these different averages — technically known as the assumption of “ergodicity” — is considered a given by most of contemporary economics. It makes the mathematics easier in the financial portfolio theory that influences countless investors and in frameworks for designing regulations to keep financial risks at acceptable levels. Unfortunately, this error systematically underestimates prevailing risks.

It also may encourage overly optimistic ideas about the ability of an economy to recover from a crisis. For example, those who support policies of fiscal austerity believe that companies, in seeking to maximize their profits, will naturally drive an economy back to steady growth. The economy will spring back if companies and individuals have confidence that their investments will pay off. If that’s the case, why aren’t businesses investing globally when interest rates are at historic lows? What’s holding them back?

The fairly obvious answer is serious downside risk, which makes the reticence entirely sensible — if you live in the real world where time matters.

Mark Buchanan
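
To make Buchanan's point concrete, here is a minimal simulation of the scaled-up gamble (my sketch, not from the article). The ensemble average of wealth grows exactly as expected-value reasoning promises, while essentially every individual history ends in ruin:

```python
import numpy as np

rng = np.random.default_rng(42)
n_players, n_rounds = 1_000_000, 5
wealth = np.ones(n_players)                    # everyone starts with wealth 1

for _ in range(n_rounds):
    six = rng.integers(1, 7, size=n_players) == 6
    # A six wins 10x current wealth (wealth -> 11w); anything else loses it all.
    wealth = np.where(six, 11.0 * wealth, 0.0)

print("ensemble mean :", wealth.mean())        # ~ (11/6)^5 ~ 20.7 -- looks great
print("median player :", np.median(wealth))    # 0.0 -- almost everyone is ruined
print("still solvent :", (wealth > 0).mean())  # ~ (1/6)^5 ~ 0.013%
```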

Modern economics – to only see what you believe

11 July, 2013 at 10:45 | Posted in Economics | Comments Off on Modern economics – to only see what you believe

 

Who benefits from QE?

11 July, 2013 at 09:58 | Posted in Economics | Comments Off on Who benefits from QE?

Quantitative easing describes a policy in which the central bank buys assets from the financial sector, and does not necessarily – as in conventional monetary policy – confine such activities to the management of the government’s own debt …

Amid all that has been written on this subject, there is very little about how it is supposed to work, or whether it has in fact worked …

In the modern financial economy, the main effect of QE is to boost asset prices, as market gyrations of recent weeks have clearly illustrated. But is the pursuit of higher asset prices an effective or desirable means of promoting economic growth? The distributional impact of the policy demands attention; the one certain consequence of boosting asset prices is that those with assets benefit relative to those without. Many people own houses – but, although in the UK, for example, we need more houses, we do not need another housing boom. The public also holds financial assets indirectly, largely through pension funds. But here there has been a paradoxical effect: because of the way pension funds are valued, QE has generally increased pension funds’ liabilities more than their assets …

Why has so much attention been given to these monetary policies with no clear explanation of how they might be expected to work and little evidence of effectiveness? The very phrase “quantitative easing” seems designed to discourage non-technical discussion. But the real answer, I fear, is all too familiar: these policies may not benefit the non-financial economy much, but they are helpful to the financial services sector and those who work in it.

John Kay
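
A toy illustration of Kay's pension-fund point (my sketch, with made-up numbers): because liabilities typically lie further in the future than the fund's assets, the same fall in yields raises the present value of the liabilities by more than it raises the assets, so the measured deficit widens.

```python
def pv(cashflow: float, years: float, rate: float) -> float:
    """Present value of a single cashflow discounted at a flat annual rate."""
    return cashflow / (1.0 + rate) ** years

# Stylised fund: one asset cashflow due in 10 years, one pension
# liability due in 25 years (hypothetical numbers).
asset_cf, liability_cf = 100.0, 250.0

for rate in (0.04, 0.03):  # suppose QE pushes the discount rate from 4% to 3%
    a = pv(asset_cf, 10, rate)
    liab = pv(liability_cf, 25, rate)
    print(f"rate {rate:.0%}: assets {a:6.1f}, liabilities {liab:6.1f}, "
          f"deficit {liab - a:6.1f}")
```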

How to teach economics

9 July, 2013 at 23:54 | Posted in Economics | Comments Off on How to teach economics

How is economics generally taught? In both macro and micro, most of the time we teach the formalisation. This is understandable (it is what has advanced the discipline and made it science-like) and to a degree appropriate (understanding models is difficult). However, there is a real danger that teaching this stuff crowds out all else. I used not to be concerned about this for macro, because I saw the discipline as inherently progressive, where the data would naturally push advances in the right direction. (My excuse for believing this in part comes from my background in building structural econometric models, where the data really did do that.)

If that is your view, you are likely to be a little dismissive about things like economic history, economic methodology or the history of economic thought. After all, most scientists do not worry too much about these things in their own discipline, and economics tries to be like a science. Even if we take a more realistic view, and think that economists are more like doctors (who fail to understand quite a lot), doctors do not spend too much time thinking about things like the methodology of medicine.

I changed my view in the last few years as a result of both the financial crisis and the subsequent domination of austerity policies. Teaching just what can be currently formalised in what now passes as a rigorous manner excludes too much of what is important. Of course we (hopefully) tell students that there are gaps in what economists can do this way, but perhaps these gaps need to be given a little more space than footnotes. There is a great deal of knowledge and insight in less formal economic reasoning, insight that can too easily be dismissed. Unfortunately it is natural for future academics or policy makers to believe that what is taught in undergraduate or graduate macro is what is important, rather than what has so far been formalised, or what the demands of this particular time and context require formalising, or worse still what political or ideological forces wish to formalise.

Simon Wren-Lewis

Keynes and the atomistic fallacy

9 July, 2013 at 13:47 | Posted in Economics, Theory of Science & Methodology | Comments Off on Keynes and the atomistic fallacy

At a social level too this phenomenon of aggregating seems to be massively important … This brings me to another one of Lars Syll’s posts on Keynesian economics and probability. Syll quotes a passage from Keynes’ A Treatise on Probability that I think is worth reproducing here in full:

“The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be [that] the system of the material universe must consist of bodies … such that each of them exercises its own separate, independent, and invariable effect, a change of the total state being compounded of a number of separate changes each of which is solely due to a separate portion of the preceding state … Yet there might well be quite different laws for wholes of different degrees of complexity, and laws of connection between complexes which could not be stated in terms of laws connecting individual parts … If different wholes were subject to different laws qua wholes and not simply on account of and in proportion to the differences of their parts, knowledge of a part could not lead, it would seem, even to presumptive or probable knowledge as to its association with other parts … These considerations do not show us a way by which we can justify induction … No one supposes that a good induction can be arrived at merely by counting cases. The business of strengthening the argument chiefly consists in determining whether the alleged association is stable, when accompanying conditions are varied … In my judgment, the practical usefulness of those modes of inference … on which the boasted knowledge of modern science depends, can only exist … if the universe of phenomena does in fact present those peculiar characteristics of atomism and limited variety which appears more and more clearly as the ultimate result to which material science is tending.”

For Keynes modern science is tending toward an atomistic or fragmented view of reality and this is likely to prove dysfunctional … Keynes, on the other hand, recognised that understanding the world required more than that.

This is especially so in social science because, as we discussed earlier, people not only tend to think in terms of aggregations, but they also actively build their world into aggregates and themselves into groups and teams. To be unable to understand this tendency toward aggregation is to be unable to understand the social world.

This tendency, however, makes many forms of modern econometric technique useless. These techniques essentially seek to identify correlations between variables in the data. They do so either haphazardly or by plugging in some sort of model. Both methods are unlikely to work for complex data. The haphazard method will likely just produce dross while the idea that a simple model can capture complex changes in data that represents the movement through historical time is completely wrongheaded.

Models are just tools for thought and when doing economic analysis we must apply models loosely (if at all). We must learn — not through study, but through practice — to know when a given model or idea is useful in explaining a change in the economic headwinds. The cognitive mechanism that we use to determine this is similar to that which distinguishes between the two faces and the vase in the well-known ambiguous image. The same data, the same event can be looked at through many different lenses, and a skilled analyst must be able not only to see as many different perspectives as he or she can but also to judge which is the most suitable. This requires a great deal of flexibility and, in my opinion, separates those who are actually interested in doing economics from those who are just interested in tinkering.

Philip Pilkington

Lord Keynes on yours truly on probability and economics

9 July, 2013 at 12:40 | Posted in Statistics & Econometrics, Theory of Science & Methodology | 1 Comment

Lars P. Syll has a great series of posts on his blog on the subject of probability theory and economics, which should be required reading for anyone interested in these questions:

Lars P. Syll, “Economics and Probability,” 27 June, 2013.

Lars P. Syll, “Markov’s Inequality,” 12 June, 2013.

Lars P. Syll, “Chebyshev’s Inequality Theorem,” 12 June, 2013.

Lars P. Syll, “What is Randomness?,” 28 May, 2013.

Lars P. Syll, “On the Impossibility of Predicting the Future,” 28 May, 2013.

Lars P. Syll, “Probabilities – From Where do we get Them?,” 8 May, 2013.

Lars P. Syll, “Modern Econometrics – A Critical Realist Critique (wonkish),” 7 May, 2013.

Lars P. Syll, “Probabilistic Reasoning and its Limits (wonkish),” 13 April, 2013.

Lars P. Syll, “Ergodicity – Probabilistic Thinking gone Awry,” 5 April, 2013.

Lars P. Syll, “The Limits to Probabilistic Reasoning,” 19 March, 2013.

Lars P. Syll, “Ergodicity and the Law of Large Numbers (wonkish),” 5 March, 2013.

Lars P. Syll, “What’s Wrong with Bayesianism (II),” 4 March, 2013.

Lars P. Syll, “What’s Wrong with Bayesian Probability?,” 2 March, 2013.

Lars P. Syll, “Bayesian Decision Theory – A Critique,” 26 February, 2013.

Lars P. Syll, “On Bayesianism, Uncertainty and Consistency in ‘Large Worlds’,” 25 February, 2013.

Lars P. Syll, “On Probabilism and Statistics,” 8 February, 2013.

Lars P. Syll, “On the Non-Equivalence of Keynesian and Knightian Uncertainty (wonkish),” 5 February, 2013.

Lars P. Syll, “Keynes on Statistics and Evidential Weight,” 25 January, 2013.

Lars P. Syll, “How do we Attach Probabilities to the World?,” 18 January, 2013.

Lars P. Syll, “Probability and Economics,” 17 January, 2013.

Lars P. Syll, “Markov’s Inequality,” 18 December, 2012.

Lars P. Syll, “The Arrow of Time and the Importance of Time Averages and Non-Ergodicity (wonkish),” 31 October, 2012.

Lars P. Syll, “Keynes vs. Bayes on Information and Uncertainty (wonkish),” 20 October, 2012.

Lars P. Syll, “Did Frank Ramsey really make Keynes change his View on Probability? (wonkish),” 12 September, 2012.

Lars P. Syll, “von Wright’s critique of Ramsey’s Bayesianism (wonkish),” 31 August, 2012.

Lars P. Syll, “Keynes and Knight on Uncertainty – Ontology vs. Epistemology,” 29 July, 2012.

Lars P. Syll, “Dutch Books, Money Pumps and Bayesianism,” 25 June, 2012.

Lars P. Syll, “One of the Reasons I’m a Keynesian and not a Bayesian,” 12 June, 2012.

Lars P. Syll, “Statistical Models and Causal Inference,” 11 June, 2012.

Lars P. Syll, “Bayesian Probability – A Primer,” 10 June, 2012.

Lars P. Syll, “Randomness, Fat Tails and Ergodicity – a Keynesian Perspective on Knightian Uncertainty,” 9 June, 2012.

Lars P. Syll, “So you Think you’re Rational? I bet you’re not!,” 15 May, 2012.

Lars P. Syll, “Risk and Uncertainty,” 8 May, 2012.

Lars P. Syll, “Keynes and Bayes in Paradise,” 5 May, 2012.

Lars P. Syll, “Probabilistic Econometrics – Science without Foundations (part I),” 21 February, 2012.

Great thanks to Lord Keynes – writing on one of my favourite blogs – who made this marvellous and handy compilation of some of my posts concerning probability and economics!

Neoclassical economics – empty and uninformative storytelling

7 July, 2013 at 15:33 | Posted in Economics, Theory of Science & Methodology | 1 Comment

In everyday situations, if, in answer to an inquiry about the weather forecast, one is told that the weather will remain the same as long as it does not change, then one does not normally go away with the impression of having been particularly well informed, although it cannot be denied that the answer refers to an interesting aspect of reality, and, beyond that, it is undoubtedly true …

We are not normally interested merely in the truth of a statement, nor merely in its relation to reality; we are fundamentally interested in what it says, that is, in the information that it contains …

Information can only be obtained by limiting logical possibilities; and this in principle entails the risk that the respective statement may be exposed as false. It is even possible to say that the risk of failure increases with the informational content, so that precisely those statements that are in some respects most interesting, the nomological statements of the theoretical hard sciences, are most subject to this risk. The certainty of statements is best obtained at the cost of informational content, for only an absolutely empty and thus uninformative statement can achieve the maximal logical probability …

The neoclassical style of thought – with its emphasis on thought experiments, reflection on the basis of illustrative examples and logically possible extreme cases, its use of model construction as the basis of plausible assumptions, as well as its tendency to decrease the level of abstraction, and similar procedures – appears to have had such a strong influence on economic methodology that even theoreticians who strongly value experience can only free themselves from this methodology with difficulty …

Science progresses through the gradual elimination of errors from a large offering of rivalling ideas, the truth of which no one can know from the outset. The question of which of the many theoretical schemes will finally prove to be especially productive and will be maintained after empirical investigation cannot be decided a priori. Yet to be useful at all, it is necessary that they are initially formulated so as to be subject to the risk of being revealed as errors. Thus one cannot attempt to preserve them from failure at every price. A theory is scientifically relevant first of all because of its possible explanatory power, its performance, which is coupled with its informational content …

The connections sketched out above are part of the general logic of the sciences and can thus be applied to the social sciences. Above all, with their help, it appears to be possible to illuminate a methodological peculiarity of neoclassical thought in economics, which probably stands in a certain relation to the isolation from sociological and social-psychological knowledge that has been cultivated in this discipline for some time: the model Platonism of pure economics, which comes to expression in attempts to immunize economic statements and sets of statements (models) from experience through the application of conventionalist strategies …

Clearly, it is possible to interpret the ‘presuppositions’ of a theoretical system … not as hypotheses, but simply as limitations to the area of application of the system in question. Since a relationship to reality is usually ensured by the language used in economic statements, in this case the impression is generated that a content-laden statement about reality is being made, although the system is fully immunized and thus without content. In my view that is often a source of self-deception in pure economic thought …

A further possibility for immunizing theories consists in simply leaving open the area of application of the constructed model so that it is impossible to refute it with counter examples. This of course is usually done without a complete knowledge of the fatal consequences of such methodological strategies for the usefulness of the theoretical conception in question, but with the view that this is a characteristic of especially highly developed economic procedures: the thinking in models, which, however, among those theoreticians who cultivate neoclassical thought, in essence amounts to a new form of Platonism.

Hans Albert

The fundamental problem of modern economics

7 July, 2013 at 13:31 | Posted in Economics, Theory of Science & Methodology | Comments Off on The fundamental problem of modern economics

The fundamental problem of modern economics is that methods are repeatedly applied in conditions for which they are not appropriate … Specifically, modern academic economics is dominated by a mainstream tradition whose defining characteristic is an insistence that certain methods of mathematical modelling be more or less always employed in the analysis of economic phenomena, and are so in conditions for which they are not suitable.
 
 
Fundamental to my argument is an assessment that the application of mathematics involves more than merely the introduction of a formal language. Of relevance here is recognition that mathematical methods and techniques are essentially tools. And as with any other tools (pencils, hammers, drills, scissors), so the sorts of mathematical methods which economists wield (functional relations, forms of calculus, etc.) are useful under some sets of conditions and not others.

The specific conditions required for the sorts of mathematical methods that economists continually wield to be generally applicable, I have shown, are a ubiquity of (deterministic or stochastic) closed systems. A closed system is simply one in which an event regularity occurs. Notice that these closures are as much presupposed or required by the ‘newer’ approaches to mathematical economics, those often referred to as non-linear modelling, complexity modelling, agent-based modelling, model simulations, and so on (including those developed under the head of behavioural or neuro- economics), as they are by the more traditional forms of micro, macro and econometric modelling.

The most obvious scenario in which a prevalence of such closures would be expected is a world 1) populated by sets of atomistic individuals or entities (an atom here being an entity that exercises its own separate, independent, and invariable effect, whatever the context); where 2) the atoms of interest exist in relative isolation (so allowing the effects of the atoms of interest to be deducible/predictable by barring the effects of potentially interfering factors). Not surprisingly the latter two (ontological) presuppositions are easily shown to be implicit in almost all contemporary economic modelling contributions …

However, explicit, systemic and sustained (ontological) analysis of the nature of social reality reveals the social domain not to be everywhere composed of closed systems of sets of isolated atoms. Rather social reality is found to be an open, structured realm of emergent phenomena that, amongst other things, are processual (being constantly reproduced and transformed through the human practices on which they depend), highly internally related (meaning constituted through [and not merely linked by] their relations with each other – e.g., employer/employee or teacher/student relations), value-laden and meaningful, amongst much else …

Clearly if social phenomena are highly internally related they do not each exist in isolation. And if they are processual in nature, being continually transformed through practice, they are not atomistic. So the emphasis on the sorts of mathematical modelling methods that economists employ necessarily entails the construction of economic narratives – including the sorts of axioms and assumptions made and hypotheses entertained – that, at best, are always but highly distorted accounts of the complex phenomena of the real open social system … It is thus not at all surprising that mainstream contributions are found continually to be so unrealistic and explanatorily limited.

Employing the term deductivism to denote the thesis that closed systems are essential to social scientific explanation (whether the event regularities, correlations, uniformities, laws, etc., are a priori constructions or a posteriori observations), I conclude that the fundamental source of the discipline’s numerous, widespread and long-lived problems and failings is precisely the emphasis placed upon forms of mathematical deductivist reasoning.

Tony Lawson

Axiomatic variation as immunization against critique

7 July, 2013 at 13:02 | Posted in Economics, Theory of Science & Methodology | Comments Off on Axiomatic variation as immunization against critique

Another feature of the standard treatment of economic models is that their assumptions might be modified and changed. Generally, there is no reason to reject such a treatment, but, on the contrary, this procedure is quite common in many fields of scientific research. Usually, scientists vary the utilized auxiliary hypotheses in order to apply a specific set of general laws to a new domain.

In neoclassical economics, however, any assumption may be changed even if it has attained the status of an explanatory assumption in a given context. This is a much more controversial procedure, which is fairly uncommon in other fields of inquiry. Especially for the domain of the natural sciences, where the underlying laws are often assumed to be ‘eternal’ in human terms, this is quite extraordinary. For the social sciences, we find that different regularities and mechanisms may hold in different times or may apply to varying geographical areas. However, from an epistemological point of view, different explanatory statements are either competing explanations or they apply to different spheres in the continuum of time and space and, thus, can be separated with respect to their domains of application by the introduction of suitable auxiliary hypotheses. In sum, the practice of neoclassical economics to freely modify even those assumptions that are considered to be explanatory statements seems quite unique within empirical science …

In this context, the theory of asymmetric information provides an informative case. It supposes that market participants have varying degrees of information, and thereby provides an alternative assumption non-A to the contested hypothesis of fully informed individuals. By adding some assumptions on the distribution of information among market participants, where, in the standard case, consumers are assumed to have less knowledge than producers, this modification can be reapplied to the analysis of markets as a complement to the theory of perfect competition. So, if markets in a given case seem to work well and deliver stable outcomes, the model of perfect competition, based on fully informed individuals, serves as a reference point for corroborating ‘neoclassical theory’. But if markets fail to work well, it is supposed to be a case of asymmetric information – and, thus, just another instance of corroboration for neoclassical theory. Of course, a theory which is able to explain a result E as well as the corresponding result non-E surely seems powerful. By offering alternative explanations (non-E) within the same framework, the practice of axiomatic variation makes it easier to ignore possible alternative explanations of market behaviour …

Neoclassical economics is manifoldly rich in examples for axiomatic variation. The ever-changing definitions of preference structures to adapt the economic approach for an ever-growing number of topics is considered as everyone’s favourite example … But not only preference sets but also the nature of rationality as such is fluctuating constantly across models … Another interesting aspect of axiomatic variation is that it allows for a partial incorporation of assumptions and results from competing theories. A case in point is provided by the development of Keynesian theory, where some of Keynes’ insights have been integrated into neoclassical theory via several steps (e.g. Hicks, 1937; Modigliani, 1944), which served as a reference point to argue that neoclassical economics contains Keynesian arguments as special cases. A key aim of more neoclassically inspired macroeconomists was to reproduce the Keynesian result of involuntary unemployment by introducing frictions on labour markets. While Keynes established a connection between employment and effective demand to reach the same conclusion, neoclassical economists replaced some assumption A in their model (‘all markets work frictionless’) with another assumption non-A (‘all markets work frictionless, except for the labour market’) to reach the desired result T′ (involuntary unemployment) without adopting a Keynesian viewpoint …

In sum, axiomatic variation serves as a powerful tool for immunization against critique. Although not all of the assumptions in economics are flexible to the same degree, no assumption that comes to mind is completely immune to the procedure of axiomatic variation.

Jakob Kapeller

(h/t Dwayne Woods)

DSGE models – empirically irrelevant and logically incoherent

6 July, 2013 at 17:42 | Posted in Economics, Theory of Science & Methodology | 2 Comments

Something about the way economists construct their models doesn’t sit right.

Economic models are often acknowledged to be unrealistic, and Friedmanite ‘assumptions don’t matter’-style arguments are used to justify this approach. The result is that internal mechanics aren’t really closely examined. However, when it suits them, economists are prepared to hold up internal mechanics to empirical verification – usually in order to preserve key properties and mathematical relevance. The result is that models are constructed in such a way that, instead of trying to explain how the economy works, they deliberately avoid both difficult empirical and difficult logical questions. This is particularly noticeable with the Dynamic Stochastic General Equilibrium (DSGE) models that are commonly employed in macroeconomics …

The fact is that DSGE models themselves are not “empirically relevant”. They assume that agents are optimising, that markets tend to clear, and that the economy follows an equilibrium time path. They use ‘log linearisation’, a method which doesn’t even pretend to do anything other than make the equations easier to solve by forcibly eliminating the possibility of multiple equilibria. On top of this, they generally display poor empirical corroboration. Overall, the DSGE approach is structured toward preserving the use of microfoundations, while at the same time invoking various – often unrealistic – processes in order to generate something resembling dynamic behaviour.

Economists tacitly acknowledge this, as they will usually say that they use this type of model to highlight one or two key mechanics, rather than to attempt to build a comprehensive model of the economy. Ask an economist if people really maximise utility; if the economy is in equilibrium; if markets clear, and they will likely answer “no, but it’s a simplification, designed to highlight problem x”. Yet when questioned about some of the more surreal logical consequences of all of the ‘simplifications’ made, economists will appeal to the real world. This is not a coherent perspective.

Neoclassical economics uses an ‘axiomatic deductive‘ approach, attempting to logically deduce theories from basic axioms about individual choice under scarcity. Economists have a stock of reasons to do this: it is ‘rigorous’; it bases models on policy invariant parameters; it incorporates the fact that the economy ultimately consists of agents consciously making decisions, etc. If you were to suggest internal mechanics based on simple empirical observations, conventional macroeconomists would likely reject your approach.

Modern DSGE models are constructed using these types of axioms … This allows macroeconomists to draw clear mathematical implications from their models, while the assumptions are justified on the grounds of empiricism … Yet the model as a whole has very little to do with empiricism, and economists rarely claim otherwise. What we end up with is a clearly unrealistic model, constructed not in the name of empirical relevance or logical consistency, but in the name of preserving key conclusions and mathematical tractability …

A consequence of this methodological ‘dance’ is that it can be difficult to draw conclusions about which DSGE models are potentially sound. One example of this came from the blogosphere, via Noah Smith. Though Noah has previously criticised DSGE models, he recently noted – approvingly – that there exists a DSGE model that is quite consistent with the behaviour of key economic variables during the financial crisis. This increased my respect for DSGE somewhat, but my immediate conclusion still wasn’t “great! That model is my new mainstay”. After all, so many DSGE models exist that it’s highly probable that some simplistic curve fitting would make one seem plausible. Instead, I was concerned with what’s going on under the bonnet of the model – is it representative of the actual behaviour of the economy?

Sadly, the answer is no. Said DSGE model includes many unrealistic mechanics: most of the key behaviour appears to be determined by exogenous ‘shocks’ to risk, investment, productivity etc. without any explanation. This includes the oft-mocked ‘Calvo fairy’, which imitates sticky prices by assigning a probability to firms randomly changing their prices at any given point. Presumably, this behaviour is justified on the grounds that all models are unrealistic in one way or another. But if we have constructed the model to avoid key problems … how can we justify using something as blatantly unrealistic as the Calvo fairy? Either we shed a harsh light on all internal mechanics, or on none …
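
For readers who have not met the ‘Calvo fairy’, a minimal sketch of the mechanism (my illustration, not Unlearning Economics'): each period a firm may reset its price only with a fixed probability, independent of how misaligned the price has become, so the spells between resets are geometrically distributed with mean 1/(1 − θ).

```python
import numpy as np

rng = np.random.default_rng(7)
theta = 0.75            # probability of being stuck with the old price each period
reset_prob = 1 - theta  # the "Calvo fairy" taps the firm with probability 0.25

# Spell lengths between price resets are geometric, with mean
# 1 / (1 - theta) = 4 periods (e.g. quarters).
spells = rng.geometric(reset_prob, size=1_000_000)
print("mean price spell:", spells.mean(), "periods (theory: 4.0)")
```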

I am aware that DSGE and macro are only a small part of economics, and many economists agree that DSGE – at least in its current form – is yielding no fruit (although these same economists may still be hostile to outside criticism). Nevertheless, I wonder if this problem extends to other areas of economics, as economists can sometimes seem less concerned with explaining economic phenomena than with utilising their preferred approach. I believe internal mechanics are important, and if economists agree, they should expose every aspect of their theories to empirical verification, rather than merely those areas which will protect their core conclusions.

Unlearning Economics

To me this confirms what I have been arguing for years now – neoclassical economic theory is in the story-telling business.

Neoclassical economics has long since given up on the real world and contents itself with proving things about thought-up worlds. Empirical evidence only plays a minor role in economic theory, where models largely function as a substitute for empirical evidence. But “facts kick”, as Gunnar Myrdal used to say. Hopefully humbled by the manifest failure of its theoretical pretences, the one-sided, almost religious, insistence on axiomatic-deductivist modeling as the only scientific activity worthy of pursuing in economics will give way to methodological pluralism based on ontological considerations rather than formalistic tractability.

Modern macroeconomics builds on the myth of us knowing the “data-generating process” and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

In the end this is what it all boils down to. We all know that many activities, relations, processes and events are genuinely uncertain. The data do not unequivocally single out one decision as the only “rational” one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

Some macroeconomists, however, still want to be able to use their hammer. So they decide to pretend that the world looks like a nail, and pretend that uncertainty can be reduced to risk. So they construct their mathematical models on that assumption.

If macroeconomic models – no matter of what ilk – build on microfoundational assumptions of representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, there is obviously no warrant for bridging conclusions or hypotheses about causally relevant mechanisms or regularities from model to world. Incompatibility between actual behaviour and the behaviour in macroeconomic models built on representative actors and rational-expectations microfoundations is not a symptom of “irrationality”. It rather shows the futility of trying to represent real-world target systems with models flagrantly at odds with reality.

A gadget is just a gadget – and brilliantly silly DSGE models do not help us work with the fundamental issues of modern economies.

The Riksbank's lousy forecasts

6 July, 2013 at 11:17 | Posted in Economics, Statistics & Econometrics | 1 Comment

Economists often claim – usually with reference to Milton Friedman and his methodological instrumentalism – that it does not matter much whether the assumptions their models build on are realistic or not. What matters is whether the predictions/forecasts the models produce turn out to be right or not.

If that is really so, the only conclusion we can draw is that these models and forecasts belong in the waste-paper basket. Because my, how wrong they have been!

Since 2007 the Riksbank has used so-called repo-rate paths as a forecasting tool. Their impact has been considerable – even though it is easy to see that the forecasts have consistently missed by a wide margin. Like most other forecasters, the Riksbank missed both the financial crisis and its grave effects. But over the last three or four years as well, the Riksbank's interest-rate forecasts have proved to be lousy:

SvD Näringsliv has gone through all 36 published rate paths. Almost all follow the same pattern: at the end of the forecast horizon the rate rises. The early rate paths ended with a rate above 4 percent, the latest at about 2.5 percent. None of the rate paths has so far proved right at the three-year horizon. In the summer of 2010 the forecast was that the rate today would be around 3.75 percent; in reality it is 1 percent …

Irma Rosenberg sat on the Riksbank's executive board between 2003 and 2008 and was in charge of preparing monetary policy when the rate path was introduced.

– The rate path lets the Riksbank reason more about the future. About what may happen and how the rate would be affected.

How is the Riksbank's credibility affected when the rate paths turn out wrong?

– Whenever you make a forecast you risk being wrong. The only thing we know is that at some point we will return to a more normal situation. Then rates will be higher than they are now.

SvD

This spring yours truly invited the fund manager Alf Riple – formerly chief analyst at Nordea and adviser at the Norwegian Ministry of Finance – to lecture on my course Finanskriser – orsaker, förlopp och konsekvenser (Financial Crises – Causes, Course and Consequences). It turned out that Alf is not only a fine writer but also an exceptionally competent and engaging lecturer. An unbeatable combination. And according to Alf, there is no reason to be particularly surprised by the Riksbank's lousy forecasts:

Which is worse, a bad forecast or no forecast? The answer is simple. The moment you are exposed to a forecast, you are in a worse position than you were before …

Expert forecasts in all likelihood do more harm than good. That is why it pays to flip quickly past newspaper articles with headlines like ‘How the stock market will perform this year’ …

Imagine that your job is to handle your company's currency exchanges … You must decide either to lock in the exchange rate right away, or to wait until the money arrives and exchange at whatever rate prevails then … Luckily, you have the analysts' dollar forecasts to help you. They do not make it one bit easier to predict the dollar rate. But they can help you all the same.

If you happen to get it right, the analyses do not matter much. But if the dollar drops like a stone and you have chosen not to hedge the exchange rate, company management will want to know why you have squandered the company's money … You can spin a long tale about historical currency trends, economic growth, the balance of payments and interest-rate differentials. In the end everyone will agree that you acted correctly given the information you had at the time.

The analyses let you off the hook. Especially the ones that were most wrong … The forecasts have no economic value, either for the company or for society. Their value is that they save your skin.

In other words – these arrogantly self-assured economists, with their ‘rigorous’ and ‘precise’ mathematical-statistical-econometric models, get it wrong time after time. And it is the rest of us who have to pay for it!

The Riksbank finally starts taking Swedish household debt seriously

5 July, 2013 at 14:48 | Posted in Economics, Politics & Society | 1 Comment

The Riksbank announced today that it wants the banks to hand over data on their customers' debts at the individual level. The aim is to obtain more detailed knowledge of household indebtedness than Finansinspektionen's sample surveys (which only cover the banks' new lending) have previously been able to provide.

Yours truly thinks this is entirely excellent.

Despite all the warning signs, Swedish households seem to take an ever rosier view of the housing market. SEB's housing price indicator – the difference between the share of households expecting rising prices and the share expecting falling prices – now stands at its highest level since 2011.

Despite the financial crisis and the worries over the euro, this will probably mean that Swedes keep borrowing more and more to buy homes – further swelling a volume of debt that has grown by over 30% in the last three years alone.

Every level-headed observer realises that this is a problem that must be solved before the housing bubble bursts. Otherwise there is a high risk that today's laissez-faire policy will come back to haunt us – and then it is the unemployed, the homeless and the indebted who will take the blows.

Household indebtedness is rooted mainly in the rise in asset values driven by increased lending to households and the housing bubble this has produced. In the long run this trend obviously cannot be sustained. Asset prices fundamentally reflect expectations about the future return on investments. If asset prices keep rising faster than incomes, the effect will be higher inflation, with an attendant downward adjustment of the real value of the assets.

With the debt ratio that households have now taken on, we risk a debt-deflation crisis that would hit Swedish households extremely hard.

It is deeply worrying that Swedish households are prepared to take on loans as large as they do today. It is high time the almost exponential growth of household debt was slowed. Otherwise the dream of a home of one's own may very well turn into a nightmare.

In 1637 the price of a single tulip bulb in the Netherlands could be so high that it equalled two years' wages. And households were convinced that prices would just keep rising and rising. Like every other bubble, however, this one – the ‘tulip mania’ – also burst, leaving crowds of destitute and ruined people in its wake. Similar episodes played out in, for example, the Mississippi bubble of 1720 and the IT bubble a decade ago. How hard can it be to learn from history?

It is frightening, to say the least, that the housing bubble just keeps growing. When it finally bursts – and burst it will – the crisis will be all the worse.

Parallel universe and time in finance and economics (wonkish)

5 July, 2013 at 10:47 | Posted in Theory of Science & Methodology | 1 Comment

The topic is time — literally — and how to think about it …

Much of the recent research on this topic has been done by Ole Peters of the London Mathematical Laboratory and Santa Fe Institute. The gist of his overall argument is that the usual ensemble averages used to compute “expected” returns in finance and economics are, in many cases, simply wrong. This is not the correct way to make decisions in the real world. We’re all used to the idea of averaging over possible outcomes, weighted by the appropriate probabilities, and asking: is the result positive or negative? If positive, then the gamble or investment is worth it, if we can accept the risk. But this is not a legitimate way of thinking, for it averages over parallel worlds, not through time — where we actually live.

The trouble is that the usual average over different outcomes mixes potential worlds in which we go broke with others in which we get rich, and does this mixing all at once, immediately, so that good outcomes coming from one path cancel out bad outcomes coming from other paths. Importantly, this mixing takes the often irreversible consequences of bad outcomes (bankruptcy, for example) out of the picture. If you make hugely risky investments, this average gives you full credit for all the wonderful possible outcomes, weighted appropriately for their likelihood, which of course seems sensible. What it doesn’t do is account for the very real fact that the bad outcomes may effectively wipe you out entirely and take you out of the game, making it impossible to play again — in which case you will never get to experience those eventual big payoffs …
You can do the ensemble average and say, “Hey, I expect to get exponential growth! Let’s go for it.” Then you actually play and find you get wiped out. Try again and still get wiped out. See that everyone who plays the same game also gets wiped out. Strange. You may eventually accept that thinking in terms of parallel worlds isn’t quite the right thing to do.

Mark Buchanan
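
Buchanan's point is easy to check numerically. Below is a minimal simulation sketch (the 1.5×/0.6× coin-flip gamble is Peters's standard illustrative example, not taken from the post itself): the ensemble average grows by 5 % per round, yet the time-average growth factor is √(1.5 × 0.6) ≈ 0.95, so the typical player ends up poorer than she started.

```python
import numpy as np

rng = np.random.default_rng(42)
n_players, n_rounds = 100_000, 50

# Each round multiplies wealth by 1.5 (heads) or 0.6 (tails), with equal probability.
# Ensemble average per round: 0.5 * 1.5 + 0.5 * 0.6 = 1.05   ("expected" 5% growth)
# Time average per round:     (1.5 * 0.6) ** 0.5  ~= 0.95    (typical ~5% decay)
factors = rng.choice([1.5, 0.6], size=(n_players, n_rounds))
wealth = factors.prod(axis=1)  # every player starts with wealth 1

print(f"theoretical ensemble mean: {1.05 ** n_rounds:.1f}")    # ~11.5
print(f"simulated mean wealth:     {wealth.mean():.1f}")       # same ballpark, very noisy
print(f"simulated median wealth:   {np.median(wealth):.3f}")   # ~0.07: the typical player loses
print(f"fraction ending below 1:   {(wealth < 1).mean():.0%}") # roughly three players in four
```

The mean is propped up by a handful of rare lucky paths that no single player should expect to live through — exactly the parallel-worlds mixing Buchanan describes.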

Why mainstream economic models are unreliable

4 July, 2013 at 12:02 | Posted in Economics | Comments Off on Why mainstream economic models are unreliable

Economics has long had the ambition to become an “exact science”. Indeed, Walras, usually recognised as the father of modern economic theory, said in his Lettre no. 1454 to Hermann Laurent in Jaffe (1965):

“All these results are marvels of the simple application of the language of mathematics to the quantitative notion of need or utility. Refine this application as much as you will but you can be sure that the economic laws that result from it are just as rational, just as precise and just as incontrovertible as were the laws of astronomy at the end of the 17th century.”

Furthermore, his successors openly declared themselves as having the same goal. However, two things raise doubts as to whether the pursuit of this ambition has achieved meaningful results.

First, as in any science, models have to be built on assumptions, and it is standard procedure to develop those assumptions on the basis of a careful analysis of the observed empirical facts. This inductive approach, however, is not the one prevailing in economics, where widespread assumptions are based on the introspection of economists. This has been acknowledged by many distinguished economists, from Pareto (1916) to Hicks (1939) to Koopmans (1957), for example.

Second, and perhaps worse, the reference model in economics is one with isolated optimizing individuals. This model of “perfect competition” is considered a useful idealization, and features such as the aggregate effects of the direct interaction between individuals are thought of as inconvenient “imperfections”. However, deviations between economic theory and reality may be of crucial importance in practice, and the consideration of the links between individuals and institutions cannot be written off as being of little relevance to the behaviour of the system as a whole. This is a lesson that is clear to all those who are familiar with the analysis of complex systems. Given the systemic impact of certain financial instruments (such as large leverage effects, the market for credit default swaps, etc.), it would seem unreasonable to put too much trust in conventional economic models, in which the structure of the interactions between the participants in the system is not included in the underlying assumptions.

Alan Kirman & Dirk Helbing
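
One way to make the last point concrete is a recruitment model in the spirit of Kirman's own work on herding (the implementation and parameter values below are my own toy choices, not taken from the quoted paper). Agents hold one of two opinions; occasionally one switches spontaneously, but mostly an agent is converted by whomever it happens to meet. Interaction alone then makes the aggregate swing persistently between extremes, even though no fundamentals ever change — behaviour a model of isolated optimizing individuals cannot produce.

```python
import numpy as np

rng = np.random.default_rng(1)

def recruitment_model(n_agents=50, n_steps=200_000, eps=0.002, conv=0.8):
    """Kirman-style herding sketch. Each step one randomly chosen agent
    either switches opinion spontaneously (prob eps) or meets a random
    other agent and adopts that agent's opinion (prob conv). Returns the
    time series of the share holding opinion 1."""
    state = rng.integers(0, 2, n_agents)
    shares = np.empty(n_steps)
    for t in range(n_steps):
        i = rng.integers(n_agents)
        if rng.random() < eps:
            state[i] = 1 - state[i]        # idiosyncratic switch
        elif rng.random() < conv:
            j = rng.integers(n_agents)     # meet a random other agent ...
            state[i] = state[j]            # ... and get recruited
        shares[t] = state.mean()
    return shares

shares = recruitment_model()
# With recruitment strong relative to spontaneous switching, the aggregate
# share does not settle near 0.5: it lingers near one extreme, then swings
# to the other, typically spending most of the time near 0 or 1.
extreme = ((shares > 0.8) | (shares < 0.2)).mean()
print(f"share of time spent above 0.8 or below 0.2: {extreme:.0%}")
```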

Social causation

3 July, 2013 at 10:43 | Posted in Theory of Science & Methodology | Comments Off on Social causation

The idea of social causation is a difficult one, as we dig more deeply into it. What does it mean to say that “poor education causes increased risk of delinquency” or “population growth causes technology change” or “the existence of paramilitary organizations contributed to the rise of German fascism”? What sorts of things can function as “social causes” — events, structures, actions, forces, other? What social interactions extend over time in the social world to establish the links between cause and effect? What kinds of evidence are available to support the claim that “social factor X causes a change in social factor Y”?

Helen Beebee, Christopher Hitchcock, and Peter Menzies’ The Oxford Handbook of Causation is a valuable resource on topics involving the philosophy of causation, and several of the contributions are immediately relevant to current debates within the philosophy of social science.

Harold Kincaid considers a number of the hard questions about social causation in his contribution to the Handbook, “Causation in the Social Sciences”. Perhaps most relevant … is his defense of non-reductionist claims about social causation. It is often maintained (by methodological individualists) that causal relations exist only among individuals, not among higher-level social entities or structures … Kincaid rejects this view and affirms the legitimacy of macro- or meso-level causal assertions.

“When a particular corporation acts in a market, it has causal influence. The influence of that specific entity is realized by the actions of the individuals composing it just as the influence of the baseball on the breaking window is realized by the sum of particles composing it. The social level causal claims pick out real causal patterns as types that may not be captured by individual kinds because multiple realizability is real.”

These arguments are a valuable antidote to the tendency towards reductionism to the level of individual activity that has often guided philosophers when they have considered the nature of social causation.

Phil Dowe’s discussion of causal process theories is useful for the social sciences (“Causal Process Theories”). It is hard to think of the social world as an amalgam of discrete events; it is easier to think of a variety of processes unfolding, subject to a range of forces and obstacles …

The language of causal processes seems to fit the nature of social causation better than that of events and systems of billiard balls. And we have the makings of a metaphysics of process available in the social sciences, in the form of a stream of actions and reactions of individuals aggregating to recognizable social patterns. So when we say that “population increase stimulates technology innovation”, we can picture the swarming series of interactions, demands, and opportunities that flows from greater population density, to rewards for innovation, to a more rapid rate of innovation.

Understanding Society

(h/t Tom Hickey)

Swedish housing bubble

2 July, 2013 at 11:16 | Posted in Economics, Politics & Society | 8 Comments

One of Stockholm’s most popular attractions is a guided tour of the Soedermalm district streets featured in Stieg Larsson’s bestselling book “The Girl With The Dragon Tattoo.” Buying a home in the former working-class neighborhood is far less affordable.

A one-bedroom, 55-square-meter (592-square-foot) apartment in Hoegalidsgatan, in the neighborhood where Larsson’s troubled heroine Lisbeth Salander grew up, sold last month for 3.75 million kronor ($569,000), 17 percent above the listing price, after a bidding war involving nine parties.

That level of demand is typical in the Swedish capital, where a shortage of construction, a population boom and mortgage rates below 3 percent have pushed prices in central Stockholm up 35 percent since early 2009. Borrowing for home purchases has in turn fueled record household debt across the country. That’s sparking concern among policy makers over potential damage to the economy and preventing the central bank from cutting rates, even as Sweden’s exporters say action must be taken to weaken the currency to protect thousands of jobs.

“If you combine low home construction with good access to financing, prices will rise,” Riksbank Governor Stefan Ingves said in an interview in Stockholm on May 27. Sweden needs to “in one way or another increase the supply of homes since we’ve had very low home construction for a very long time,” said Ingves, who’s warned about the risks of rising household debt.

With the three-month home loan rate at Sweden’s largest mortgage lender Swedbank AB (SWEDA) at 2.94 percent, its lowest level since November 2010, apartment prices across the country have jumped 11 percent in the past 12 months …

That’s left the Riksbank struggling to contain the growth in debt, which has reached a record average of 174 percent of disposable incomes. The International Monetary Fund last month recommended Sweden increase mortgage restrictions to prevent a housing bubble and stop consumer debt from spiraling out of control. At the current pace of amortization, Swedish households will need 140 years on average to repay their home loans …

A government report published in December last year concluded that Swedish law, including regulation on rent levels that sets similar rents on apartments of similar models with little consideration to their geographical location, has contributed to the lack of rental housing in the country.

“The market conditions that lead to uncertain profitability calculations have inhibited willingness to invest in rental housing even in areas with high demand and have made other investment options more attractive,” it said. “It has only been possible to realize the land-value through conversions from rental accommodation to tenant-owned housing” …

At the Riksbank, soaring debt levels, income inequality and home prices aren’t the only concerns. Sweden’s export-reliant economy is also suffering from slumping demand in the wake of the debt crisis in Europe, to which the country ships about 70 percent of its exports. Exacerbating the problem is the krona, which has soared 26 percent against the euro since March 2009 after Sweden became a haven from Europe’s fiscal woes …

So far, the Riksbank has argued it has little choice but to keep interest rates at 1 percent, given the potential effect a cut would have on household debt and property prices.

While measures such as capping mortgages at 85 percent of a property’s value have helped slow borrowing growth from levels above 10 percent between 2004 and 2008, household lending still expanded by an annual 4.6 percent in April. The Riksbank forecasts that private debt as a percentage of disposable incomes will jump to 177 percent in early 2015 …

Swedes aren’t showing much indication they’re losing their appetite for buying property. A survey by SEB on June 10 showed that 55 percent believe house prices will rise in the coming year, while 17 percent said they expect a decline …

Bloomberg
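
A quick sanity check of the article's most arresting number: a 140-year average repayment horizon means households are amortizing well under one per cent of their mortgage principal per year (a back-of-the-envelope inversion, assuming a constant pace; the pace itself is not reported in the article):

```python
# Inverting "140 years to repay" into an implied amortization pace,
# assuming a constant pace as a share of the original principal:
years_to_repay = 140
implied_pace = 1 / years_to_repay
print(f"implied amortization: {implied_pace:.2%} of the loan per year")  # ~0.71%
```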

This interesting and worrying report confirms what yours truly wrote back in April on this blog.

The increase in house loans – and house prices – in Sweden has for many years been among the steepest in the world.

Sweden’s house-price boom started in the mid-1990s, and looking at the development of real house prices since 1986, there are reasons to be deeply worried:

[Figure: real house prices in Sweden since 1986. Source: Statistics Sweden and own calculations]

The indebtedness of the Swedish household sector has also risen to alarmingly high levels, as can be seen in the figure below (based on new data published earlier this month by Statistics Sweden, showing the development of household debt as a share of disposable income, 1990–2012):

[Figure: household debt/disposable income, 1990–2012. Source: Statistics Sweden and own calculations]

Yours truly has been trying to argue – for two years now – with “very serious people” that it’s really high time to “take away the punch bowl.” Mostly I have felt like the voice of one calling in the desert, and up until now neither the Swedish central bank nor the government has been willing to listen. Comparing the above figures with the one below (source) could perhaps give some refreshing perspective …

Why evidence doesn’t always travel – not even with RCT tickets!

1 July, 2013 at 19:54 | Posted in Theory of Science & Methodology | Comments Off on Why evidence doesn’t always travel – not even with RCT tickets!

RCTs [randomised controlled trials] are often advocated by people who do not like theory. They think our claims to theoretical knowledge are too slippery, they just do not want to trust to them. So they resist claims like mine that it takes a large and varying mix of methods and background knowledge together, including a good deal of theory, to warrant the information necessary to use success of a policy in a study as evidence for success in a new setting. They have an alternative proposal: more and more RCTs, with as much variation in circumstances as possible. I agree that more RCTs, especially across a variety of circumstances, can improve the warrant for an effectiveness prediction. They do so by supporting the assumption that the policy can play a positive causal role here in our new setting. How? That’s the rub. The argument could be by simple enumerative induction: swan 1 is white, swan 2 is white…, so all swans are white; x can play a positive causal role in situation 1, x can play a positive causal role in situation 2…, so x can play a positive causal role everywhere.

How good is that argument? For a good induction we need not only a large and varied inductive base – lots of swans from lots of places, lots of RCTs from different populations and different settings. We also need reason to believe that the observations are projectable, plus an account of the range across which they project. A feature is projectable across a population of instances if there is something about each instance in the population that ensures that they must all be the same with respect to that feature … [M]any causal connections depend on intimate, complex interactions among factors present so that no special role for the factor of interest can be prised out and projected to new situations. Thus, it cannot be taken for granted that causal connections are projectable – that what happens in one setting is what will always happen …

We may have results of some good studies. But that does not constitute strong evidence for our conclusion if we do not also have good reasons in support of the additional assumptions about causal roles that it takes to make those study results relevant generally or at least to the new settings in view.

Nancy Cartwright
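
Cartwright's worry is easy to make concrete in a simulation (entirely my own illustration, with made-up numbers, not from her text): let a treatment's effect depend on an interaction with a background factor that is common in the trial population but rare elsewhere. The RCT itself is impeccable, yet its result does not travel.

```python
import numpy as np

rng = np.random.default_rng(7)

def rct_effect(p_moderator, n=100_000):
    """Estimated average treatment effect from a randomized experiment in a
    population where a background factor M (present with prob p_moderator)
    interacts with the treatment: it helps when M = 1 and harms when M = 0.
    (Toy numbers, chosen only for illustration.)"""
    m = rng.random(n) < p_moderator                 # background factor
    treated = rng.random(n) < 0.5                   # randomized assignment
    effect_if_treated = np.where(m, +2.0, -1.0)     # treatment-by-M interaction
    outcome = rng.normal(0, 1, n) + treated * effect_if_treated
    return outcome[treated].mean() - outcome[~treated].mean()

print(f"setting A (M common, p=0.9): {rct_effect(0.9):+.2f}")  # ~ +1.70: looks great
print(f"setting B (M rare,   p=0.1): {rct_effect(0.1):+.2f}")  # ~ -0.70: backfires
```

The study in setting A is a perfectly good RCT; what fails is the unargued assumption that the causal role it identifies projects to setting B.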
