The Alliance government’s great failure

21 Mar, 2014 at 17:27 | Posted in Economics, Politics & Society | Comments Off on The Alliance government’s great failure

 

Brownian motion simulation in Excel (student stuff)

20 Mar, 2014 at 16:26 | Posted in Statistics & Econometrics | Comments Off on Brownian motion simulation in Excel (student stuff)
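The embedded clip (not reproduced here) builds the simulation step by step in Excel. As a rough sketch of the same idea outside Excel, assuming a standard Wiener process approximated by cumulated normal increments (the horizon and step count below are arbitrary choices), something like the following Python snippet generates and plots one sample path:

# Sketch: simulate a standard Brownian motion (Wiener process) path
# by cumulating independent N(0, dt) increments. Parameters are
# illustrative, not taken from the video.
import numpy as np
import matplotlib.pyplot as plt

T = 1.0      # time horizon
n = 1000     # number of steps
dt = T / n

rng = np.random.default_rng(seed=1)
increments = rng.normal(loc=0.0, scale=np.sqrt(dt), size=n)
W = np.concatenate(([0.0], np.cumsum(increments)))   # W(0) = 0

t = np.linspace(0.0, T, n + 1)
plt.plot(t, W)
plt.xlabel("t")
plt.ylabel("W(t)")
plt.title("Simulated Brownian motion path")
plt.show()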

 

Statistical power analysis (student stuff)

20 Mar, 2014 at 15:47 | Posted in Statistics & Econometrics | Comments Off on Statistical power analysis (student stuff)
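The embedded video (not reproduced here) covers the basics of power calculations. As a minimal sketch, assuming a two-sided two-sample test of means under the normal approximation, the power can be computed directly; the effect size, sample size and significance level below are illustrative only:

# Sketch: power of a two-sided two-sample test of means (normal
# approximation). effect_size is Cohen's d; n is the per-group sample size.
import numpy as np
from scipy.stats import norm

def power_two_sample(effect_size, n, alpha=0.05):
    z_crit = norm.ppf(1 - alpha / 2)        # two-sided critical value
    ncp = effect_size * np.sqrt(n / 2)      # noncentrality parameter
    return norm.cdf(-z_crit + ncp) + norm.cdf(-z_crit - ncp)

# Example: d = 0.5 with 64 observations per group
print(round(power_two_sample(0.5, 64), 3))  # roughly 0.8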

 

Ascent

19 Mar, 2014 at 09:27 | Posted in Varia | Comments Off on Ascent

 

[h/t Greger]

Trickle-up economics

18 Mar, 2014 at 16:09 | Posted in Economics, Politics & Society | 2 Comments

[Image: quote on the trickle-down fraud]

Inequality continues to grow all over the world — so don’t even for a second think that this is only an American problem!

In case you think — like e.g. Paul Krugman — that it’s different in my own country, Sweden, you should take a look at some new data from Statistics Sweden and this (Swedish) video.

The Gini coefficient is a measure of inequality (a higher number signifies greater inequality), and for Sweden the disposable income distribution has developed as follows:

[Figure: Gini coefficient for disposable income in Sweden, 1980–2011. Source: Statistics Sweden and own calculations]
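For readers who want to see the measure in action, here is a minimal sketch of how a Gini coefficient can be computed from a vector of incomes; the toy income vector is made up for illustration and has nothing to do with the Statistics Sweden data plotted above:

# Sketch: Gini coefficient from a vector of (disposable) incomes.
import numpy as np

def gini(incomes):
    x = np.sort(np.asarray(incomes, dtype=float))   # ascending order
    n = x.size
    ranks = np.arange(1, n + 1)
    return 2.0 * np.sum(ranks * x) / (n * x.sum()) - (n + 1) / n

print(round(gini([10, 20, 30, 40, 400]), 3))   # heavy top concentration -> high Gini (about 0.64)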

What we see happening in the US and Sweden is deeply disturbing. The rising inequality is outrageous, not least because it to a large extent reflects income and wealth being increasingly concentrated in the hands of a very small and privileged elite.

Societies that allow the inequality of incomes and wealth to increase without bounds sooner or later implode. The cement that keeps us together erodes, and in the end we are left only with people dipped in the ice-cold water of egoism and greed. It’s high time to put an end to this, the worst Juggernaut of our time!

Simple logistic regression (student stuff)

18 Mar, 2014 at 14:58 | Posted in Statistics & Econometrics | Comments Off on Simple logistic regression (student stuff)

 

 
And in the video below (in Swedish) yours truly shows how to perform a logit regression using Gretl.
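For those who prefer code to a point-and-click walk-through, here is a rough non-Gretl equivalent, assuming Python with statsmodels and a made-up synthetic data set; it fits the same kind of simple logit model the video demonstrates:

# Sketch: a simple logit regression on synthetic data, roughly what the
# Gretl walk-through does. The data-generating parameters are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(seed=42)
x = rng.normal(size=500)
p = 1.0 / (1.0 + np.exp(-(-0.5 + 1.2 * x)))   # true intercept -0.5, slope 1.2
y = rng.binomial(1, p)

X = sm.add_constant(x)                        # add intercept column
result = sm.Logit(y, X).fit(disp=False)
print(result.summary())                       # estimates should be near -0.5 and 1.2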
 

Libertarian magic dust

18 Mar, 2014 at 13:00 | Posted in Politics & Society | Comments Off on Libertarian magic dust

 

Searching for causality — statistics vs. history

17 Mar, 2014 at 11:37 | Posted in Theory of Science & Methodology | 4 Comments

History and statistics serve a common purpose: to understand the causal force of some phenomenon. It seems to me, moreover, that statistics is a simplifying tool to understand causality, whereas history is a more elaborate tool. And by “more elaborate” I mean that history usually attempts to take into account both more variables and fundamentally different variables in our quest to understand causality.

To make this point clear, think about what a statistical model is: it is a representation of some dependent variable as a function of one or more independent variables, which we think, perhaps because of some theory, have a causal influence on the dependent variable in question. A historical analysis is a similar type of model. For example, a historian typically starts by acknowledging some development, say a war, and then attempts to describe, in words, the events that led to the particular development. Now, it is true that historians typically delve deeply into the details of the events predating the development – e.g., by examining written correspondence between officials, by reciting historical news clippings to understand the public mood, etc. – but this simply means that the historian is examining more variables than the simplifying statistician. If the statistician added more variables to his regression, he would be on his way to producing a historical analysis.

There is, however, one fundamental way in which the historian’s model is different from the statistician’s: namely, the statistician is limited by the fact that he can only consider precisely quantified variables in his model. The historian, in contrast, can add whatever variables he wants to his model. Indeed, the historian’s model is non-numeric …

It is my view that what differentiates whether history or statistics will be successful relates to the subject area to which each tool is applied. In subjects where precisely quantified variables are all we need to confidently determine the causal force of some phenomenon, statistics will be preferable; in subjects where imprecisely quantified variables play an important causal role, we need to rely on history.

It seems to me, moreover, that the line dividing the subjects to which we apply our historical or statistical tools cuts along the same seam as does the line dividing the social sciences from the natural sciences. In the latter, we can ignore imprecisely quantified variables, such as human beliefs, as these variables don’t play an important causal role in the movement of natural phenomena. In the former, such imprecisely quantified variables play a central role in the construction and the stability of the laws that govern society at any given moment.

Econolosophy

On the limits of randomization

16 Mar, 2014 at 18:23 | Posted in Statistics & Econometrics | 1 Comment

In the video below, Angus Deaton — Professor of Economics and International Affairs at the Woodrow Wilson School and the Economics Department at Princeton — explains why the use of Randomized Controlled Trials (RCTs) is not at all the “gold standard” it has lately often been portrayed as. As yours truly has repeatedly argued on this blog (e.g. here and here), RCTs usually do not provide evidence that their results are exportable to other target systems. The almost religious belief with which its propagators portray it cannot hide the fact that RCTs cannot be taken for granted to give generalizable results. That something works somewhere is no warranty that it will work for us, or even that it works generally.
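To see why internal validity does not buy exportability, consider a toy simulation (all numbers invented): the individual treatment effect depends on a covariate whose distribution differs between the trial population and the target population, so a perfectly executed RCT in the former says little about the average effect in the latter.

# Sketch: an internally valid RCT estimate need not travel.
import numpy as np

rng = np.random.default_rng(seed=0)
n = 100_000

def outcome(treated, z, noise):
    # made-up model: the treatment effect 2 - 3*z varies with covariate z
    return 1.0 + (2.0 - 3.0 * z) * treated + noise

# Trial population: z concentrated at low values
z_trial = rng.uniform(0.0, 0.4, size=n)
treat = rng.integers(0, 2, size=n)             # randomized assignment
y = outcome(treat, z_trial, rng.normal(0, 1, n))
ate_trial = y[treat == 1].mean() - y[treat == 0].mean()

# Target population: z concentrated at high values
z_target = rng.uniform(0.6, 1.0, size=n)
ate_target = np.mean(2.0 - 3.0 * z_target)     # true average effect in the target

print("RCT estimate in trial population: %.2f" % ate_trial)   # about +1.4
print("True average effect in target:    %.2f" % ate_target)  # about -0.4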

Elegy

15 Mar, 2014 at 20:42 | Posted in Varia | Comments Off on Elegy

 

The pretense-of-knowledge syndrome in economics

15 Mar, 2014 at 16:15 | Posted in Economics | 6 Comments

What does concern me about my discipline … is that its current core — by which I mainly mean the so-called dynamic stochastic general equilibrium approach — has become so mesmerized with its own internal logic that it has begun to confuse the precision it has achieved about its own world with the precision that it has about the real one …

While it often makes sense to assume rational expectations for a limited application to isolate a particular mechanism that is distinct from the role of expectations formation, this assumption no longer makes sense once we assemble the whole model. Agents could be fully rational with respect to their local environments and everyday activities, but they are most probably nearly clueless with respect to the statistics about which current macroeconomic models expect them to have full information and rational information-processing abilities.

This issue is not one that can be addressed by adding a parameter capturing a little bit more risk aversion about macroeconomic, rather than local, phenomena. The reaction of human beings to the truly unknown is fundamentally different from the way they deal with the risks associated with a known situation and environment … In realistic, real-time settings, both economic agents and researchers have a very limited understanding of the mechanisms at work. This is an order-of-magnitude less knowledge than our core macroeconomic models currently assume, and hence it is highly likely that the optimal approximation paradigm is quite different from current workhorses, both for academic and policy work. In trying to add a degree of complexity to the current core models, by bringing in aspects of the periphery, we are simultaneously making the rationality assumptions behind that core approach less plausible …

The challenges are big, but macroeconomists can no longer continue playing internal games. The alternative of leaving all the important stuff to the “policy”-types and informal commentators cannot be the right approach. I do not have the answer. But I suspect that whatever the solution ultimately is, we will accelerate our convergence to it, and reduce the damage we do along the transition, if we focus on reducing the extent of our pretense-of-knowledge syndrome.

Ricardo J. Caballero

A great article that also underlines — especially when it comes to forecasting and implementing economic policies — that the future is inherently unknowable, and that using statistics, econometrics, decision theory or game theory does not in the least overcome this ontological fact.

It also further underlines how important it is in social sciences — and economics in particular — to incorporate Keynes’s far-reaching and incisive analysis of induction and evidential weight in his seminal A Treatise on Probability (1921).

According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but “rational expectations.” Keynes rather thinks that we base our expectations on the confidence or “weight” we put on different events and alternatives. To Keynes expectations are a question of weighing probabilities by “degrees of belief,” beliefs that often have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents as modeled by “modern” social sciences. And often we “simply do not know.”

How strange that social scientists and mainstream economists, as a rule, do not even touch upon these aspects of scientific methodology, which seem so fundamental and important for anyone trying to understand how we learn and orient ourselves in an uncertain world. An educated guess on why this is so would be that Keynes’s concepts are not possible to squeeze into a single calculable numerical “probability.” In the quest for measurable quantities, one turns a blind eye to qualities and looks the other way.

So why do economists, companies and governments continue with the expensive, but obviously worthless, activity of trying to forecast/predict the future?

A couple of months ago yours truly was interviewed by a public radio journalist working on a series on Great Economic Thinkers. We were discussing the monumental failures of the predictions-and-forecasts business. But — the journalist asked — if these cocksure economists with their “rigorous” and “precise” mathematical-statistical-econometric models are so wrong again and again, why do they persist in wasting time on it?

In a discussion on uncertainty and the hopelessness of accurately modeling what will happen in the real world — in M. Szenberg’s Eminent Economists: Their Life Philosophies — Nobel laureate Kenneth Arrow comes up with what is probably the most plausible reason:

It is my view that most individuals underestimate the uncertainty of the world. This is almost as true of economists and other specialists as it is of the lay public. To me our knowledge of the way things work, in society or in nature, comes trailing clouds of vagueness … Experience during World War II as a weather forecaster added the news that the natural world was also unpredictable. An incident illustrates both uncertainty and the unwillingness to entertain it. Some of my colleagues had the responsibility of preparing long-range weather forecasts, i.e., for the following month. The statisticians among us subjected these forecasts to verification and found they differed in no way from chance. The forecasters themselves were convinced and requested that the forecasts be discontinued. The reply read approximately like this: ‘The Commanding General is well aware that the forecasts are no good. However, he needs them for planning purposes.’

Der Himmel über Berlin

14 Mar, 2014 at 16:53 | Posted in Varia | 1 Comment

 

One of my favourite movies. Absolutely fabulous.

Independent schools — selection and segregation

14 Mar, 2014 at 10:52 | Posted in Education & School | Comments Off on Independent schools — selection and segregation

When the country’s compulsory schools are grouped by the pupils’ socio-economic background, it is evident that the pupil composition differs markedly between independent and municipal schools …

Dividing the country’s schools into ten equally large socio-economic groups and examining the share of independent schools in each of them reveals … a clear pattern: the more favourable the socio-economic pupil composition, the larger the share of independent schools …

The incentives to leave the public school system may, for several reasons, be strongest for socially advantaged pupils. If independent schools are, for example, reimbursed according to the municipality’s average school expenditure, socio-economically strong pupils can, by switching to an independent school, avoid the redistribution to less socially advantaged pupils that normally takes place. This could also affect the supply side, since independent schools that manage to attract socially strong pupil segments would in that case be overcompensated compared with the municipal schools.

That the social selection of pupils is more pronounced among non-profit than among for-profit independent schools suggests that demand plays an important role. At the same time, the pattern is clear among for-profit schools as well, which may be because profit margins are larger for these pupil segments. A contributing factor may be that school groups concerned about their brand are reluctant to open schools in socially disadvantaged areas, since such schools will probably show relatively weak results even if the school does a good job. It would then be no coincidence that Academedia, which runs schools under several different brands, has somewhat more schools in socially weak groups than the other school groups.

Jonas Vlachos

What is neoclassical economics?

14 Mar, 2014 at 09:10 | Posted in Economics | 4 Comments

For your edification, I offer this link to an elegant explanation of why neoclassical economics presents itself as purely scientific, denies any ideological commitments, and strangles pluralism.


In brief: Arnsperger and Varoufakis define “neoclassical” economics in terms of three “meta-axioms.” First, neoclassicism assumes “methodological individualism,” i.e. that economists must ultimately posit individuals’ behaviors as the root cause of broad economic phenomena. Second, it assumes “methodological instrumentalism,” i.e. that these actors are somehow or other acting instrumentally in pursuit of goals, are “irreversibly ends-driven.” Third, it assumes “methodological equilibration,” i.e. rather than asking whether, or under what conditions, a state of affairs will continue unchanged, it seeks to show that if equilibrium occurs, then it will endure.

The big twist of Arnsperger and Varoufakis’s argument is that by keeping these assumptions well-hidden and unquestioned, neoclassicism simultaneously guts its own ability to effectively explain and predict real-world economic phenomena AND expands its own discursive authority.

The real genius of this article is in demonstrating how this paradoxical circumstance occurs. They carefully and explicitly reject, as a “conspiracy theory,” the view that economics professors are cynically and purposively responsible. Instead, they pursue a “functionalist” explanation (which seems like maybe the defining characteristic of science itself: showing how cause and effect, independent of any overarching purpose, lead from situation A to situation B), which boils down to funding sources. Basically, they claim that economists who pursue technical elaborations, “who simply ‘get on with the job,’” get funding while those who raise important but non-actionable questions about assumptions, method, and framework do not. “No one wants to keep quiet on the meta-axioms. They are just too busy building magnificent edifices on top of them, and being magnificently rewarded for it” …

So the three meta-axioms of neoclassical economics define the language and concepts which can/must be invoked by any economist who wishes to be taken seriously. By presenting as self-evident and obvious, they effectively make themselves invisible while also precluding alternative approaches.

Casey Jaywork

[h/t Mark Buchanan]

For my own take on this issue, see, e.g., here and here.

Time to rewrite the textbooks on money creation

12 Mar, 2014 at 19:13 | Posted in Economics | 6 Comments

strip1

This article has discussed how money is created in the modern economy. Most of the money in circulation is created, not by the printing presses of the Bank of England, but by the commercial banks themselves: banks create money whenever they lend to someone in the economy or buy an asset from consumers. And in contrast to descriptions found in some textbooks, the Bank of England does not directly control the quantity of either base or broad money. The Bank of England is nevertheless still able to influence the amount of money in the economy. It does so in normal times by setting monetary policy — through the interest rate that it pays on reserves held by commercial banks with the Bank of England. More recently, though, with Bank Rate constrained by the effective lower bound, the Bank of England’s asset purchase programme has sought to raise the quantity of broad money in circulation. This in turn affects the prices and quantities of a range of assets in the economy, including money.

Michael McLeay, Amar Radia and Ryland Thomas of the Bank of England’s Monetary Analysis Directorate

(h/t Phil Pilkington)
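The mechanism the authors describe, that a new loan simultaneously creates a new deposit of the same size, can be illustrated with a stylised double-entry sketch (the balance-sheet figures below are invented):

# Sketch: "loans create deposits" as double-entry bookkeeping on a
# stylised commercial bank balance sheet.
bank = {
    "assets":      {"reserves": 100, "loans": 0},
    "liabilities": {"deposits": 100},
}

def make_loan(bank, amount):
    # Granting a loan adds a loan asset and, simultaneously, a deposit
    # liability of the same size -- new broad money, created without any
    # prior saver handing over funds.
    bank["assets"]["loans"] += amount
    bank["liabilities"]["deposits"] += amount

make_loan(bank, 50)
print(bank)   # assets: reserves 100 + loans 50; liabilities: deposits 150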

